Phase-shift analysis is a commonly used technique to extract the scattering amplitudes of a two-body strong-interaction scattering process from the experimentally measured quantities: total cross-section, differential cross-section, polarization and spin-correlation parameters. However, at a fixed energy, all the scattering amplitudes can be multiplied by an arbitrary angle-dependent phase factor without affecting the measurables. It would seem then that the phase of the…
…quantum electrodynamics. In the quasilinear approximation, the integral equation is solved by Mellin transformation, followed by the calculation of the Muskhelishvili index of the resultant singular integral operator.
In a previous paper we have shown that in quantum chromodynamics the gluon propagator vanishes in the infrared limit, while the ghost propagator is more singular than a simple pole. These results were obtained after angular averaging, but in the current paper we go beyond this approximation and perform an exact calculation of the angular integrals. The powers of the infrared behaviour of the propagators are changed substantially. We find the very intriguing result that the gluon propagator vanishes in the infrared exactly like p^2, whilst the ghost propagator is exactly as singular as 1/p^4. We also find that the value of the infrared fixed point of the QCD coupling is much decreased from the y-max estimate: it is now equal to 4π/3. Following a recent study by von Smekal et al., we analyzed in Ref. the coupled Dyson-Schwinger equations for the gluon and ghost form factors F and G. The approximations were two-fold: firstly the vertices were taken bare, and secondly angular averaging was introduced (the so-called y-max approximation). An improvement of the vertices is deferred to later work.
Nonlocality in quantum mechanics does not follow from nonseparability, nor does classical stochastic independence imply physical independence. In this paper an explicit proof of a Bell inequality is recalled, and an analysis of the Aspect experiment in terms of noncontextual, but indefinite weights, or improper probabilities, is given.
An infinite number of elastically colliding balls is considered in a classical, and then in a relativistic setting. Energy and momentum are not necessarily conserved globally, even though each collision does separately conserve them. This result holds in particular when the total mass of all the balls is finite, and even when the spatial extent and temporal duration of the process are also finite. Further, the process is shown to be indeterministic: there is an arbitrary parameter in the general solution that corresponds to the injection of an arbitrary amount of energy (classically), or energy-momentum (relativistically), into the system at the point of accumulation of the locations of the balls. Specific examples are given that illustrate these counter-intuitive results, including one in which all the balls move with the same velocity after every collision has taken place.
Theo Kuipers describes four kinds of research program and the question is raised here as to whether string theory could be accommodated by one of them, or whether it should be classified in a new, fifth kind of research program.
When the Parrondo effect was discovered a few years ago (Harmer and Abbott 1999a, 1999b), it was hailed as a possible mechanism whereby, in a kind of collaboration of failure, losing strategies could be combined to yield profit. The precise relevance of the Parrondo effect to natural and social phenomena is however still unclear. In this paper we give specific examples, first in the artificial setting of a gambling machine, and then in more natural applications to genetics and to environmental policies. This last example touches on questions of rational behaviour and expected utility in a novel setting.
After recalling proofs of the Bell inequality based on the assumptions of separability and of noncontextuality, the most general noncontextual contrapositive conditional probabilities consistent with the Aspect experiment are constructed. In general these probabilities are not all positive.
This minicourse on quantum mechanics is intended for students who have already been rather well exposed to the subject at an elementary level. It is assumed that they have surmounted the first conceptual hurdles and also have struggled with the Schrödinger equation in one dimension.
The coupled Dyson-Schwinger equations for the gluon and ghost propagators in QCD are shown to have solutions that correspond to a unique running coupling that has a finite infrared fixed point and the expected logarithmic decrease in the ultraviolet. The infrared coupling is large enough to support chiral symmetry breaking and quarks are not confined, but they cannot be isolated.
An eight-parameter family of the most general nonnegative quadruple probabilities is constructed for EPR-Bohm-Aharonov experiments when only 3 pairs of analyser settings are used. It is a simultaneous representation of 3 Bohr-incompatible experimental configurations, valid for arbitrary quantum states.
Most of us can effortlessly call to mind images from our school years. Often these images are vague and fragmentary, but recognizable enough: the little boy with the national-health glasses, the girl with the sagging stockings, the schoolmistress with the large checked skirt. Anyone endowed with a strong auditory imagination (and old enough) can even hear again how the school bell rang, or how the dip pens scratched across the paper. Some are even able to recall smells, or touches, or the taste of madeleines dipped in lime-blossom tea.
The classical electrodynamics of point charges can be made finite by the introduction of effects that temporally precede their causes. The idea of retrocausality is also inherent in the Feynman propagators of quantum electrodynamics. The notion allows a new understanding of the violation of the Bell inequalities, and of the world view revealed by quantum mechanics. Published in The Universe, Visions and Perspectives, edited by N. Dadhich and A. Kembhavi, Kluwer Academic Publishers, 2000, pages 35-50.
A recent claim that in quantum chromodynamics in the Landau gauge the gluon propagator vanishes in the infrared limit, while the ghost propagator is more singular than a simple pole, is investigated analytically and numerically. This picture is shown to be supported even at the level at which the vertices in the Dyson-Schwinger equations are taken to be bare. The gauge-invariant running coupling is shown to be uniquely determined by the equations and to have a large finite infrared limit.
Many physicists believe that time constitutes a serious problem in quantum mechanics. We show nevertheless that quantum mechanics does not involve a special problem for time, and that there is no fundamental asymmetry between space and time in quantum mechanics over and above the asymmetry that already exists in classical physics. The apparent problem of time arises when the time parameter is put on a par with dynamical position variables rather than with the coordinates of space. The commutation relations and uncertainty relations are generally considered to embody the essential content of elementary quantum mechanics, but the traditional mathematical expression of the uncertainty principle is shown to be quite unsatisfactory. It is the total energy that decrees whether or not the time variables of a system can be sharply determined.
The Michelson-Morley experiment suggests the hypothesis that the two-way speed of light is constant, and this is consistent with a more general invariance than that of Lorentz. On adding the requirement that physical laws have the same form in all inertial frames, as Einstein did, the transformation specializes to that of Lorentz.
Various arguments have been put forward to show that Zeno-like paradoxes are still with us. A particularly interesting one involves a cube composed of colored slabs that geometrically decrease in thickness. We first point out that this argument has already been nullified by Paul Benacerraf. Then we show that nevertheless a further problem remains, one that withstands Benacerraf’s critique. We explain that the new problem is isomorphic to two other Zeno-like predicaments: a problem described by Alper and Bridger in 1998 and a modified version of the problem that Benardete introduced in 1964. Finally, we present a solution to the three isomorphic problems.
The notion of probabilistic support is beset by well-known problems. In this paper we add a new one to the list: the problem of transitivity. Tomoji Shogenji has shown that positive probabilistic support, or confirmation, is transitive under the condition of screening off. However, under that same condition negative probabilistic support, or disconfirmation, is intransitive. Since there are many situations in which disconfirmation is transitive, this illustrates, but now in a different way, that the screening-off condition is too restrictive. We therefore weaken this condition to what we call ‘partial’ screening off. We show that the domain defined by partial screening off comprises two mutually exclusive subdomains. In one subdomain disconfirmation is indeed transitive, but confirmation is then intransitive. In the other, confirmation is transitive, but here disconfirmation is once more intransitive.
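To make the asymmetry concrete, here is a minimal numerical sketch (the joint distribution is invented for illustration and is not taken from the paper). Screening off is modelled as a Markov chain A → B → C, so that the bearing of A on C factors through B:

```python
def chain_probs(pA, pB_A, pB_notA, pC_B, pC_notB):
    """Marginals and P(C|A) for a screened-off chain A -> B -> C."""
    pB = pB_A * pA + pB_notA * (1 - pA)
    pC = pC_B * pB + pC_notB * (1 - pB)
    # Screening off: P(C|A) = P(C|B) P(B|A) + P(C|~B) P(~B|A)
    pC_A = pC_B * pB_A + pC_notB * (1 - pB_A)
    return pB, pC, pC_A

# A confirms B, B confirms C: confirmation is transitive.
print(chain_probs(0.5, 0.9, 0.1, 0.9, 0.1))  # pB = pC = 0.5, P(C|A) = 0.82

# A disconfirms B, B disconfirms C: yet P(C|A) = 0.82 > P(C) = 0.5,
# so A *confirms* C -- disconfirmation is not transitive here.
print(chain_probs(0.5, 0.1, 0.9, 0.1, 0.9))
```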
This paper examines the ethical dimensions of the closure process of a large English long-stay institution for people with learning difficulties during the last quarter of the twentieth century. It does this primarily through an analysis of oral historical interview data stemming from those managers who implemented the rundown. The paper illustrates the ways in which their testimonies indicate the presence of a morally infused dominant rhetoric, which was based upon the therapeutic benefits of closure, informed by the ideas of normalisation and social role valorisation. However, the paper argues that this principled managerial perspective had unfortunate ethical consequences, in that it under-acknowledged, marginalised and discredited staff viewpoints which raised pertinent issues relating to the downsizing of this particular hospital.
A major objection to epistemic infinitism is that it seems to make justification impossible. For if there is an infinite chain of reasons, each receiving its justification from its neighbour, then there is no justification to inherit in the first place. Some have argued that the objection arises from misunderstanding the character of justification. Justification is not something that one reason inherits from another; rather it gradually emerges from the chain as a whole. Nowhere however is it made clear what exactly is meant by emergence. The aim of this paper is to fill that lacuna: we describe a detailed procedure for the emergence of justification that enables us to see exactly how justification surfaces from a chain of reasons.
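By way of illustration, the following sketch shows one way to make emergence precise (the rule used is just the rule of total probability; the link probabilities α = P(q_n | q_{n+1}) = 0.9 and β = P(q_n | ¬q_{n+1}) = 0.2 are assumed values for the example):

```python
# Each link passes probability down the chain by the rule of total probability:
#   P(q_n) = beta + (alpha - beta) * P(q_{n+1}).
alpha, beta = 0.9, 0.2   # illustrative link probabilities

def prob_target(n_links, tail_prob):
    """P(q_0) from n_links links plus an assumed probability for the tail."""
    p = tail_prob
    for _ in range(n_links):
        p = beta + (alpha - beta) * p
    return p

for n in (1, 5, 10, 20):
    lo, hi = prob_target(n, 0.0), prob_target(n, 1.0)
    print(n, round(lo, 6), round(hi, 6))  # bracket narrows like (alpha-beta)**n
```

Whatever probability the unknown tail is assigned, the bracket around P(q_0) shrinks like (α − β)^n: the value is fixed by the chain as a whole, and no single link has to inherit it from a ground.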
So far no known measure of confirmation of a hypothesis by evidence has satisfied a minimal requirement concerning thresholds of acceptance. In contrast, Shogenji’s new measure of justification (Shogenji, Synthese, this issue, 2009) does the trick. As we show, it is ordinally equivalent to the most general measure which satisfies this requirement. We further demonstrate that this general measure resolves the problem of the irrelevant conjunction. Finally, we spell out some implications of the general measure for the Conjunction Effect; in particular we give an example in which the effect occurs in a larger domain, according to Shogenji’s measure of justification, than Carnap’s measure of confirmation would have led one to expect.
This paper is the third and final one in a sequence of three. All three papers emphasize that a proposition can be justified by an infinite regress, on condition that epistemic justification is interpreted probabilistically. The first two papers showed this for one-dimensional chains and for one-dimensional loops of propositions, each proposition being justified probabilistically by its precursor. In the present paper we consider the more complicated case of two-dimensional nets, where each "child" proposition is probabilistically justified by two "parent" propositions. Surprisingly, it turns out that probabilistic justification in two dimensions takes on the form of Mandelbrot's iteration. Like so many patterns in nature, probabilistic reasoning might in the end be fractal in character.
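The quadratic character of the iteration can be glimpsed in a toy model (the conditional probabilities below are invented, and treating the two parents at each level as independent with a common probability x is a simplifying assumption):

```python
# Conditioning a child on the four configurations of its two parents (rule of
# total probability) gives, if both parents have probability x,
#   x' = a*x**2 + (b + c)*x*(1 - x) + d*(1 - x)**2,
# a quadratic map in x -- the same family as Mandelbrot's z -> z**2 + c.
a, b, c, d = 0.9, 0.5, 0.4, 0.1   # invented conditional probabilities

def step(x):
    return a * x * x + (b + c) * x * (1 - x) + d * (1 - x) ** 2

x = 0.3                  # arbitrary seed deep in the net
for _ in range(40):
    x = step(x)
print(x)                 # settles on a fixed point of the quadratic map (~0.382)
```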
Suppose q is some proposition, and let

  P(q) = v0   (1)

be the proposition that the probability of q is v0. How can one know that (1) is true? One cannot know it for sure, for all that may be asserted is a further probabilistic statement like

  P(P(q) = v0) = v1,   (2)

which states that the probability that (1) is true is v1. But the claim (2) is also subject to some further statement of an even higher probability:

  P(P(P(q) = v0) = v1) = v2,   (3)

and so on. Thus, an infinite regress emerges of probabilities of probabilities, and the question arises as to whether this regress is vicious or harmless.

Radical probabilists would like to claim that it is harmless, but Nicholas Rescher (2010), in his scholarly and very stimulating Infinite Regress: The Theory and History of Varieties of Change, argues that it is vicious. He believes that an infinite hierarchy of probabilities makes it impossible to know anything about the probability of the original proposition q:

  unless some claims are going to be categorically validated and not just adjudged probabilistically, the radically probabilistic epistemology envisioned here is going to be beyond the prospect of implementation. . . . If you can indeed be certain of nothing, then how can you be sure of your probability assessments? If all you ever have is a nonterminatingly regressive claim of the format . . . the probability is .9 that (the probability is .9 that (the probability of q is .9)) then in the face of such a regress, you would know effectively nothing about the condition of q. After all, without a categorically established factual basis of some sort, there is no way of assessing probabilities. But if these requisites themselves are never categorical but only probabilistic, then we are propelled into a vitiating regress of presuppositions.
Continuous sedation until death (CSD), the act of reducing or removing the consciousness of an incurably ill patient until death, often provokes medical–ethical discussions in the opinion sections of medical and nursing journals. Some argue that CSD is morally equivalent to physician-assisted death (PAD), that it is a form of “slow euthanasia.” A qualitative thematic content analysis of opinion pieces was conducted to describe and classify arguments that support or reject a moral difference between CSD and PAD. Arguments pro and contra a moral difference refer basically to the same ambiguous themes, namely intention, proportionality, withholding artificial nutrition and hydration, and removing consciousness. This demonstrates that the debate is first and foremost a semantic rather than a factual dispute, focusing on the normative framework of CSD. Given the prevalent ambiguity, the debate on CSD appears to be a classical symbolic struggle for moral authority.
From 1929 onwards, C. I. Lewis defended the foundationalist claim that judgements of the form 'x is probable' only make sense if one assumes there to be a ground y that is certain (where x and y may be beliefs, propositions, or events). Without this assumption, Lewis argues, the probability of x could not be anything other than zero. Hans Reichenbach repeatedly contested Lewis's idea, calling it "a remnant of rationalism". The last move in this debate was a challenge by Lewis, defying Reichenbach to produce a regress of probability values that yields a number other than zero. Reichenbach never took up the challenge, but we will meet it on his behalf, as it were. By presenting a series converging to a limit, we demonstrate that x can have a definite and computable probability, even if its justification consists of an infinite number of steps. Next we show the invalidity of a recent riposte of foundationalists that this limit of the series can be the ground of justification. Finally we discuss the question where justification can come from if not from a ground.
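A minimal sketch of such a convergent series (the uniform link probabilities α = P(q_n | q_{n+1}) = 0.9 and β = P(q_n | ¬q_{n+1}) = 0.2 are illustrative assumptions, not the paper's values):

```python
# Unfolding P(q_n) = beta + (alpha - beta) * P(q_{n+1}) gives the series
#   P(q_0) = beta * (1 + r + r**2 + ...) = beta / (1 - r),   r = alpha - beta,
# which converges to a definite, nonzero value: Lewis's challenge is met.
alpha, beta = 0.9, 0.2
r = alpha - beta

partial, term = 0.0, beta
for _ in range(60):
    partial += term              # contribution of one more link in the regress
    term *= r
print(partial, beta / (1 - r))   # both ~ 0.6667, with no certain ground anywhere
```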
This article presents a generalization of the Condorcet Jury Theorem. All results to date assume a fixed value for the competence of jurors, or alternatively, a fixed probability distribution over the possible competences of jurors. In this article, we develop the idea that we can learn the competence of the jurors by the jury vote. We assume a uniform prior probability assignment over the competence parameter, and we adapt this assignment in the light of the jury vote. We then compute the posterior probability, conditional on the jury vote, of the hypothesis voted over. We thereby retain the central results of Condorcet, but we also show that the posterior probability depends on the size of the jury as well as on the absolute margin of the majority.
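A rough numerical sketch of the idea (the uniform competence prior on [1/2, 1], the even prior odds on the hypothesis, and the midpoint-rule integration are all assumptions of this illustration, not the paper's exact setup):

```python
# Posterior P(H | k of n jurors vote H) with unknown juror competence p,
# taken uniform on [1/2, 1]; votes are i.i.d. given H. Binomial coefficients
# cancel in the posterior odds, so they are omitted.
def likelihood(k, n, hypothesis_true, steps=20000):
    total = 0.0
    for i in range(steps):
        p = 0.5 + 0.5 * (i + 0.5) / steps      # competence in [1/2, 1]
        ph = p if hypothesis_true else 1 - p   # chance one juror votes H
        total += ph**k * (1 - ph)**(n - k)
    return total / steps

def posterior(k, n):
    lt, lf = likelihood(k, n, True), likelihood(k, n, False)
    return lt / (lt + lf)                      # even prior odds on H

print(posterior(6, 10))     # margin 2 in a jury of 10
print(posterior(51, 100))   # margin 2 in a jury of 100: a different posterior
```

With a fixed competence the posterior would depend on the absolute margin alone; here the two juries share a margin of 2 yet yield different posteriors, exhibiting the dependence on jury size.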
An actual infinity of colliding balls can be in a configuration in which the laws of mechanics lead to logical inconsistency. It is argued that one should therefore limit the domain of these laws to a finite, or only a potentially infinite number of elements. With this restriction, indeterminism, energy nonconservation and creatio ex nihilo no longer occur. A numerical analysis of finite systems of colliding balls is given, and the asymptotic behaviour that corresponds to the potentially infinite system is inferred.
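In the spirit of that numerical analysis, here is a minimal classical sketch of a finite collision cascade (the geometric mass ratio and the neglect of possible re-collisions are assumptions of the illustration):

```python
# Finite chain of 1D elastic collisions: ball i, mass m_i = r**i, moving with
# velocity v, strikes ball i+1 at rest. Standard elastic formulas; only the
# forward cascade is followed.
r = 0.5                  # assumed mass ratio m_{i+1} / m_i
mass, v = 1.0, 1.0

for _ in range(30):
    m_next = mass * r
    v = 2 * mass / (mass + m_next) * v   # velocity of the struck ball
    mass = m_next
print(v)   # grows like (2 / (1 + r))**n: no classical upper limit on the speed
```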
A common objection to coherentism is that it cannot account for truth: it gives us no reason to prefer a true theory over a false one, if both theories are equally coherent. By extending Susan Haack's crossword metaphor, the authors argue that there could be circumstances under which this objection is untenable. Although these circumstances are remote, they are in full accordance with the most ambitious modern theories in physics. Coherence may perhaps be truth conducive.
In an earlier paper we have shown that a proposition can have a well-defined probability value, even if its justification consists of an infinite linear chain. In the present paper we demonstrate that the same holds if the justification takes the form of a closed loop. Moreover, in the limit that the size of the loop tends to infinity, the probability value of the justified proposition is always well-defined, whereas this is not always so for the infinite linear chain. This suggests that infinitism sits more comfortably with a coherentist view of justification than with an approach in which justification is portrayed as a linear process.
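A sketch of the loop case (uniform link probabilities are an assumed simplification): self-consistency around the loop pins the probability down to a unique value, whatever the size of the loop or the initial guess:

```python
# Closed loop q_0 <- q_1 <- ... <- q_{n-1} <- q_0, each link applying
#   P(q_i) = beta + (alpha - beta) * P(q_{i+1 mod n}).
alpha, beta = 0.9, 0.2   # illustrative link probabilities

def loop_value(n, guess=0.5, sweeps=200):
    p = guess
    for _ in range(sweeps * n):          # iterate the link map around the loop
        p = beta + (alpha - beta) * p
    return p

for n in (2, 5, 100):
    print(n, loop_value(n))   # always beta / (1 - alpha + beta) = 2/3
```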
We have earlier shown by construction that a proposition can have a well-defined nonzero probability, even if it is justified by an infinite probabilistic regress. We thought this to be an adequate rebuttal of foundationalist claims that probabilistic regresses must lead either to an indeterminate, or to a determinate but zero probability. In a comment, Frederik Herzberg has argued that our counterexamples are of a special kind, being what he calls ‘solvable’. In the present reaction we investigate what Herzberg means by solvability. We discuss the advantages and disadvantages of making solvability a sine qua non, and we ventilate our misgivings about Herzberg’s suggestion that the notion of solvability might help the foundationalist.
Today it is generally assumed that epistemic justification comes in degrees. The consequences, however, have not been adequately appreciated. In this paper we show that the assumption invalidates some venerable attacks on infinitism: once we accept that epistemic justification is gradual, an infinitist stance makes perfect sense. It is only without the assumption that infinitism runs into difficulties.
Can some evidence confirm a conjunction of two hypotheses more than it confirms either of the hypotheses separately? We show that it can, moreover under conditions that are the same for ten different measures of confirmation. Further we demonstrate that it is even possible for the conjunction of two disconfirmed hypotheses to be confirmed by the same evidence.
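A toy check of the second, stronger claim, verified here only for the difference measure d(h, e) = P(h|e) − P(h) (the eight equiprobable worlds and the events are invented for illustration):

```python
from fractions import Fraction

omega = set(range(1, 9))          # eight equiprobable worlds

def P(A):
    return Fraction(len(A), len(omega))

def d(h, e):                      # difference measure: P(h|e) - P(h)
    return Fraction(len(h & e), len(e)) - P(h)

e  = {1, 6, 7, 8}
h1 = {1, 2, 3}
h2 = {1, 4, 5}
print(d(h1, e), d(h2, e), d(h1 & h2, e))   # -1/8, -1/8, 1/8:
# e disconfirms each hypothesis separately, yet confirms their conjunction.
```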
A recent thought experiment has shed interesting new light on the core problem of Zeno’s Achilles. A ball apparently can, and cannot, collide with an infinite, open set of balls. It is the purpose of this paper to make the new development accessible to the general philosophical community and to suggest a direction in which the problem may perhaps be solved.
Reichenbach’s use of ‘posits’ to defend his frequentistic theory of probability has been criticized on the grounds that it makes unfalsifiable predictions. The justice of this criticism has blinded many to Reichenbach’s second use of a posit, one that can fruitfully be applied to current debates within epistemology. We show first that Reichenbach’s alternative type of posit creates a difficulty for epistemic foundationalists, and then that its use is equivalent to a particular kind of Jeffrey conditionalization. We conclude that, under particular circumstances, Reichenbach’s approach and that of the Bayesians amount to the same thing, thereby presenting us with a new instance in which chance and credence coincide.
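For reference, a minimal sketch of Jeffrey conditionalization itself (the numbers are illustrative, and nothing here is specific to Reichenbach's posits):

```python
# Jeffrey's rule: when experience shifts P(E) to a new value q without making
# E certain, update any A by  P_new(A) = q * P(A|E) + (1 - q) * P(A|~E).
pE, pA_E, pA_notE = 0.3, 0.8, 0.1   # illustrative prior ingredients

def jeffrey(q):
    return q * pA_E + (1 - q) * pA_notE

print(jeffrey(0.3))   # q equal to the old P(E): P(A) stays 0.31
print(jeffrey(0.9))   # E made likely but not certain: P(A) rises to 0.73
print(jeffrey(1.0))   # q = 1 recovers strict conditionalization: P(A|E) = 0.8
```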
It is widely held that the paradox of Achilles and the Tortoise, introduced by Zeno of Elea around 460 B.C., was solved by mathematical advances in the nineteenth century. The techniques of Weierstrass, Dedekind and Cantor made it clear, according to this view, that Achilles’ difficulty in traversing an infinite number of intervals while trying to catch up with the tortoise does not involve a contradiction, let alone a logical absurdity. Yet ever since the nineteenth century there have been dissidents claiming that the apparatus of Weierstrass et al. has not resolved the paradox, and that serious problems remain. It seems that these claims have received unexpected support from recent developments in mathematical physics. This support has however remained largely unnoticed by historians of philosophy, presumably because the relevant debates are cast in mathematical-technical terms that are only accessible to people with the relevant training. That is unfortunate, since the debates in question might well profit from input by philosophers in general and historians of philosophy in particular. Below we will first recall the Achilles paradox, and describe the way in which nineteenth century mathematics supposedly solved it. Then we discuss recent work that contests this solution, reiterating the dissident dogma that no mathematical approach whatsoever can even come close to solving the original Achilles. We shall argue that this dissatisfaction with a mathematical solution is inadequate as it stands, but that it can perhaps be reformulated in the light of new developments in mathematical physics.
A Zenonian supertask involving an infinite number of colliding balls is considered, under the restriction that the total mass of all the balls is finite. Classical mechanics leads to the conclusion that momentum, but not necessarily energy, must be conserved. Relativistic mechanics, on the other hand, implies that energy and momentum conservation are always violated. Quantum mechanics, however, seems to rule out the Zeno configuration as an inconsistent system.
We have never entirely agreed with Daniel Cohnitz on the status and rôle of thought experiments. Several years ago, enjoying a splendid lunch together in the city of Ghent, we cheerfully agreed to disagree on the matter; and now that Cohnitz has published his considered opinion of our views, we are glad that we have the opportunity to write a rejoinder and to explicate some of our disagreements. We choose not to deal here with all the issues that Cohnitz raises, but rather to restrict ourselves to three specific points.
Can some evidence confirm a conjunction of two hypotheses more than it confirms either of the hypotheses separately? We show that it can, moreover under conditions that are the same for nine different measures of confirmation. Further we demonstrate that it is even possible for the conjunction of two disconfirmed hypotheses to be confirmed by the same evidence.
A Zenonian supertask involving an infinite number of identical colliding balls is generalized to include balls with different masses. Under the restriction that the total mass of all the balls is finite, classical mechanics leads to velocities that have no upper limit. Relativistic mechanics results in velocities bounded by that of light, but energy and momentum are not conserved, implying indeterminism. The notion that both determinism and the conservation laws might be salvaged via photon creation is shown to be flawed.
Quantum electrodynamics is a time-symmetric theory that is part of the electroweak interaction, which is invariant under a generalized form of this symmetry, the PCT transformation. The thesis is defended that the arrow of time in electrodynamics is a consequence of the assumption of an initial state of high order, together with the quantum version of the equiprobability postulate.
Richard Jeffrey’s radical probabilism (‘probability all the way down’) is augmented by the claim that probability cannot be turned into certainty, except by data that logically exclude all alternatives. Once we start being uncertain, no amount of updating will free us from the treadmill of uncertainty. This claim is cast first in objectivist and then in subjectivist terms.
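A toy illustration of the treadmill (the two-hypothesis model and all numbers are invented): repeated Jeffrey updates with q < 1 leave the credence strictly short of certainty, however many data arrive:

```python
# Each round, the datum E is learned only to degree q < 1, and the credence in
# H is revised by Jeffrey's rule. P(H) climbs but never reaches 1.
pH = 0.5
pE_H, pE_notH = 0.9, 0.2     # likelihoods of the uncertain datum E
q = 0.95                     # E is never learned with certainty

for _ in range(50):
    pE = pE_H * pH + pE_notH * (1 - pH)
    pH_E    = pE_H * pH / pE                 # Bayes' theorem
    pH_notE = (1 - pE_H) * pH / (1 - pE)
    pH = q * pH_E + (1 - q) * pH_notE        # Jeffrey update
print(pH)   # close to, but strictly below, 1: still on the treadmill
```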