Fermi-Dirac statistics are one of two kinds of statistics exhibited by identical quantum particles, the other being Bose-Einstein statistics. Such particles are called fermions and bosons respectively (the terminology is due to Dirac [1902-1984]). In the light of the spin-statistics theorem, and consistent with observation, fermions are invariably spinors (of half-integral spin), whilst bosons are invariably scalar or vector particles (of integral spin). See spin.
Identity. From the very early days of quantum theory it was recognized that quanta were statistically strange (see Bose-Einstein statistics). Suspicion fell on the identity of quanta, on how they are to be counted. It was not until Dirac’s work of 1926 (and his discovery of Fermi-Dirac statistics) that the nature of the novelty became clear: the quantum state of exactly similar particles, of the same mass, charge, and spin, must be symmetrized, yielding states either symmetric or antisymmetric under permutations. This is the symmetry postulate (SP).
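For two exactly similar particles, the symmetry postulate can be stated concretely (a standard textbook formulation, not drawn from the abstract above): given one-particle states \(\psi\) and \(\phi\), the admissible two-particle states are

```latex
\Psi_{\pm} \;=\; \frac{1}{\sqrt{2}}\,\bigl(\psi(1)\,\phi(2) \;\pm\; \phi(1)\,\psi(2)\bigr)
```

with the symmetric combination (\(+\)) for bosons and the antisymmetric combination (\(-\)) for fermions. Setting \(\psi = \phi\) makes \(\Psi_{-}\) vanish identically, which is the Pauli exclusion principle.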
The Born rule is derived from operational assumptions, together with assumptions of quantum mechanics that concern only the deterministic development of the state. Unlike Gleason’s theorem, the argument applies even if probabilities are defined for only a single resolution of the identity, so it applies to a variety of foundational approaches to quantum mechanics. It also provides a probability rule for state spaces that are not Hilbert spaces.
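For reference, the Born rule for a single resolution of the identity \(\{P_i\}\) (the standard statement, not the paper's derivation of it): the probability of outcome \(i\) in state \(|\psi\rangle\) is

```latex
p(i) \;=\; \langle \psi | P_i | \psi \rangle, \qquad \sum_i P_i = \mathbb{1},
```

which reduces to \(p(i) = |\langle a_i|\psi\rangle|^2\) when each \(P_i\) projects onto a single eigenvector \(|a_i\rangle\).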
“I heard about and laid hold of the idea of a four dimensional frame for a fresh apprehension of physical phenomena, which afterwards led me to send a paper, ‘The Universe Rigid’, to the Fortnightly Review (a paper which was rejected by Frank Harris as ‘incomprehensible’), and gave me a frame for my first scientific fantasia, The Time Machine. If there was a Universe rigid, and hitherto uniform, the character of the consequent world would depend entirely, I argued along strictly materialist lines, upon the velocity of this initial displacement. The disturbance would spread outward with ever-increasing complication. But I discovered no way, and there was no one to show me a way to get on from such elementary struggles with primary concepts, to a sound understanding of contemporary experimental physics.” (H. G. Wells, “Experiment in Autobiography”, 1934, p. 172).
Probabilities may be subjective or objective; we are concerned with both kinds of probability, and the relationship between them. The fundamental theory of objective probability is quantum mechanics: it is argued that neither Bohr's Copenhagen interpretation, nor the pilot-wave theory, nor stochastic state-reduction theories, give a satisfactory answer to the question of what objective probabilities are in quantum mechanics, or why they should satisfy the Born rule; nor do they give any reason why subjective probabilities should track objective ones. But it is shown that if probability only arises with decoherence, then probabilities must be given by the Born rule. It is further shown that, on the Everett interpretation, we have a clear statement of what probabilities are, in terms of purely categorical physical properties; and finally, along lines laid out by Deutsch and Wallace, that there is a clear basis in the axioms of decision theory as to why subjective probabilities should track these objective ones. These results hinge critically on the absence of hidden variables or any other mechanism (such as state-reduction) from the physical interpretation of the theory. The account of probability has traditionally been considered the principal weakness of the Everett interpretation; on the contrary, it emerges as one of its principal strengths.
We demonstrate that the quantum-mechanical description of composite physical systems of an arbitrary number of similar fermions in all their admissible states, mixed or pure, for all finite-dimensional Hilbert spaces, is not in conflict with Leibniz's Principle of the Identity of Indiscernibles (PII). We discern the fermions by means of physically meaningful, permutation-invariant categorical relations, i.e. relations independent of the quantum-mechanical probabilities. If, indeed, probabilistic relations are permitted as well, we argue that similar bosons can also be discerned in all their admissible states; but their categorical discernibility turns out to be a state-dependent matter. In all demonstrated cases of discernibility, the fermions and the bosons are discerned (i) with only minimal assumptions on the interpretation of quantum mechanics; (ii) without appealing to metaphysical notions, such as Scotusian haecceitas, Lockean substrata, Postian transcendental individuality or Adamsian primitive thisness; and (iii) without revising the general framework of classical elementary predicate logic and standard set theory, thus without revising standard mathematics. This confutes: (a) the currently dominant view that, provided (i) and (ii), the quantum-mechanical description of such composite physical systems always conflicts with PII; and (b) that if PII can be saved at all, the only way to do it is by adopting one or other of the thick metaphysical notions mentioned above. Among the most general and influential arguments for the currently dominant view are those due to Schrödinger, Margenau, Cortes, Dalla Chiara, Di Francia, Redhead, French, Teller, Butterfield, Giuntini, Mittelstaedt, Castellani, Krause and Huggett. We review them succinctly and critically as well as related arguments by van Fraassen and Massimi.
1. Introduction: The Currently Dominant View
1.1 Weyl on Leibniz's principle
1.2 Intermezzo: Terminology and Leibnizian principles
1.3 The rise of the currently dominant view
1.4 Overview
2. Elements of Quantum Mechanics
2.1 Physical states and physical magnitudes
2.2 Composite physical systems of similar particles
2.3 Fermions and bosons
2.4 Physical properties
2.5 Varieties of quantum mechanics
3. Analysis of Arguments
3.1 Analysis of the Standard Argument
3.2 Van Fraassen's analysis
3.3 Massimi's analysis
4. The Logic of Identity and Discernibility
4.1 The language of quantum mechanics
4.2 Identity of physical systems
4.3 Indiscernibility of physical systems
4.4 Some kinds of discernibility
5. Discerning Elementary Particles
5.1 Preamble
5.2 Fermions
5.3 Bosons
6. Concluding Discussion
Following Lewis, it is widely held that branching worlds differ in important ways from diverging worlds. There is, however, a simple and natural semantics under which ordinary sentences uttered in branching worlds have much the same truth values as they conventionally have in diverging worlds. Under this semantics, whether branching or diverging, speakers cannot say in advance which branch or world is theirs. They are uncertain as to the outcome. This same semantics ensures the truth of utterances typically made about quantum mechanical contingencies, including statements of uncertainty, if the Everett interpretation of quantum mechanics is true. The ‘incoherence problem’ of the Everett interpretation, that it can give no meaning to the notion of uncertainty, is thereby solved. Contents: Introduction; Metaphysics; Personal fission; Branching worlds; Physics; Objections.
A relationist will account for the use of ‘left’ and ‘right’ in terms of relative orientations, and other properties and relations invariant under mirroring. This analysis will apply whenever mirroring is a symmetry, so it certainly applies to classical mechanics; we argue it applies to any physical theory formulated on a manifold: it is in this sense an a priori symmetry. It should apply in particular to parity-violating theories in quantum mechanics; mirror symmetry is only broken in such theories as a special symmetry.
Particle indistinguishability has always been considered a purely quantum mechanical concept. In parallel, indistinguishable particles have been thought to be entities that are not properly speaking objects at all. I argue, to the contrary, that the concept can equally be applied to classical particles, and that in either case particles may (with certain exceptions) be counted as objects even though they are indistinguishable. The exceptions are elementary bosons (for example photons).
The concept of classical indistinguishability is analyzed and defended against a number of well-known criticisms, with particular attention to the Gibbs’ paradox. Granted that it is as much at home in classical as in quantum statistical mechanics, the question arises as to why indistinguishability, in quantum mechanics but not in classical mechanics, forces a change in statistics. The answer, illustrated with simple examples, is that the equilibrium measure on classical phase space is continuous, whilst on Hilbert space it is discrete. The relevance of names, or equivalently, properties stable in time that can be used as names, is also discussed.
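The counting difference behind the change in statistics can be sketched in a few lines (an illustrative toy example, not taken from the paper: two particles distributed over two one-particle states; the names `mb`, `be`, and `fd` are ours):

```python
from itertools import product, combinations_with_replacement, combinations

states = ["a", "b"]  # two one-particle states
n = 2                # two particles

# Distinguishable particles (Maxwell-Boltzmann): ordered assignments
mb = list(product(states, repeat=n))
# Indistinguishable bosons (Bose-Einstein): unordered, repeats allowed
be = list(combinations_with_replacement(states, n))
# Indistinguishable fermions (Fermi-Dirac): unordered, no two in one state
fd = list(combinations(states, n))

print(len(mb), len(be), len(fd))  # → 4 3 1
```

Distinguishable particles yield four equally weighted microstates, bosons three, and fermions one; equal weighting over these discrete quantum microstates is what departs from the continuous classical measure.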
Bohr’s interpretation of quantum mechanics has been criticized as incoherent and opportunistic, and based on doubtful philosophical premises. If so, Bohr’s influence in the pre-war period of 1927–1939 is the harder to explain, and the acceptance of his approach to quantum mechanics over de Broglie’s had no reasonable foundation. But Bohr’s interpretation changed little from the time of its first appearance, and stood independent of any philosophical presuppositions. The principle of complementarity is itself best read as a conjecture of unusually wide scope, on the nature and future course of explanations in the sciences (and not only the physical sciences). If it must be judged a failure today, it is not because of any internal inconsistency.
Tian Yu Cao has written a serious and scholarly book covering a great deal of physics. He ranges from classical relativity theory, both special and general, to relativistic quantum field theory, including non-Abelian gauge theory, renormalization theory, and symmetry-breaking, presenting a detailed and very rich picture of the mainstream developments in quantum physics; a remarkable feat. It has, moreover, a philosophical message: according to Cao, the development of these theories is inconsistent with a Kuhnian view of theory change, and better supports a qualified realism.
It is shown that the Hilbert-Bernays-Quine principle of identity of indiscernibles applies uniformly to all the contentious cases of symmetries in physics, including permutation symmetry in classical and quantum mechanics. It follows that there is no special problem with the notion of objecthood in physics. Leibniz's principle of sufficient reason is considered as well; this too applies uniformly. But given the new principle of identity, it no longer implies that space, or atoms, are unreal.
I fear I did not express myself as simply as I might have. My objection to Cao is that something must be ceded to Kuhn. One can of course try to oppose Kuhn's thesis root and branch, but to do so one had better counter his concrete examples, or one had better present an equally persuasive and wide-ranging history, but to a different effect. Perhaps Cao thinks his book provides just such an alternative, but alas, here a history of 20th century field theories simply doesn't cut any ice: it is a history of _normal_ science, in Kuhn's terms, at least the way Cao tells it, devoting no time at all to the development of quantum physics, and hardly any to the discovery of special relativity, the two really revolutionary steps in physics in the last century. True, there remains one other plausible example of revolutionary science, and on this Cao does have something to say: the quantization of gravity. But this revolution is still in the making; one simply doesn't know whether gravity will be accommodated along the lines Cao suggests; one just doesn't know if his “gauge field program”, “geometric program”, and “quantum field…
What is the meaning of general covariance? We learn something about it from the hole argument, due originally to Einstein. In his search for a theory of gravity, he noted that if the equations of motion are covariant under arbitrary coordinate transformations, then particle coordinates at a given time can be varied arbitrarily - they are underdetermined - even if their values at all earlier times are held fixed. It is the same for the values of fields. The argument can also be made out in terms of transformations acting on the points of the manifold, rather than on the coordinates assigned to the points. So the equations of motion do not fix the particle positions, or the values of fields at manifold points, or particle coordinates, or fields as functions of the coordinates, even when they are specified at all earlier times. It is surely the business of physics to predict these sorts of quantities, given their values at earlier times. The principle of general covariance therefore seems untenable.
Cao makes two claims of particular philosophical interest in his book "The Conceptual Development of 20th Century Field Theories": (1) the history of these developments refutes Kuhn's relativistic epistemology, and (tacitly) (2) the question of realism in quantum field theory can be addressed independently of one's views on the problem of measurement. I argue that Cao is right on the first score, although for reasons different from the ones he cites, but wrong on the second. In support of the first of these claims, I review in detail the correspondence between the treatment of critical phenomena in condensed matter physics and that of scaling in the renormalization group approach to RQFT.
Special relativity is most naturally formulated as a theory of spacetime geometry, but within the spacetime framework probability appears to be a purely epistemic notion. It is possible that progress can be made with rather different approaches - covariant stochastic equations, in particular - but the results to date are not encouraging. However, it seems a non-epistemic notion of probability can be made out in Minkowski space on Everett's terms. I shall work throughout with the consistent histories formalism. I shall start with a conservative interpretation, and then go on to Everett's.
Is tense real and objective? Can the fact that something is past, say, be wholly objective, consistent with special relativity? The answer is yes, but only so long as the distinction has no ontological ground. There is a closely related question. Is the contrast between the determinate and the indeterminate real and objective, consistent with relativity and quantum mechanics? The answer is again yes, but only if the contrast has no ontological ground. Various accounts of it are explored, according to different approaches to quantum mechanics. The Everett interpretation is much the most successful in accounting for it.
A variety of ideas arising in decoherence theory, and in the ongoing debate over Everett's relative-state theory, can be linked to issues in relativity theory and the philosophy of time, specifically the relational theory of tense and of identity over time. These have been systematically presented in companion papers (Saunders 1995; 1996a); in what follows we shall consider the same circle of ideas, but specifically in relation to the interpretation of probability, and its identification with relations in the Hilbert space norm. The familiar objection that Everett's approach yields probabilities different from quantum mechanics is easily dealt with. The more fundamental question is how to interpret these probabilities consistent with the relational theory of change, and the relational theory of identity over time. I shall show that the relational theory needs nothing more than the physical, minimal criterion of identity as defined by Everett's theory, and that this can be transparently interpreted in terms of the ordinary notion of the chance occurrence of an event, as witnessed in the present. It is in this sense that the theory has empirical content.
The relational approach to tense holds that the now, passage, and becoming are to be understood in terms of relations between events. The debate over the adequacy of this framework is illustrated by a comparative study of the sense in which physical theories, (in)deterministic and (non)relativistic, can lend expression to the metaphysics at issue. The objective is not to settle the matter, but to clarify the nature of this metaphysics and to establish that the same issues are at stake in the relational approach to value-definiteness and probability in quantum mechanics. They concern the existence of a unique present, respectively actuality, and a notion of identity over time that cannot be paraphrased in terms of relations.
State-reduction and the notion of actuality are compared to passage through time and the notion of the present; already in classical relativity the latter give rise to difficulties. The solution proposed here is to treat both tense and value-definiteness as relational properties, facts as relations; likewise the notions of change and probability. In both cases essential characteristics are absent: temporal relations are tenselessly true; probabilistic relations are deterministically true. The basic ideas go back to Everett, although the technical development makes use of the decoherent histories theory of Griffiths, Omnès, and Gell-Mann and Hartle. Alternative interpretations of the decoherent histories framework are also considered.
Debates over the significance of the particle concept, and the problem of locality (how do we represent localized phenomena?), appear to presuppose that particles and observed phenomena are things rather than events. Well-known theorems (Hegerfeldt, Reeh-Schlieder), and a recent variant of Hegerfeldt's theorem due to David Malament, present a problem of locality only given the tacit appeal to the concept of thing, in fact an individual, in a sense contrary to particle indistinguishability. There is no difficulty with the particle concept per se, but it is a global construction more than one step removed from events actually observed, which are represented by local integrals over self-adjoint field densities.
We review the decoherent histories approach to the interpretation of quantum mechanics. The Everett relative-state theory is reformulated in terms of decoherent histories. A model of evolutionary adaptation is shown to imply decoherence. A general interpretative framework is proposed: probability and value-definiteness are to have a similar status to the attribution of tense in classical spacetime theory.
A heuristic comparison is made of relativistic and non-relativistic quantum theory. To this end the Segal approach is described for the non-specialist. The significance of antimatter to the local and microcausal properties of the fields is laid bare. The fundamental difference between relativistic and non-relativistic (complex) fields is traced to the existence of two kinds of complex numbers in the relativistic case. Their relation to covariant and Newton-Wigner locality is formulated.
The vacuum is fast emerging as the central structure of modern physics. This collection brings together philosophically-minded specialists who engage these issues in the context of classical gravity, quantum electrodynamics, and the grand unification program. The vacuum emerges as the synthesis of concepts of space, time, and matter; in the context of relativity and the quantum this new synthesis represents a structure of the most intricate and novel complexity. This book is a work in modern metaphysics, in which the concepts of substance and space interweave in the most intangible of forms, the background and context of our physical experience: vacuum, void, or nothingness.
It is argued that Lewis's approach to Elga's Sleeping Beauty problem is untenable and, therefore, the universality of the betting approach to probability has not been breached.