What would it mean to apply quantum theory, without restriction and without involving any notion of measurement and state reduction, to the whole universe? What would realism about the quantum state then imply? This book brings together an illustrious team of philosophers and physicists to debate these questions. The contributors broadly agree on the need, or aspiration, for a realist theory that unites micro- and macro-worlds. But they disagree on what this implies. Some argue that if unitary quantum evolution has unrestricted application, and if the quantum state is taken to be something physically real, then this universe emerges from the quantum state as one of countless others, constantly branching in time, all of which are real. The result, they argue, is many worlds quantum theory, also known as the Everett interpretation of quantum mechanics. No other realist interpretation of unitary quantum theory has ever been found. Others argue in reply that this picture of many worlds is in no sense inherent to quantum theory, or fails to make physical sense, or is scientifically inadequate. The stuff of these worlds, what they are made of, is never adequately explained, nor are the worlds precisely defined; ordinary ideas about time and identity over time are compromised; no satisfactory role or substitute for probability can be found in many worlds theories; they can't explain experimental data; anyway, there are attractive realist alternatives to many worlds. Twenty original essays, accompanied by commentaries and discussions, examine these claims and counterclaims in depth.
They consider questions of ontology - the existence of worlds; probability - whether and how probability can be related to the branching structure of the quantum state; alternatives to many worlds - whether there are one-world realist interpretations of quantum theory that leave quantum dynamics unchanged; and open questions even given many worlds, including the multiverse concept as it has arisen elsewhere in modern cosmology. A comprehensive introduction lays out the main arguments of the book, which provides a state-of-the-art guide to many worlds quantum theory and its problems.
Particle indistinguishability has always been considered a purely quantum mechanical concept. In parallel, indistinguishable particles have been thought to be entities that are not properly speaking objects at all. I argue, to the contrary, that the concept can equally be applied to classical particles, and that in either case particles may (with certain exceptions) be counted as objects even though they are indistinguishable. The exceptions are elementary bosons (for example photons).
We demonstrate that the quantum-mechanical description of composite physical systems of an arbitrary number of similar fermions in all their admissible states, mixed or pure, for all finite-dimensional Hilbert spaces, is not in conflict with Leibniz's Principle of the Identity of Indiscernibles (PII). We discern the fermions by means of physically meaningful, permutation-invariant categorical relations, i.e. relations independent of the quantum-mechanical probabilities. If, indeed, probabilistic relations are permitted as well, we argue that similar bosons can also be discerned in all their admissible states; but their categorical discernibility turns out to be a state-dependent matter. In all demonstrated cases of discernibility, the fermions and the bosons are discerned (i) with only minimal assumptions on the interpretation of quantum mechanics; (ii) without appealing to metaphysical notions, such as Scotusian haecceitas, Lockean substrata, Postian transcendental individuality or Adamsian primitive thisness; and (iii) without revising the general framework of classical elementary predicate logic and standard set theory, thus without revising standard mathematics. This confutes: (a) the currently dominant view that, provided (i) and (ii), the quantum-mechanical description of such composite physical systems always conflicts with PII; and (b) the view that if PII can be saved at all, the only way to do it is by adopting one or other of the thick metaphysical notions mentioned above. Among the most general and influential arguments for the currently dominant view are those due to Schrödinger, Margenau, Cortes, Dalla Chiara, Di Francia, Redhead, French, Teller, Butterfield, Giuntini, Mittelstaedt, Castellani, Krause and Huggett. We review them succinctly and critically as well as related arguments by van Fraassen and Massimi.
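A standard illustration of permutation-invariant discernment of this kind (included here for orientation, not quoted from the abstract) is the two-fermion spin singlet, in which neither particle has any distinguishing monadic property, yet each is weakly discerned by an irreflexive relation:

```latex
% Two-fermion spin singlet: antisymmetric under permutation of particle labels
|\psi\rangle \;=\; \tfrac{1}{\sqrt{2}}\bigl(\,|{\uparrow}\rangle_1 \otimes |{\downarrow}\rangle_2
\;-\; |{\downarrow}\rangle_1 \otimes |{\uparrow}\rangle_2 \,\bigr)
% Neither particle has a definite spin direction, but the relation
% "has z-spin opposite to" holds between the two and fails reflexively:
% a categorical, permutation-invariant relation that discerns them.
```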
A variety of ideas arising in decoherence theory, and in the ongoing debate over Everett's relative-state theory, can be linked to issues in relativity theory and the philosophy of time, specifically the relational theory of tense and of identity over time. These have been systematically presented in companion papers (Saunders 1995; 1996a); in what follows we shall consider the same circle of ideas, but specifically in relation to the interpretation of probability, and its identification with relations in the Hilbert space norm. The familiar objection that Everett's approach yields probabilities different from quantum mechanics is easily dealt with. The more fundamental question is how to interpret these probabilities consistent with the relational theory of change, and the relational theory of identity over time. I shall show that the relational theory needs nothing more than the physical, minimal criterion of identity as defined by Everett's theory, and that this can be transparently interpreted in terms of the ordinary notion of the chance occurrence of an event, as witnessed in the present. It is in this sense that the theory has empirical content.
It is shown that the Hilbert-Bernays-Quine principle of identity of indiscernibles applies uniformly to all the contentious cases of symmetries in physics, including permutation symmetry in classical and quantum mechanics. It follows that there is no special problem with the notion of objecthood in physics. Leibniz's principle of sufficient reason is considered as well; this too applies uniformly. But given the new principle of identity, it no longer implies that space, or atoms, are unreal.
It is widely accepted that the notion of an inertial frame is central to Newtonian mechanics and that the correct space-time structure underlying Newton’s methods in Principia is neo-Newtonian or Galilean space-time. I argue to the contrary that inertial frames are not needed in Newton’s theory of motion, and that the right space-time structure for Newton’s Principia requires the notion of parallelism of spatial directions at different times and nothing more. Only relative motions are definable in this framework, never absolute ones.
Following Lewis, it is widely held that branching worlds differ in important ways from diverging worlds. There is, however, a simple and natural semantics under which ordinary sentences uttered in branching worlds have much the same truth values as they conventionally have in diverging worlds. Under this semantics, whether branching or diverging, speakers cannot say in advance which branch or world is theirs. They are uncertain as to the outcome. This same semantics ensures the truth of utterances typically made about quantum mechanical contingencies, including statements of uncertainty, if the Everett interpretation of quantum mechanics is true. The ‘incoherence problem’ of the Everett interpretation, that it can give no meaning to the notion of uncertainty, is thereby solved. Contents: Introduction; Metaphysics; Personal fission; Branching worlds; Physics; Objections.
This is a self-contained introduction to the Everett interpretation of quantum mechanics. It is the introductory chapter of Many Worlds? Everett, quantum theory, and reality, S. Saunders, J. Barrett, A. Kent, and D. Wallace, Oxford University Press.
According to the Everett interpretation, branching structure and ratios of norms of branch amplitudes are the objective correlates of chance events and chances; that is, 'chance' and 'chancing', like 'red' and 'colour', pick out objective features of reality, albeit not what they seemed. Once properly identified, questions about how and in what sense chances can be observed can be treated as straightforward dynamical questions. On that basis, given the unitary dynamics of quantum theory, it follows that relative and never absolute chances can be observed; that only on repetition of a large number of similar trials can relative probabilities be measured; and so on. The epistemology of objective chances can in this way be worked out from the dynamics. Its curious features are thus explained. But one aspect of chance set-ups seems to resist this subsuming of chancing to branching: how is it that chance involves uncertainty? And if that is not possible, on Everettian lines, then the whole project is doomed. I argue that in fact there is no difficulty in making sense of uncertainty in the face of branching. Contrary to initial impressions, the unitary formalism is consistent with a well-defined notion of self-locating uncertainty. It is also consistent without one: the mathematics under-determines the metaphysics in these respects.
State-reduction and the notion of actuality are compared to passage through time and the notion of the present; already in classical relativity the latter give rise to difficulties. The solution proposed here is to treat both tense and value-definiteness as relational properties, facts as relations; likewise the notions of change and probability. In both cases essential characteristics are absent: temporal relations are tenselessly true; probabilistic relations are deterministically true. The basic ideas go back to Everett, although the technical development makes use of the decoherent histories theory of Griffiths, Omnès, and Gell-Mann and Hartle. Alternative interpretations of the decoherent histories framework are also considered.
But this picture of a ‘block universe’, composed of a timeless web of ‘world-lines’ in a four-dimensional space, however strongly suggested by the theory of relativity, is a piece of gratuitous metaphysics. Since the concept of change, of something happening, is an inseparable component of the common-sense concept of time and a necessary component of the scientist's view of reality, it is quite out of the question that theoretical physics should require us to hold the Eleatic view that nothing happens in ‘the objective world’. Here, as so often in the philosophy of science, a useful limitation in the form of representation is mistaken for a deficiency of the universe.
The paper defends a view of structural realism similar to that of French and Ladyman, although it differs from theirs in an important respect: I do not take indistinguishability of particles in quantum mechanics to have the significance they think it has. It also differs from Cao's view of structural realism, criticized in my "Critical Notice: Cao's 'The Conceptual Development of 20th Century Field Theories'".
The concept of classical indistinguishability is analyzed and defended against a number of well-known criticisms, with particular attention to the Gibbs’ paradox. Granted that it is as much at home in classical as in quantum statistical mechanics, the question arises as to why indistinguishability, in quantum mechanics but not in classical mechanics, forces a change in statistics. The answer, illustrated with simple examples, is that the equilibrium measure on classical phase space is continuous, whilst on Hilbert space it is discrete. The relevance of names, or equivalently, properties stable in time that can be used as names, is also discussed.
We review the decoherent histories approach to the interpretation of quantum mechanics. The Everett relative-state theory is reformulated in terms of decoherent histories. A model of evolutionary adaptation is shown to imply decoherence. A general interpretative framework is proposed: probability and value-definiteness are to have a similar status to the attribution of tense in classical spacetime theory.
The Everett interpretation of quantum mechanics divides naturally into two parts: first, the interpretation of the structure of the quantum state, in terms of branching, and second, the interpretation of this branching structure in terms of probability. This is the second of two reviews of the Everett interpretation, and focuses on probability. Branching processes are identified as chance processes, and the squares of branch amplitudes are chances. Since branching is emergent, physical probability is emergent as well.
The Born rule is derived from operational assumptions, together with assumptions of quantum mechanics that concern only the deterministic development of the state. Unlike Gleason’s theorem, the argument applies even if probabilities are defined for only a single resolution of the identity, so it applies to a variety of foundational approaches to quantum mechanics. It also provides a probability rule for state spaces that are not Hilbert spaces.
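For reference, the rule being derived can be stated in its standard textbook form (this formulation is mine, not the abstract's), for a resolution of the identity into projectors:

```latex
% Born rule for a state |\psi\rangle and a resolution of the identity \{P_i\}:
\Pr(i) \;=\; \frac{\langle \psi | P_i | \psi \rangle}{\langle \psi | \psi \rangle},
\qquad \sum_i P_i \;=\; \mathbb{1}.
% For a non-degenerate observable with eigenbasis \{|i\rangle\},
% P_i = |i\rangle\langle i| and, for normalized |\psi\rangle,
% \Pr(i) = |\langle i | \psi \rangle|^2.
```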
The Everett interpretation of quantum mechanics divides naturally into two parts: first, the interpretation of the structure of the quantum state, in terms of branching, and second, the interpretation of this branching structure in terms of probability. This is the first of two reviews of the Everett interpretation, and focuses on structure, with particular attention to the role of decoherence theory. Written in terms of the quantum histories formalism, decoherence theory just is the theory of branching structure, in Everett's sense.
Probabilities may be subjective or objective; we are concerned with both kinds of probability, and the relationship between them. The fundamental theory of objective probability is quantum mechanics: it is argued that neither Bohr's Copenhagen interpretation, nor the pilot-wave theory, nor stochastic state-reduction theories, give a satisfactory answer to the question of what objective probabilities are in quantum mechanics, or why they should satisfy the Born rule; nor do they give any reason why subjective probabilities should track objective ones. But it is shown that if probability only arises with decoherence, then they must be given by the Born rule. That further, on the Everett interpretation, we have a clear statement of what probabilities are, in terms of purely categorical physical properties; and finally, along lines laid out by Deutsch and Wallace, that there is a clear basis in the axioms of decision theory as to why subjective probabilities should track these objective ones. These results hinge critically on the absence of hidden-variables or any other mechanism (such as state-reduction) from the physical interpretation of the theory. The account of probability has traditionally been considered the principal weakness of the Everett interpretation; on the contrary it emerges as one of its principal strengths.
The relational approach to tense holds that the now, passage, and becoming are to be understood in terms of relations between events. The debate over the adequacy of this framework is illustrated by a comparative study of the sense in which physical theories, (in)deterministic and (non)relativistic, can lend expression to the metaphysics at issue. The objective is not to settle the matter, but to clarify the nature of this metaphysics and to establish that the same issues are at stake in the relational approach to value-definiteness and probability in quantum mechanics. They concern the existence of a unique present, respectively actuality, and a notion of identity over time that cannot be paraphrased in terms of relations.
What is the meaning of general covariance? We learn something about it from the hole argument, due originally to Einstein. In his search for a theory of gravity, he noted that if the equations of motion are covariant under arbitrary coordinate transformations, then particle coordinates at a given time can be varied arbitrarily - they are underdetermined - even if their values at all earlier times are held fixed. It is the same for the values of fields. The argument can also be made out in terms of transformations acting on the points of the manifold, rather than on the coordinates assigned to the points. So the equations of motion do not fix the particle positions, or the values of fields at manifold points, or particle coordinates, or fields as functions of the coordinates, even when they are specified at all earlier times. It is surely the business of physics to predict these sorts of quantities, given their values at earlier times. The principle of general covariance therefore seems untenable.
The vacuum is fast emerging as the central structure of modern physics. This collection brings together philosophically-minded specialists who engage these issues in the context of classical gravity, quantum electrodynamics, and the grand unification program. The vacuum emerges as the synthesis of concepts of space, time, and matter; in the context of relativity and the quantum this new synthesis represents a structure of the most intricate and novel complexity. This book is a work in modern metaphysics, in which the concepts of substance and space interweave in the most intangible of forms, the background and context of our physical experience: vacuum, void, or nothingness.
The concept of indistinguishable particles in quantum theory is fundamental to questions of ontology. All ordinary matter is made of electrons, protons, neutrons, and photons and they are all indistinguishable particles. Yet the concept itself has proved elusive, in part because of the interpretational difficulties that afflict quantum theory quite generally, and in part because the concept was so central to the discovery of the quantum itself, by Planck in 1900; it came encumbered with revolution. I offer a deflationary reading of the concept ‘indistinguishable’ that is identical to Gibbs’ concept ‘generic phase’, save that it is defined for state spaces with only finitely-many states of bounded volume and energy. That, and that alone, makes for the difference between the quantum and Gibbs concepts of indistinguishability. This claim is heretical on several counts, but here we consider only the content of the claim itself, and its bearing on the early history of quantum theory rather than in relation to contemporary debates about particle indistinguishability and permutation symmetry. It powerfully illuminates that history.
The problem of measurement is usually thought of as a problem of physics. Certainly the straightforward solutions modify or supplement the basic equations. One might instead conclude that theories are only instruments for coördinating observable phenomena, and perhaps that is what Bohr’s Copenhagen interpretation really comes down to. For most of us this sort of “philosophical” resolution of the problem of measurement is not acceptable.
Bohr’s interpretation of quantum mechanics has been criticized as incoherent and opportunistic, and based on doubtful philosophical premises. If so, Bohr’s influence, in the pre-war period of 1927–1939, is the harder to explain, and the acceptance of his approach to quantum mechanics over de Broglie’s had no reasonable foundation. But Bohr’s interpretation changed little from the time of its first appearance, and stood independent of any philosophical presuppositions. The principle of complementarity is itself best read as a conjecture of unusually wide scope, on the nature and future course of explanations in the sciences (and not only the physical sciences). If it must be judged a failure today, it is not because of any internal inconsistency.
This is a systematic review of the concept of indistinguishability, in both classical and quantum mechanics, with particular attention to the Gibbs paradox. Section 1 is on the Gibbs paradox; section 2 is a defense of classical indistinguishability, notwithstanding the widely-held view that classical particles can always be distinguished by their trajectories. The last section is about the notion of object more generally, and on whether indistinguishables should be thought of as objects at all.
A heuristic comparison is made of relativistic and non-relativistic quantum theory. To this end the Segal approach is described for the non-specialist. The significance of antimatter to the local and microcausal properties of the fields is laid bare. The fundamental difference between relativistic and non-relativistic (complex) fields is traced to the existence of two kinds of complex numbers in the relativistic case. Their relation to covariant and Newton-Wigner locality is formulated.
Tian Yu Cao has written a serious and scholarly book covering a great deal of physics. He ranges from classical relativity theory, both special and general, to relativistic quantum field theory, including non-Abelian gauge theory, renormalization theory, and symmetry-breaking, presenting a detailed and very rich picture of the mainstream developments in quantum physics; a remarkable feat. It has, moreover, a philosophical message: according to Cao, the development of these theories is inconsistent with a Kuhnian view of theory change, and supports better a qualified realism.
A relationist will account for the use of ‘left’ and ‘right’ in terms of relative orientations, and other properties and relations invariant under mirroring. This analysis will apply whenever mirroring is a symmetry, so it certainly applies to classical mechanics; we argue it applies to any physical theory formulated on a manifold: it is in this sense an a priori symmetry. It should apply in particular to parity violating theories in quantum mechanics; mirror symmetry is only broken in such theories as a special symmetry.
It is argued that Lewis's approach to Elga's Sleeping Beauty problem is untenable and, therefore, the universality of the betting approach to probability has not been breached.
The Gibbs Paradox is essentially a set of open questions as to how sameness of gases or fluids is to be treated in thermodynamics and statistical mechanics. They have a variety of answers, some restricted to quantum theory, some to classical theory. The solution offered here applies to both in equal measure, and is based on the concept of particle indistinguishability. Correctly understood, it is the elimination of sequence position as a labelling device, where sequences enter at the level of the tensor product of one-particle state spaces. In both cases it amounts to passing to the quotient space under permutations. ‘Distinguishability’, in the sense in which it is usually used in classical statistical mechanics, is a mathematically convenient, but physically muddled, fiction.
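A minimal sketch of the classical side of this move (standard textbook material, not drawn from the abstract): passing to the quotient of phase space under permutations is equivalent to dividing the partition function by N!, and it is exactly this factor that makes the entropy extensive:

```latex
% N indistinguishable classical particles: pass to the quotient
% \Gamma^N / S_N, equivalently divide the partition function by N!:
Z_N \;=\; \frac{1}{N!\,h^{3N}} \int e^{-\beta H(p,q)}\, d^{3N}\!p\, d^{3N}\!q .
% Without the 1/N!, the ideal-gas entropy acquires a non-extensive term
% of order N k_B \ln N, and "mixing" two samples of the same gas appears
% to increase the entropy: the Gibbs paradox.
```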
Debates over the significance of the particle concept, and the problem of locality-how do we represent localized phenomena?-appear to presuppose that particles and observed phenomena are things rather than events. Well-known theorems (Hegerfeldt, Reeh-Schlieder), and a recent variant of Hegerfeldt's theorem due to David Malament, present a problem of locality only given the tacit appeal to the concept of thing, in fact an individual, in a sense contrary to particle indistinguishability. There is no difficulty with the particle concept per se, but it is a global construction more than one step removed from events actually observed, which are represented by local integrals over self-adjoint field densities.
Is tense real and objective? Can the fact that something is past, say, be wholly objective, consistent with special relativity? The answer is yes, but only so long as the distinction has no ontological ground. There is a closely related question. Is the contrast between the determinate and the indeterminate real and objective, consistent with relativity and quantum mechanics? The answer is again yes, but only if the contrast has no ontological ground. Various accounts of it are explored, according to different approaches to quantum mechanics. The Everett interpretation is much the most successful in accounting for it.
A reply to a comment by Paul Tappenden (BJPS 59 (2008) pp. 307-314) on S. Saunders and D. Wallace, "Branching and Uncertainty" (BJPS 59 (2008) pp. 298-306).
Following a long-term international collaboration between leaders in cosmology and the philosophy of science, this volume addresses foundational questions at the limit of science across these disciplines, questions raised by observational and theoretical progress in modern cosmology. Space missions have mapped the Universe up to its early instants, opening up questions on what came before the Big Bang, the nature of space and time, and the quantum origin of the Universe. As the foundational volume of an emerging academic discipline, experts from relevant fields lay out the fundamental problems of contemporary cosmology and explore the routes toward finding possible solutions. Written for graduates and researchers in physics and philosophy, the volume also makes particular efforts to inform academics from other fields, as well as the educated public, who wish to understand our modern vision of the Universe, related philosophical questions, and the significant impacts on scientific methodology.
Identity. From very early days of quantum theory it was recognized that quanta were statistically strange (see →Bose-Einstein statistics). Suspicion fell on the identity of quanta, of how they are to be counted [1], [2]. It was not until Dirac’s [1902-1984] work of 1926 (and his discovery of →Fermi-Dirac statistics [3]) that the nature of the novelty was clear: the quantum state of exactly similar particles of the same mass, charge, and spin must be symmetrized, yielding states either symmetric or antisymmetric under permutations. This is the symmetry postulate (SP).
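The symmetry postulate referred to can be written out explicitly (a standard formulation, added here for reference): for any permutation of N similar particles,

```latex
% Symmetry postulate (SP): for any permutation \pi \in S_N of particle labels,
U(\pi)\,|\Psi\rangle \;=\; |\Psi\rangle \quad \text{(bosons)},
\qquad
U(\pi)\,|\Psi\rangle \;=\; \mathrm{sgn}(\pi)\,|\Psi\rangle \quad \text{(fermions)}.
% For two particles in one-particle states |a\rangle, |b\rangle:
|\Psi\rangle_{\pm} \;=\; \tfrac{1}{\sqrt{2}}\bigl(|a\rangle\otimes|b\rangle
\;\pm\; |b\rangle\otimes|a\rangle\bigr).
```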
Special relativity is most naturally formulated as a theory of spacetime geometry, but within the spacetime framework probability appears to be a purely epistemic notion. It is possible that progress can be made with rather different approaches - covariant stochastic equations, in particular - but the results to date are not encouraging. However, it seems a non-epistemic notion of probability can be made out in Minkowski space on Everett's terms. I shall work throughout with the consistent histories formalism. I shall start with a conservative interpretation, and then go on to Everett's.
Fermi-Dirac statistics are one of two kinds of statistics exhibited by →identical quantum particles, the other being →Bose-Einstein statistics. Such particles are called fermions and bosons respectively (the terminology is due to Dirac [1902-1984] [1]). In the light of the →spin-statistics theorem, and consistent with observation, fermions are invariably spinors (of half-integral spin), whilst bosons are invariably scalar or vector particles (of integral spin). See →spin.