According to Bayesian epistemology, the epistemically rational agent updates her beliefs by conditionalization: that is, her posterior subjective probability after taking account of evidence X, p_new, is to be set equal to her prior conditional probability p_old(·|X). Bayesians can be challenged to provide a justification for their claim that conditionalization is recommended by rationality—whence the normative force of the injunction to conditionalize? There are several existing justifications for conditionalization, but none directly addresses the idea that conditionalization will be epistemically rational if and only if it can reasonably be expected to lead to epistemically good outcomes. We apply the approach of cognitive decision theory to provide a justification for conditionalization using precisely that idea. We assign epistemic utility functions to epistemically rational agents; an agent’s epistemic utility is to depend both upon the actual state of the world and on the agent’s credence distribution over possible states. We prove that, under independently motivated conditions, conditionalization is the unique updating rule that maximizes expected epistemic utility.
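The updating rule at issue can be sketched in a few lines of code. The hypotheses, prior values, and likelihoods below are illustrative assumptions for a toy discrete setting, not drawn from the paper itself:

```python
def conditionalize(prior, likelihood):
    """Return the posterior p_new(h) = p_old(h | X), computed via Bayes' theorem.

    prior: dict mapping each hypothesis h to p_old(h)
    likelihood: dict mapping each hypothesis h to p_old(X | h)
    """
    # Joint probability p_old(h and X) for each hypothesis.
    joint = {h: prior[h] * likelihood[h] for h in prior}
    # Total probability of the evidence X.
    p_x = sum(joint.values())
    # Conditionalize: renormalize the joint probabilities.
    return {h: joint[h] / p_x for h in joint}


# Toy example: two hypotheses about a coin, evidence X = "the coin landed heads".
prior = {"fair": 0.5, "biased": 0.5}
likelihood_heads = {"fair": 0.5, "biased": 0.9}  # p_old(X | h)
posterior = conditionalize(prior, likelihood_heads)
# posterior["biased"] = (0.5 * 0.9) / (0.5 * 0.5 + 0.5 * 0.9) = 0.45 / 0.7
```

This is just the arithmetic of conditionalization; the paper's contribution is the expected-epistemic-utility argument for why this rule, rather than some rival, is the rational one.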
David Wallace argues that we should take quantum theory seriously as an account of what the world is like—which means accepting the idea that the universe is constantly branching into new universes. He presents an accessible but rigorous account of the 'Everett interpretation', the best way to make coherent sense of quantum physics.
What would it mean to apply quantum theory, without restriction and without involving any notion of measurement and state reduction, to the whole universe? What would realism about the quantum state then imply? This book brings together an illustrious team of philosophers and physicists to debate these questions. The contributors broadly agree on the need, or aspiration, for a realist theory that unites micro- and macro-worlds. But they disagree on what this implies. Some argue that if unitary quantum evolution has unrestricted application, and if the quantum state is taken to be something physically real, then this universe emerges from the quantum state as one of countless others, constantly branching in time, all of which are real. The result, they argue, is many worlds quantum theory, also known as the Everett interpretation of quantum mechanics. No other realist interpretation of unitary quantum theory has ever been found. Others argue in reply that this picture of many worlds is in no sense inherent to quantum theory, or fails to make physical sense, or is scientifically inadequate. The stuff of these worlds, what they are made of, is never adequately explained, nor are the worlds precisely defined; ordinary ideas about time and identity over time are compromised; no satisfactory role or substitute for probability can be found in many worlds theories; they can't explain experimental data; anyway, there are attractive realist alternatives to many worlds. Twenty original essays, accompanied by commentaries and discussions, examine these claims and counterclaims in depth.
They consider questions of ontology - the existence of worlds; probability - whether and how probability can be related to the branching structure of the quantum state; alternatives to many worlds - whether there are one-world realist interpretations of quantum theory that leave quantum dynamics unchanged; and open questions even given many worlds, including the multiverse concept as it has arisen elsewhere in modern cosmology. A comprehensive introduction lays out the main arguments of the book, which provides a state-of-the-art guide to many worlds quantum theory and its problems.
A systematic analysis is made of the relations between the symmetries of a classical field and the symmetries of the one-particle quantum system that results from quantizing that field in regimes where interactions are weak. The results are applied to gain a greater insight into the phenomenon of antimatter.
An examination is made of the way in which particles emerge from linear, bosonic, massive quantum field theories. Two different constructions of the one-particle subspace of such theories are given, both illustrating the importance of the interplay between the quantum-mechanical linear structure and the classical one. Some comments are made on the Newton-Wigner representation of one-particle states, and on the relationship between the approach of this paper and those of Segal, and of Haag and Ruelle.
I analyse the conceptual and mathematical foundations of Lagrangian quantum field theory (QFT) (that is, the 'naive' QFT used in mainstream physics, as opposed to algebraic quantum field theory). The objective is to see whether Lagrangian QFT has a sufficiently firm conceptual and mathematical basis to be a legitimate object of foundational study, or whether it is too ill-defined. The analysis covers renormalisation and infinities, inequivalent representations, and the concept of localised states; the conclusion is that Lagrangian QFT (at least as described here) is a perfectly respectable physical theory, albeit somewhat different in certain respects from most of those studied in foundational work.
I argue against the currently prevalent view that algebraic quantum field theory (AQFT) is the correct framework for philosophy of quantum field theory and that “conventional” quantum field theory (CQFT), of the sort used in mainstream particle physics, is not suitable for foundational study. In doing so, I defend the position that AQFT and CQFT should be understood as rival programs to resolve the mathematical and physical pathologies of renormalization theory, and that CQFT has succeeded in this task and AQFT has failed. I also defend CQFT from recent criticisms made by Doreen Fraser.
What ontology does realism about the quantum state suggest? The main extant view in contemporary philosophy of physics is wave-function realism. We elaborate the sense in which wave-function realism does provide an ontological picture, and defend it from certain objections that have been raised against it. However, there are good reasons to be dissatisfied with wave-function realism, as we go on to elaborate. This motivates the development of an opposing picture: what we call spacetime state realism, a view which takes the states associated to spacetime regions as fundamental. This approach enjoys a number of beneficial features, although, unlike wave-function realism, it involves non-separability at the level of fundamental ontology. We investigate the pros and cons of this non-separability, arguing that it is a quite acceptable feature, even one which proves fruitful in the context of relativistic covariance. A companion paper discusses the prospects for combining a spacetime-based ontology with separability, along lines suggested by Deutsch and Hayden.
I address the problem of indefiniteness in quantum mechanics: the problem that the theory, without changes to its formalism, seems to predict that macroscopic quantities have no definite values. The Everett interpretation is often criticised along these lines, and I shall argue that much of this criticism rests on a false dichotomy: that the macroworld must either be written directly into the formalism or be regarded as somehow illusory. By means of analogy with other areas of physics, I develop the view that the macroworld is instead to be understood in terms of certain structures and patterns which emerge from quantum theory (given appropriate dynamics, in particular decoherence). I extend this view to the observer, and in doing so make contact with functionalist theories of mind.
I present a proof of the quantum probability rule from decision-theoretic assumptions, in the context of the Everett interpretation. The basic ideas behind the proof are those presented in Deutsch's recent proof of the probability rule, but the proof is simpler and proceeds from weaker decision-theoretic assumptions. This makes it easier to discuss the conceptual ideas involved in the proof, and to show that they are defensible.
I consider exactly what is involved in a solution to the probability problem of the Everett interpretation, in the light of recent work on applying considerations from decision theory to that problem. I suggest an overall framework for understanding probability in a physical theory, and conclude that this framework, when applied to the Everett interpretation, yields the result that that interpretation satisfactorily solves the measurement problem.
Introduction
What is probability? 2.1 Objective probability and the Principal Principle 2.2 Three ways of satisfying the functional definition 2.3 Cautious functionalism 2.4 Is the functional definition complete?
The Everett interpretation and subjective uncertainty 3.1 Interpreting quantum mechanics 3.2 The need for subjective uncertainty 3.3 Saunders' argument for subjective uncertainty 3.4 Objections to Saunders' argument 3.5 Subjective uncertainty again: arguments from interpretative charity 3.6 Quantum weights and the functional definition of probability
Rejecting subjective uncertainty 4.1 The fission program 4.2 Against the fission program
Justifying the axioms of decision theory 5.1 The primitive status of the decision-theoretic axioms 5.2 Holistic scepticism 5.3 The role of an explanation of decision theory
Conclusion
Following Lewis, it is widely held that branching worlds differ in important ways from diverging worlds. There is, however, a simple and natural semantics under which ordinary sentences uttered in branching worlds have much the same truth values as they conventionally have in diverging worlds. Under this semantics, whether branching or diverging, speakers cannot say in advance which branch or world is theirs. They are uncertain as to the outcome. This same semantics ensures the truth of utterances typically made about quantum mechanical contingencies, including statements of uncertainty, if the Everett interpretation of quantum mechanics is true. The 'incoherence problem' of the Everett interpretation, that it can give no meaning to the notion of uncertainty, is thereby solved.
Introduction · Metaphysics · Personal fission · Branching worlds · Physics · Objections
An analysis is made of Deutsch's recent claim to have derived the Born rule from decision-theoretic assumptions. It is argued that Deutsch's proof must be understood in the explicit context of the Everett interpretation, and that in this context, it essentially succeeds. Some comments are made about the criticism of Deutsch's proof by Barnum, Caves, Finkelstein, Fuchs, and Schack; it is argued that the flaw which they point out in the proof does not apply if the Everett interpretation is assumed.
Spontaneous symmetry breaking (SSB) in quantum systems, such as ferromagnets, is normally described as degeneracy of the ground state; however, it is well established that this degeneracy only occurs in spatially infinite systems, and even better established that ferromagnets are not spatially infinite. I review this well-known paradox, and consider a popular solution where the symmetry is explicitly broken by some external field which goes to zero in the infinite-volume limit; although this is formally satisfactory, I argue that it must be rejected as a physical explanation of SSB since it fails to reproduce some important features of the phenomenology. Motivated by considerations from the analogous classical system, I argue that SSB in finite systems should be understood in terms of the approximate decoupling of the system's state space into dynamically-isolated sectors, related by a symmetry transformation; I use the formalism of decoherent histories to make this more precise and to quantify the effect, showing that it is more than sufficient to explain SSB in realistic systems and that it goes over in a smooth and natural way to the infinite limit.
This is a discussion of how we can understand the world-view given to us by the Everett interpretation of quantum mechanics, and in particular the role played by the concept of 'world'. The view presented is that we are entitled to use 'many-worlds' terminology even if the theory does not specify the worlds in the formalism; this is defended by means of an extensive analogy with the concept of an 'instant' or moment of time in relativity, with the lack of a preferred foliation of spacetime being compared with the lack of a preferred basis in quantum theory. Implications for identity of worlds over time, and for relativistic quantum mechanics, are discussed.
An investigation is made into how the foundations of statistical mechanics are affected once we treat classical mechanics as an approximation to quantum mechanics in certain domains rather than as a theory in its own right; this is necessary if we are to understand statistical-mechanical systems in our own world. Relevant structural and dynamical differences are identified between classical and quantum mechanics (partly through analysis of technical work on quantum chaos by other authors). These imply that quantum mechanics significantly affects a number of foundational questions, including the nature of statistical probability and the direction of time.
NGC 1300 (shown in figure 1) is a spiral galaxy 65 million light years from Earth. We have never been there, and (although I would love to be wrong about this) we will never go there; all we will ever know about NGC 1300 is what we can see of it from sixty-five million light years away, and what we can infer from our best physics. Fortunately, “what we can infer from our best physics” is actually quite a lot. To take a particular example: our best theory of galaxies tells us that that hazy glow is actually made up of the light of hundreds of billions of stars; our best theories of planetary formation tell us that a sizable fraction of those stars…
This is a preliminary version of an article to appear in the forthcoming Ashgate Companion to the New Philosophy of Physics. In it, I aim to review, in a way accessible to foundationally interested physicists as well as physics-informed philosophers, just where we have got to in the quest for a solution to the measurement problem. I don't advocate any particular approach to the measurement problem (not here, at any rate!) but I do focus on the importance of decoherence theory to modern attempts to solve the measurement problem, and I am fairly sharply critical of some aspects of the "traditional" formulation of the problem.
It seems to be widely assumed that the only effect of the Ghirardi-Rimini-Weber dynamical collapse mechanism on the `tails' of the wavefunction is to reduce their weight. In consequence it seems to be generally accepted that the tails behave exactly as do the various branches in the Everett interpretation except for their much lower weight. These assumptions are demonstrably inaccurate: the collapse mechanism has substantial and detectable effects within the tails. The relevance of this misconception for the dynamical-collapse theories is debatable, though.
Coordinate-based approaches to physical theories remain standard in mainstream physics but are largely eschewed in foundational discussion in favour of coordinate-free differential-geometric approaches. I defend the conceptual and mathematical legitimacy of the coordinate-based approach for foundational work. In doing so, I provide an account of the Kleinian conception of geometry as a theory of invariance under symmetry groups; I argue that this conception continues to play a very substantial role in contemporary mathematical physics and indeed that supposedly "coordinate-free" differential geometry relies centrally on this conception of geometry. I discuss some foundational and pedagogical advantages of the coordinate-based formulation and briefly connect it to some remarks of Norton on the historical development of geometry in physics during the establishment of the general theory of relativity.
I point out a radical indeterminism in potential-based formulations of Newtonian gravity once we drop the condition that the potential vanishes at infinity. This indeterminism, which is well known in theoretical cosmology but has received little attention in foundational discussions, can be removed only by specifying boundary conditions at all instants of time, which undermines the theory's claim to be fully cosmological, i.e., to apply to the Universe as a whole. A recent alternative formulation of Newtonian gravity due to Saunders provides a conceptually satisfactory cosmology but fails to reproduce the Newtonian limit of general relativity in homogeneous but anisotropic universes. I conclude that Newtonian gravity lacks a fully satisfactory cosmological formulation.
I argue against the currently-prevalent view in philosophy of physics that algebraic quantum field theory is the correct framework for philosophy of quantum field theory and that "conventional" quantum field theory, of the sort used in mainstream particle physics, is not suitable for foundational study. In doing so, I defend the position that AQFT and CQFT, understood in an appropriate sense, ought to be understood as rival programs to resolve the mathematical and physical pathologies of renormalization theory, and that CQFT has succeeded in this task and AQFT has failed. I also defend CQFT from recent criticisms made by Doreen Fraser.
I discuss the statistical mechanics of gravitating systems and in particular its cosmological implications, and argue that many conventional views on this subject in the foundations of statistical mechanics embody significant confusion; I attempt to provide a clearer and more accurate account. In particular, I observe that (i) the role of gravity in entropy calculations must be distinguished from the entropy of gravity, that (ii) although gravitational collapse is entropy-increasing, this is not usually because the collapsing matter itself increases in entropy, and that (iii) the Second Law of thermodynamics does not owe its validity to the statistical mechanics of gravitational collapse.
An extended analysis is given of the program, originally suggested by Deutsch, of solving the probability problem in the Everett interpretation by means of decision theory. Deutsch's own proof is discussed, and alternatives are presented which are based upon different decision theories and upon Gleason's Theorem. It is argued that decision theory gives Everettians most or all of what they need from 'probability'. Contact is made with Lewis's Principal Principle linking subjective credence with objective chance: an Everettian Principal Principle is formulated, and shown to be at least as defensible as the usual Principle. Some consequences of (Everettian) quantum mechanics for decision theory itself are also discussed.
[NB: this long (70 pages) and occasionally rambling online-only paper has been almost entirely superseded by material in subsequent papers; if something is not included in them it usually means that I have had second thoughts. I include it for completeness only.]
I develop the decision-theoretic approach to quantum probability, originally proposed by David Deutsch, into a mathematically rigorous proof of the Born rule in (Everett-interpreted) quantum mechanics. I sketch the argument informally, then prove it formally, and lastly consider a number of proposed "counter-examples" to show exactly which premises of the argument they violate. (This is a preliminary version of a chapter to appear, under the title "How to prove the Born Rule", in Saunders, Barrett, Kent and Wallace, "Many worlds? Everett, quantum theory and reality", forthcoming from Oxford University Press.)
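For readers unfamiliar with it, the rule whose status is at issue here, the Born rule, can be stated very compactly in a toy finite-dimensional case: outcome probabilities are the squared moduli of the (normalised) amplitudes. The sketch below illustrates only what the rule asserts, not the decision-theoretic derivation of it; the particular state is an arbitrary example:

```python
import math


def born_probabilities(amplitudes):
    """Map a list of complex amplitudes to Born-rule probabilities |a_i|^2 / ||a||^2."""
    norm_sq = sum(abs(a) ** 2 for a in amplitudes)
    return [abs(a) ** 2 / norm_sq for a in amplitudes]


# An (unnormalised) illustrative state with three branches/outcomes:
probs = born_probabilities([1 + 0j, 1j, math.sqrt(2) + 0j])
# squared moduli are 1, 1, 2, so the probabilities are approximately [0.25, 0.25, 0.5]
```

The decision-theoretic program discussed in the abstract aims to show that a rational agent in an Everettian universe must set her credences equal to these squared-modulus weights.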
I investigate the consequences for semantics, and in particular for the semantics of tense, if time is assumed to have a branching structure not out of metaphysical necessity (to solve some philosophical problem) but just as a contingent physical fact, as is suggested by a currently-popular approach to the interpretation of quantum mechanics.
The quantum theory of de Broglie and Bohm solves the measurement problem, but the hypothetical corpuscles play no role in the argument. The solution finds a more natural home in the Everett interpretation.
Mathematically, gauge theories are extraordinarily rich—so rich, in fact, that it can become all too easy to lose track of the connections between results, and become lost in a mass of beautiful theorems and properties: indeterminism, constraints, Noether identities, local and global symmetries, and so on.
One purpose of this short article is to provide some sort of a guide through the mathematics, to the conceptual core of what is actually going on. Its focus is on the Lagrangian, variational-problem description of classical mechanics, from which the link between gauge symmetry and the apparent violation of determinism is easy to understand; only towards the end will the Hamiltonian description be considered.
The other purpose is to warn against adopting too unified a perspective on gauge theories. It will be argued that the meaning of the gauge freedom in a theory like general relativity is (at least from the Lagrangian viewpoint) significantly different from its meaning in theories like electromagnetism. The Hamiltonian framework blurs this distinction, and orthodox methods of quantization obliterate it; this may, in fact, be genuine progress, but it is dangerous to be guided by mathematics into conflating two conceptually distinct notions without appreciating the physical consequences.
What is called "orthodox" quantum mechanics, as presented in standard foundational discussions, relies on two substantive assumptions—the projection postulate and the eigenvalue-eigenvector link—that do not in fact play any part in practical applications of quantum mechanics. I argue for this conclusion on a number of grounds, but primarily on the grounds that the projection postulate fails correctly to account for repeated, continuous and unsharp measurements and that the eigenvalue-eigenvector link implies that virtually all interesting properties are maximally indefinite pretty much always. I present an alternative way of conceptualising quantum mechanics that does a better job of representing quantum mechanics as it is actually used, and in particular that eliminates use of either the projection postulate or the eigenvalue-eigenvector link, and I reformulate the measurement problem within this new presentation of orthodoxy.
Deutsch and Hayden have proposed an alternative formulation of quantum mechanics which is completely local. We argue that their proposal must be understood as having a form of ‘gauge freedom’ according to which mathematically distinct states are physically equivalent. Once this gauge freedom is taken into account, their formulation is no longer local.
I attempt to get as clear as possible on the chain of reasoning by which irreversible macrodynamics is derivable from time-reversible microphysics, and in particular to clarify just what kinds of assumptions about the initial state of the universe, and about the nature of the microdynamics, are needed in these derivations. I conclude that while a “Past Hypothesis” about the early Universe does seem necessary to carry out such derivations, that Hypothesis is not correctly understood as a constraint on the early Universe’s entropy.
The relation between micro-objects and macro-objects advocated by Kim is even more problematic than Ross & Spurrett (R&S) argue, for reasons rooted in physics. R&S's own ontological proposals are much more satisfactory from a physicist's viewpoint but may still be problematic. A satisfactory theory of macroscopic ontology must be as independent as possible of the details of microscopic physics.
The Everett interpretation of quantum mechanics - better known as the Many-Worlds Theory - has had a rather uneven reception. Mainstream philosophers have scarcely heard of it, save as science fiction. In philosophy of physics it is well known but has historically been fairly widely rejected. Among physicists, it is taken very seriously indeed, arguably tied for first place in popularity with more traditional operationalist views of quantum mechanics. In this article, I provide a fairly short and self-contained introduction to the Everett interpretation as it is currently understood. I use little technical machinery, although I do assume the reader has encountered the measurement problem already.
Decoherence is widely felt to have something to do with the quantum measurement problem, but getting clear on just what is made difficult by the fact that the "measurement problem", as traditionally presented in foundational and philosophical discussions, has become somewhat disconnected from the conceptual problems posed by real physics. This, in turn, is because quantum mechanics as discussed in textbooks and in foundational discussions has become somewhat removed from scientific practice, especially where the analysis of measurement is concerned. This paper has two goals: firstly, to present an account of how quantum measurements are actually dealt with in modern physics and to state the measurement problem from the perspective of that account; and secondly, to clarify what role decoherence plays in modern measurement theory and what effect it has on the various strategies that have been proposed to solve the measurement problem.
Quantum mechanics, and classical mechanics, are framework theories that incorporate many different concrete theories which in general cannot be arranged in a neat hierarchy, but discussion of ‘the ontology of quantum mechanics’ tends to proceed as if quantum mechanics were a single concrete theory, specifically the physics of nonrelativistically moving point particles interacting by long-range forces. I survey the problems this causes and make some suggestions for how a more physically realistic perspective ought to influence the metaphysics of quantum mechanics.
Using as a starting point recent and apparently incompatible conclusions by Simon Saunders and Eleanor Knox, I revisit the question of the correct spacetime setting for Newtonian physics. I argue that understood correctly, these two versions of Newtonian physics make the same claims both about the background geometry required to define the theory, and about the inertial structure of the theory. In doing so I illustrate and explore in detail the view—espoused by Knox, and also by Harvey Brown—that inertial structure is defined by the dynamics governing subsystems of a larger system. This clarifies some interesting features of Newtonian physics, notably the distinction between using the theory to model subsystems of a larger whole and using it to model complete Universes, and the scale-relativity of spacetime structure.
Using the parametrised representation of field theory I demonstrate that in both local and global cases, internal and spacetime symmetries can be treated precisely on a par, so that gravitational theories may be regarded as gauge theories in a completely standard sense.
In this article, I briefly explain the quantum measurement problem and the Everett interpretation, in a way that is faithful to modern physics and yet accessible to readers without any physics training. I then consider the metaphysical lessons for ontology from quantum mechanics under the Everett interpretation. My conclusions are largely negative: I argue that very little can be said in full generality about the ontology of quantum mechanics, because quantum mechanics, like abstract classical mechanics, is a framework within which we can consider different physical theories which have very little in common at the level of ontology. Along the way I discuss, and criticise, several positive ontological proposals that have been made in the context of the Everett interpretation: ontologies based on the so-called "eigenstate-eigenvalue link", ontologies based on taking the "many-worlds" language seriously at the fundamental level, and ontologies that treat the wavefunction as a complex field on a high-dimensional space.
I explore the debate about causal versus evidential decision theory, and its recent developments in the work of Andy Egan, through the method of some simple games based on agents' predictions of each other's actions. My main focus is on the requirement for rational agents to act in a way which is consistent over time and its implications for such games and their more realistic cousins.
I give a brief account of the way in which thermodynamics and statistical mechanics actually work as contemporary scientific theories, and in particular of what statistical mechanics contributes to thermodynamics over and above any supposed underpinning of the latter's general principles. In doing so, I attempt to illustrate that statistical mechanics should not be thought of wholly or even primarily as itself a foundational project for thermodynamics, and that conceiving of it this way potentially distorts the foundational study of statistical mechanics itself.
I distinguish between two versions of the black hole information-loss paradox. The first arises from apparent failure of unitarity on the spacetime of a completely evaporating black hole, which appears to be non-globally-hyperbolic; this is the most commonly discussed version of the paradox in the foundational and semipopular literature, and the case for calling it `paradoxical' is less than compelling. But the second arises from a clash between a fully-statistical-mechanical interpretation of black hole evaporation and the quantum-field-theoretic description used in derivations of the Hawking effect. This version of the paradox arises long before a black hole completely evaporates, seems to be the version that has played a central role in quantum gravity, and is genuinely paradoxical. After explicating the paradox, I discuss the implications of more recent work on AdS/CFT duality and on the `Firewall paradox', and conclude that the paradox is if anything now sharper. The article is written at an introductory level and does not assume advanced knowledge of quantum gravity.
I review the role of probability in contemporary physics and the origin of probabilistic time asymmetry, beginning with the pre-quantum case but concentrating on quantum theory. I argue that quantum mechanics radically changes the pre-quantum situation and that the philosophical nature of objective probability in physics, and of probabilistic asymmetry in time, is dependent on the correct resolution of the quantum measurement problem.
I discuss classical and quantum recurrence theorems in a unified manner, treating both as generalisations of the fact that a system with a finite state space only has so many places to go. Along the way I prove versions of the recurrence theorem applicable to dynamics on linear and metric spaces, and make some comments about applications of the classical recurrence theorem in the foundations of statistical mechanics.
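The finite-state-space fact that the abstract says the recurrence theorems generalise can be made concrete in a few lines. The particular map and state space below are arbitrary illustrative choices, not taken from the paper:

```python
def first_recurrence(step, initial):
    """Iterate a deterministic map `step` from `initial` until some state repeats.

    Returns (time_of_first_repeat, repeated_state). Termination is guaranteed
    whenever the state space is finite: by the pigeonhole principle, a system
    with only finitely many places to go must eventually revisit one of them,
    and a deterministic dynamics then cycles forever.
    """
    seen = {}  # state -> first time it was visited
    state, t = initial, 0
    while state not in seen:
        seen[state] = t
        state = step(state)
        t += 1
    return t, state


# Example dynamics on the finite state space {0, ..., 6}:
t, s = first_recurrence(lambda x: (3 * x + 1) % 7, initial=0)
# The orbit is 0 -> 1 -> 4 -> 6 -> 5 -> 2 -> 0, so the state 0 recurs at t = 6.
```

The classical and quantum recurrence theorems discussed in the paper are, on this view, refinements of the same pigeonhole idea for continuous state spaces, where "revisit" is weakened to "return arbitrarily close".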