An extended analysis is given of the program, originally suggested by Deutsch, of solving the probability problem in the Everett interpretation by means of decision theory. Deutsch's own proof is discussed, and alternatives are presented which are based upon different decision theories and upon Gleason's Theorem. It is argued that decision theory gives Everettians most or all of what they need from `probability'. Contact is made with Lewis's Principal Principle linking subjective credence with objective chance: an Everettian Principal Principle is formulated, and shown to be at least as defensible as the usual Principle. Some consequences of (Everettian) quantum mechanics for decision theory itself are also discussed.

[NB: this long (70 pages) and occasionally rambling online-only paper has been almost entirely superseded by material in subsequent papers; if something is not included in them, it usually means that I have had second thoughts. I include it for completeness only.]
I attempt to get as clear as possible on the chain of reasoning by which irreversible macrodynamics is derivable from time-reversible microphysics, and in particular to clarify just what kinds of assumptions about the initial state of the universe, and about the nature of the microdynamics, are needed in these derivations. I conclude that while a “Past Hypothesis” about the early Universe does seem necessary to carry out such derivations, that Hypothesis is not correctly understood as a constraint on the early Universe’s entropy.
I criticise the view that the relativity and equivalence principles are consequences of the small-scale structure of the metric in general relativity, by arguing that these principles also apply to systems with non-trivial self-gravitation and hence non-trivial spacetime curvature (such as black holes). I provide an alternative account, incorporating aspects of the criticised view, which allows both principles to apply to systems with self-gravity.
Dewey Wallace tells the story of several prominent English Calvinist actors and thinkers in the first generations after the beginning of the Restoration. In the midst of conflicts between Church and Dissent and the intellectual challenges of the dawning age of Enlightenment, these five individuals and groups dealt with deism, anti-Trinitarianism, and scoffing atheism - usually understood as godlessness - by choosing different emphases in their defense and promotion of Calvinist piety and theology. In each case there was not only persistence in an earlier Calvinist trajectory, but also a transformation of the Calvinist heritage into a new mode of thinking and acting. The different paths taken illustrate the rich variety of English Calvinism in the period. This study offers description and analysis of the mystical Calvinism of Peter Sterry, the hermeticist Calvinism of Theophilus Gale, the evangelical Calvinism of Joseph Alleine and the circle that promoted his legacy, the natural theology of the moderate Calvinist Presbyterians Richard Baxter, William Bates, and John Howe, and the Church of England Calvinism of John Edwards. Wallace seeks to overturn conventional clichés about Calvinism: that it was anti-mystical, that it allowed no scope for the ''ancient theology'' that characterized much of Renaissance learning, that its piety was harshly predestinarian, that it was uninterested in natural theology, and that it had been purged from the established church by the end of the seventeenth century. Shapers of English Calvinism, 1660-1714 illuminates the religious and intellectual history of the era between the Reformation and modernity, offering fascinating insight into the development of Calvinism and also into English Puritanism as it transitioned into Dissent.
I develop the decision-theoretic approach to quantum probability, originally proposed by David Deutsch, into a mathematically rigorous proof of the Born rule in (Everett-interpreted) quantum mechanics. I sketch the argument informally, then prove it formally, and lastly consider a number of proposed ``counter-examples'' to show exactly which premises of the argument they violate. (This is a preliminary version of a chapter to appear --- under the title ``How to prove the Born Rule'' --- in Saunders, Barrett, Kent and Wallace, "Many worlds? Everett, quantum theory and reality", forthcoming from Oxford University Press.)
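For reference, the Born rule at issue in this abstract is the standard quantum probability rule; a minimal statement in the usual textbook notation:

```latex
% The Born rule: for a system in normalized state |\psi> and a measurement
% with nondegenerate eigenstate |a>, the probability of outcome a is the
% squared amplitude:
\[
  \Pr(a) \;=\; \lvert \langle a \mid \psi \rangle \rvert^{2}
\]
% Equivalently, with P_a the projector onto the a-eigenspace:
\[
  \Pr(a) \;=\; \langle \psi \mid P_a \mid \psi \rangle
\]
```

The decision-theoretic program discussed here aims to derive this rule, rather than postulate it, from axioms of rational preference.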
I explore the debate about causal versus evidential decision theory, and its recent developments in the work of Andy Egan, through the method of some simple games based on agents' predictions of each other's actions. My main focus is on the requirement for rational agents to act in a way which is consistent over time and its implications for such games and their more realistic cousins.
NGC 1300 (shown in figure 1) is a spiral galaxy 65 million light years from Earth. We have never been there, and (although I would love to be wrong about this) we will never go there; all we will ever know about NGC 1300 is what we can see of it from sixty-five million light years away, and what we can infer from our best physics. Fortunately, “what we can infer from our best physics” is actually quite a lot. To take a particular example: our best theory of galaxies tells us that that hazy glow is actually made up of the light of hundreds of billions of stars; our best theories of planetary formation tell us that a sizable fraction of those stars…
I discuss the statistical mechanics of gravitating systems and in particular its cosmological implications, and argue that many conventional views on this subject in the foundations of statistical mechanics embody significant confusion; I attempt to provide a clearer and more accurate account. In particular, I observe that (i) the role of gravity in entropy calculations must be distinguished from the entropy of gravity, that (ii) although gravitational collapse is entropy-increasing, this is not usually because the collapsing matter itself increases in entropy, and that (iii) the Second Law of thermodynamics does not owe its validity to the statistical mechanics of gravitational collapse.
In 1962, the philosopher Richard Taylor used six commonly accepted presuppositions to imply that human beings have no control over the future. David Foster Wallace not only took issue with Taylor's method, which, according to him, scrambled the relations of logic, language, and the physical world, but also noted a semantic trick at the heart of Taylor's argument.

Fate, Time, and Language presents Wallace's brilliant critique of Taylor's work. Written long before the publication of his fiction and essays, Wallace's thesis reveals his great skepticism of abstract thinking made to function as a negation of something more genuine and real. He was especially suspicious of certain paradigms of thought-the cerebral aestheticism of modernism, the clever gimmickry of postmodernism-that abandoned "the very old traditional human verities that have to do with spirituality and emotion and community." As Wallace rises to meet the challenge to free will presented by Taylor, we witness the developing perspective of this major novelist, along with his struggle to establish solid logical ground for his convictions. This volume, edited by Steven M. Cahn and Maureen Eckert, reproduces Taylor's original article and other works on fatalism cited by Wallace. James Ryerson's introduction connects Wallace's early philosophical work to the themes and explorations of his later fiction, and Jay Garfield supplies a critical biographical epilogue.
What ontology does realism about the quantum state suggest? The main extant view in contemporary philosophy of physics is wave-function realism. We elaborate the sense in which wave-function realism does provide an ontological picture, and defend it from certain objections that have been raised against it. However, there are good reasons to be dissatisfied with wave-function realism, as we go on to elaborate. This motivates the development of an opposing picture: what we call spacetime state realism, a view which takes the states associated to spacetime regions as fundamental. This approach enjoys a number of beneficial features, although, unlike wave-function realism, it involves non-separability at the level of fundamental ontology. We investigate the pros and cons of this non-separability, arguing that it is a quite acceptable feature, even one which proves fruitful in the context of relativistic covariance. A companion paper discusses the prospects for combining a spacetime-based ontology with separability, along lines suggested by Deutsch and Hayden.
A systematic analysis is made of the relations between the symmetries of a classical field and the symmetries of the one-particle quantum system that results from quantizing that field in regimes where interactions are weak. The results are applied to gain a greater insight into the phenomenon of antimatter.
Following Lewis, it is widely held that branching worlds differ in important ways from diverging worlds. There is, however, a simple and natural semantics under which ordinary sentences uttered in branching worlds have much the same truth values as they conventionally have in diverging worlds. Under this semantics, whether branching or diverging, speakers cannot say in advance which branch or world is theirs. They are uncertain as to the outcome. This same semantics ensures the truth of utterances typically made about quantum mechanical contingencies, including statements of uncertainty, if the Everett interpretation of quantum mechanics is true. The 'incoherence problem' of the Everett interpretation, that it can give no meaning to the notion of uncertainty, is thereby solved.
This is a preliminary version of an article to appear in the forthcoming Ashgate Companion to the New Philosophy of Physics. In it, I aim to review, in a way accessible to foundationally interested physicists as well as physics-informed philosophers, just where we have got to in the quest for a solution to the measurement problem. I don't advocate any particular approach to the measurement problem (not here, at any rate!) but I do focus on the importance of decoherence theory to modern attempts to solve the measurement problem, and I am fairly sharply critical of some aspects of the "traditional" formulation of the problem.
I present a proof of the quantum probability rule from decision-theoretic assumptions, in the context of the Everett interpretation. The basic ideas behind the proof are those presented in Deutsch's recent proof of the probability rule, but the proof is simpler and proceeds from weaker decision-theoretic assumptions. This makes it easier to discuss the conceptual ideas involved in the proof, and to show that they are defensible.
Deutsch and Hayden have proposed an alternative formulation of quantum mechanics which is completely local. We argue that their proposal must be understood as having a form of `gauge freedom' according to which mathematically distinct states are physically equivalent. Once this gauge freedom is taken into account, their formulation is no longer local.
According to Bayesian epistemology, the epistemically rational agent updates her beliefs by conditionalization: that is, her posterior subjective probability after taking account of evidence X, p_new, is to be set equal to her prior conditional probability p_old(·|X). Bayesians can be challenged to provide a justification for their claim that conditionalization is recommended by rationality—whence the normative force of the injunction to conditionalize? There are several existing justifications for conditionalization, but none directly addresses the idea that conditionalization will be epistemically rational if and only if it can reasonably be expected to lead to epistemically good outcomes. We apply the approach of cognitive decision theory to provide a justification for conditionalization using precisely that idea. We assign epistemic utility functions to epistemically rational agents; an agent’s epistemic utility is to depend both upon the actual state of the world and on the agent’s credence distribution over possible states. We prove that, under independently motivated conditions, conditionalization is the unique updating rule that maximizes expected epistemic utility.
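The update rule described in this abstract, p_new(·) = p_old(·|X), can be sketched concretely. A minimal illustration follows; the two hypotheses and all the numbers are invented purely for illustration, not taken from the paper:

```python
def conditionalize(prior, likelihood, evidence):
    """Bayesian conditionalization: p_new(h) = p_old(h | X)
    = p_old(X | h) * p_old(h) / p_old(X), by Bayes' theorem."""
    # p_old(X): total probability of the evidence under the prior
    p_evidence = sum(prior[h] * likelihood[h][evidence] for h in prior)
    return {h: prior[h] * likelihood[h][evidence] / p_evidence
            for h in prior}

# Toy example (illustrative numbers): two hypotheses, evidence "X" observed.
prior = {"H1": 0.5, "H2": 0.5}
likelihood = {"H1": {"X": 0.8, "not-X": 0.2},
              "H2": {"X": 0.3, "not-X": 0.7}}
posterior = conditionalize(prior, likelihood, "X")
# posterior["H1"] = 0.4 / 0.55 ≈ 0.727
```

The paper's result is that, given suitable epistemic utility functions, exactly this update rule maximizes expected epistemic utility.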
I consider exactly what is involved in a solution to the probability problem of the Everett interpretation, in the light of recent work on applying considerations from decision theory to that problem. I suggest an overall framework for understanding probability in a physical theory, and conclude that this framework, when applied to the Everett interpretation, yields the result that that interpretation satisfactorily solves the measurement problem.

Contents:
1 Introduction
2 What is probability?
2.1 Objective probability and the Principal Principle
2.2 Three ways of satisfying the functional definition
2.3 Cautious functionalism
2.4 Is the functional definition complete?
3 The Everett interpretation and subjective uncertainty
3.1 Interpreting quantum mechanics
3.2 The need for subjective uncertainty
3.3 Saunders' argument for subjective uncertainty
3.4 Objections to Saunders' argument
3.5 Subjective uncertainty again: arguments from interpretative charity
3.6 Quantum weights and the functional definition of probability
4 Rejecting subjective uncertainty
4.1 The fission program
4.2 Against the fission program
5 Justifying the axioms of decision theory
5.1 The primitive status of the decision-theoretic axioms
5.2 Holistic scepticism
5.3 The role of an explanation of decision theory
6 Conclusion
I analyse the conceptual and mathematical foundations of Lagrangian quantum field theory (QFT) (that is, the ‘naive’ QFT used in mainstream physics, as opposed to algebraic quantum field theory). The objective is to see whether Lagrangian QFT has a sufficiently firm conceptual and mathematical basis to be a legitimate object of foundational study, or whether it is too ill-defined. The analysis covers renormalisation and infinities, inequivalent representations, and the concept of localised states; the conclusion is that Lagrangian QFT (at least as described here) is a perfectly respectable physical theory, albeit somewhat different in certain respects from most of those studied in foundational work.
The quantum theory of de Broglie and Bohm solves the measurement problem, but the hypothetical corpuscles play no role in the argument. The solution finds a more natural home in the Everett interpretation.
I investigate the consequences for semantics, and in particular for the semantics of tense, if time is assumed to have a branching structure not out of metaphysical necessity (to solve some philosophical problem) but just as a contingent physical fact, as is suggested by a currently-popular approach to the interpretation of quantum mechanics.
The relation between micro-objects and macro-objects advocated by Kim is even more problematic than Ross & Spurrett (R&S) argue, for reasons rooted in physics. R&S's own ontological proposals are much more satisfactory from a physicist's viewpoint but may still be problematic. A satisfactory theory of macroscopic ontology must be as independent as possible of the details of microscopic physics.
I address the problem of indefiniteness in quantum mechanics: the problem that the theory, without changes to its formalism, seems to predict that macroscopic quantities have no definite values. The Everett interpretation is often criticised along these lines, and I shall argue that much of this criticism rests on a false dichotomy: that the macroworld must either be written directly into the formalism or be regarded as somehow illusory. By means of analogy with other areas of physics, I develop the view that the macroworld is instead to be understood in terms of certain structures and patterns which emerge from quantum theory (given appropriate dynamics, in particular decoherence). I extend this view to the observer, and in doing so make contact with functionalist theories of mind.
An analysis is made of Deutsch's recent claim to have derived the Born rule from decision-theoretic assumptions. It is argued that Deutsch's proof must be understood in the explicit context of the Everett interpretation, and that in this context, it essentially succeeds. Some comments are made about the criticism of Deutsch's proof by Barnum, Caves, Finkelstein, Fuchs, and Schack; it is argued that the flaw which they point out in the proof does not apply if the Everett interpretation is assumed.
Mathematically, gauge theories are extraordinarily rich --- so rich, in fact, that it can become all too easy to lose track of the connections between results, and become lost in a mass of beautiful theorems and properties: indeterminism, constraints, Noether identities, local and global symmetries, and so on.

One purpose of this short article is to provide some sort of a guide through the mathematics, to the conceptual core of what is actually going on. Its focus is on the Lagrangian, variational-problem description of classical mechanics, from which the link between gauge symmetry and the apparent violation of determinism is easy to understand; only towards the end will the Hamiltonian description be considered.

The other purpose is to warn against adopting too unified a perspective on gauge theories. It will be argued that the meaning of the gauge freedom in a theory like general relativity is (at least from the Lagrangian viewpoint) significantly different from its meaning in theories like electromagnetism. The Hamiltonian framework blurs this distinction, and orthodox methods of quantization obliterate it; this may, in fact, be genuine progress, but it is dangerous to be guided by mathematics into conflating two conceptually distinct notions without appreciating the physical consequences.
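The link between gauge symmetry and apparent indeterminism mentioned here can be illustrated in the familiar textbook case of electromagnetism (standard material, not specific to this article):

```latex
% Free Maxwell Lagrangian density, with
% F_{\mu\nu} = \partial_\mu A_\nu - \partial_\nu A_\mu:
\[
  \mathcal{L} \;=\; -\tfrac{1}{4}\, F_{\mu\nu} F^{\mu\nu}
\]
% Gauge transformation by an arbitrary smooth function \lambda(x):
\[
  A_\mu \;\longmapsto\; A_\mu + \partial_\mu \lambda
\]
% F_{\mu\nu} is unchanged under this map (partial derivatives commute),
% so \mathcal{L} is invariant. Since \lambda is an arbitrary function of
% spacetime, the equations of motion cannot determine A_\mu uniquely
% from initial data --- the "apparent violation of determinism".
```

In general relativity, by contrast, the gauge freedom is diffeomorphism invariance, which the article argues has a significantly different meaning from the Lagrangian viewpoint.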
This is a discussion of how we can understand the world-view given to us by the Everett interpretation of quantum mechanics, and in particular the role played by the concept of 'world'. The view presented is that we are entitled to use 'many-worlds' terminology even if the theory does not specify the worlds in the formalism; this is defended by means of an extensive analogy with the concept of an 'instant' or moment of time in relativity, with the lack of a preferred foliation of spacetime being compared with the lack of a preferred basis in quantum theory. Implications for identity of worlds over time, and for relativistic quantum mechanics, are discussed.
An investigation is made into how the foundations of statistical mechanics are affected once we treat classical mechanics as an approximation to quantum mechanics in certain domains rather than as a theory in its own right; this is necessary if we are to understand statistical-mechanical systems in our own world. Relevant structural and dynamical differences are identified between classical and quantum mechanics (partly through analysis of technical work on quantum chaos by other authors). These imply that quantum mechanics significantly affects a number of foundational questions, including the nature of statistical probability and the direction of time.