The purpose of this paper is to give a brief survey of the implications of the theories of modern physics for the doctrine of determinism. The survey will reveal a curious feature of determinism: in some respects it is fragile, requiring a number of enabling assumptions to give it a fighting chance; but in other respects it is quite robust and very difficult to kill. The survey will also aim to show that, apart from its own intrinsic interest, determinism is an excellent device for probing the foundations of classical, relativistic, and quantum physics. The survey is conducted under three major presuppositions. First, I take a realistic attitude towards scientific theories in that I assume that to give an interpretation of a theory is, at a minimum, to specify what the world would have to be like in order for the theory to be true. But we will see that the demand for a deterministic interpretation of a theory can force us to abandon a naively realistic reading of the theory. Second, I reject the “no laws” view of science and assume that the field equations or laws of motion of the most fundamental theories of current physics represent science’s best guesses as to the form of the basic laws of nature. Third, I take determinism to be an ontological doctrine, a doctrine about the temporal evolution of the world. This ontological doctrine must not be confused with predictability, which is an epistemological doctrine, the failure of which need not entail a failure of determinism. From time to time I will comment on ways in which predictability can fail in a deterministic setting. Finally, my survey will concentrate on the Laplacian variety of determinism, according to which the instantaneous state of the world at any time uniquely determines the state at any other time. The plan of the survey is as follows. Section 2 illustrates the fragility of determinism by means of a Zeno-type example. Then sections 3 and 4 survey successively the fortunes of determinism in the Newtonian and the special relativistic settings.
We discuss the possibility to build and operate a time machine, a device that produces closed timelike curves (CTCs). We specify the spacetime structure needed to implement a time machine and assess attempted no-go results against time machines in classical general relativity, semi-classical quantum gravity, quantum field theory on curved spacetime, and in Euclidean quantum gravity. Such no-go theorems for time machines would show that, under physically reasonable conditions, CTCs cannot develop in spacetimes initially free of these pathologies. Our review indicates that an investigation of the prospects of achieving no-go results has not been entirely successful in establishing such generality. At the same time, the pursuit of chronology protection results has proved to be a fruitful way to probe the foundations of classical GTR and the interface between general relativity and quantum field theory.
It is argued that seemingly “merely technical” issues about the existence and uniqueness of self-adjoint extensions of symmetric operators in quantum mechanics have interesting implications for foundations problems in classical and quantum physics. For example, pursuing these technical issues reveals a sense in which quantum mechanics can cure some of the forms of indeterminism that crop up in classical mechanics; and at the same time it reveals the possibility of a form of indeterminism in quantum mechanics that is quite distinct from the indeterminism of state vector collapse. More generally, the examples considered indicate that the classical–quantum correspondence is more intricate and delicate than is generally appreciated. The aim of the article is to give a series of examples that reveal why the technical issues about self-adjointness are relevant to the philosophy of science and that help to make the issues accessible to philosophers of science.
We address the question of whether it is possible to operate a time machine by manipulating matter and energy so as to manufacture closed timelike curves. This question has received a great deal of attention in the physics literature, with attempts to prove no-go theorems based on classical general relativity and various hybrid theories serving as steps along the way towards quantum gravity. Despite the effort put into these no-go theorems, there is no widely accepted definition of a time machine. We explain the conundrum that must be faced in providing a satisfactory definition and propose a resolution. Roughly, we require that all extensions of the time machine region contain closed timelike curves; the actions of the time machine operator are then sufficiently “potent” to guarantee that closed timelike curves appear. We then review no-go theorems based on classical general relativity, semi-classical quantum gravity, quantum field theory on curved spacetime, and Euclidean quantum gravity. Our verdict on the question of our title is that no result of sufficient generality to underwrite a confident “yes” has been proven. Our review of the no-go results does, however, highlight several foundational problems at the intersection of general relativity and quantum physics that lend substance to the search for an answer.
Various fault modes of determinism in classical physics are outlined. It is shown how quantum mechanics can cure some forms of classical indeterminism.
Although C. D. Broad's notion of Becoming has received a fair amount of attention in the philosophy-of-time literature, there are no serious attempts to show how to replace the standard 'block' spacetime models by models that are more congenial to Broad's idea that the sum total of existence is continuously increased by Becoming or the coming into existence of events. In the Newtonian setting Broad-type models can be constructed in a cheating fashion by starting with a Newtonian block model, carving chips off the block, and assembling the chips in an appropriately structured way. However, attempts to construct Broad-type models in a non-cheating fashion reveal a number of problematic aspects of Becoming that have not received adequate attention in the literature. The paper then turns to an assessment of the problems and prospects of adapting Becoming models to relativistic spacetimes. The results of the assessment differ in both minor and major ways from the ones in the extant literature. Finally, the paper describes how the causal set approach to quantum gravity promises to provide a mechanism for realizing Becoming, though the form of Becoming that emerges may not conform to any of the versions discussed in the philosophical literature.
The overarching goal of this paper is to elucidate the nature of superselection rules in a manner that is accessible to philosophers of science and that brings out the connections between superselection and some of the most fundamental interpretational issues in quantum physics. The formalism of von Neumann algebras is used to characterize three different senses of superselection rules (dubbed weak, strong, and very strong) and to provide useful necessary and sufficient conditions for each sense. It is then shown how the Haag–Kastler algebraic approach to quantum physics holds the promise of a uniform and comprehensive account of the origin of superselection rules. Some of the challenges that must be met before this promise can be kept are discussed. The focus then turns to the role of superselection rules in solutions to the measurement problem and the emergence of classical properties. It is claimed that the role for “hard” superselection rules is limited, but “soft” (a.k.a. environmental) superselection rules or N. P. Landsman’s situational superselection rules may have a major role to play. Finally, an assessment is given of the recently revived attempts to deconstruct superselection rules.
This is the editors' introduction to a new anthology of commissioned articles covering the various branches of philosophy of physics. We introduce the articles in terms of the three pillars of modern physics: relativity theory, quantum theory and thermal physics. We end by discussing the present state, and future prospects, of fundamental physics.
The ambition of this volume is twofold: to provide a comprehensive overview of the field and to serve as an indispensable reference work for anyone who wants to work in it. For example, any philosopher who hopes to make a contribution to the topic of the classical-quantum correspondence will have to begin by consulting Klaas Landsman’s chapter. The organization of this volume, as well as the choice of topics, is based on the conviction that the important problems in the philosophy of physics arise from studying the foundations of the fundamental theories of physics. It follows that there is no sharp line to be drawn between philosophy of physics and physics itself. Some of the best work in the philosophy of physics is being done by physicists, as witnessed by the fact that several of the contributors to the volume are theoretical physicists: viz., Ellis, Emch, Harvey, Landsman, Rovelli, ‘t Hooft, the last of whom is a Nobel laureate.
Key features:
- Definitive discussions of the philosophical implications of modern physics
- Masterly expositions of the fundamental theories of modern physics
- Covers all three main pillars of modern physics: relativity theory, quantum theory, and thermal physics
- Covers the new sciences that have grown from these theories: for example, cosmology from relativity theory; and quantum information and quantum computing, from quantum theory
- Contains special chapters that address crucial topics that arise in several different theories, such as symmetry and determinism
- Written by very distinguished theoretical physicists, including a Nobel laureate, as well as by philosophers
Although the philosophical literature on the foundations of quantum field theory recognizes the importance of Haag’s theorem, it does not provide a clear discussion of the meaning of this theorem. The goal of this paper is to make up for this deficit. In particular, it aims to set out the implications of Haag’s theorem for scattering theory, the interaction picture, the use of non-Fock representations in describing interacting fields, and the choice among the plethora of unitarily inequivalent representations of the canonical commutation relations for free and interacting fields.
This is the first part of a two-part article in which we defend the thesis of Humean Supervenience about Laws of Nature (HS). According to this thesis, two possible worlds cannot differ on what is a law of nature unless they also differ on the Humean base. The Humean base is easy to characterize intuitively, but there is no consensus on how, precisely, it should be defined. Here in Part I, we present and motivate a characterization of the Humean base that, we argue, enables HS to capture what is really at stake in the debate, without taking on extraneous commitments.
In Part I, we presented and motivated a new formulation of Humean Supervenience about Laws of Nature (HS). Here in Part II, we present an epistemological argument in defense of HS, thus formulated. Our contention is that one can combine a modest realism about laws of nature with a proper recognition of the importance of empirical testability in the epistemology of science only if one accepts HS.
In 1894 Pierre Curie announced what has come to be known as Curie's Principle: the asymmetry of effects must be found in their causes. In the same publication Curie discussed a key feature of what later came to be known as spontaneous symmetry breaking: the phenomena generally do not exhibit the symmetries of the laws that govern them. Philosophers have long been interested in the meaning and status of Curie's Principle. Only comparatively recently have they begun to delve into the mysteries of spontaneous symmetry breaking. The present paper aims to advance the discussion of both of these twin topics by tracing their interaction in classical physics, ordinary quantum mechanics and quantum field theory. The features of spontaneous symmetry that are peculiar to quantum field theory have received scant attention in the philosophical literature. These features are highlighted here, along with an explanation of why Curie's Principle, though valid in quantum field theory, is nearly vacuous in that context.
Given its importance in modern physics, philosophers of science have paid surprisingly little attention to the subject of symmetries and invariances, and they have largely neglected the subtopic of symmetry breaking. I illustrate how the topic of laws and symmetries brings into fruitful interaction technical issues in physics and mathematics with both methodological issues in philosophy of science, such as the status of laws of physics, and metaphysical issues, such as the nature of objectivity.
We discuss the intertwined topics of Fulling non-uniqueness and the Unruh effect. The Fulling quantization, which is in some sense the natural one for an observer uniformly accelerated through Minkowski spacetime to adopt, is often heralded as a quantization of the Klein-Gordon field which is both physically relevant and unitarily inequivalent to the standard Minkowski quantization. We argue that the Fulling and Minkowski quantizations do not constitute a satisfactory example of physically relevant, unitarily inequivalent quantizations, and indicate what it would take to settle the open question of whether a satisfactory example exists. A popular gloss on the Unruh effect has it that an observer uniformly accelerated through the Minkowski vacuum experiences a thermal flux of Rindler quanta. Taking the Unruh effect, so glossed, to establish that the notion of particle must be relativized to a reference frame, some would use it to demote the particle concept from fundamental status. We explain why technical results do not support the popular gloss and why the attempted demotion of the particle concept is both unsuccessful and unnecessary. Fulling non-uniqueness and the Unruh effect merit attention despite these negative verdicts because they provide excellent vehicles for illustrating key concepts of quantum field theory and for probing foundational issues of considerable philosophical interest.
The cosmological constant is back. Several lines of evidence point to the conclusion that either there is a positive cosmological constant or else the universe is filled with a strange form of matter (“quintessence”) that mimics some of the effects of a positive lambda. This paper investigates the implications of the former possibility. Two senses in which the cosmological constant can be a constant are distinguished: the capital Λ sense in which lambda is a universal constant on a par with the charge of the electron, and the lower case λ sense in which lambda is a humble constant of integration. The latter interpretation has been touted as the means to a solution to various problems in physics. These claims are critically examined with an eye to discerning the implications for philosophy of science and foundations of physics.
The constrained Hamiltonian formalism is recommended as a means for getting a grip on the concepts of gauge and gauge transformation. This formalism makes it clear how the gauge concept is relevant to understanding Newtonian and classical relativistic theories as well as the theories of elementary particle physics; it provides an explication of the vague notions of "local" and "global" gauge transformations; it explains how and why a fibre bundle structure emerges for theories which do not wear their bundle structure on their sleeves; it illuminates the connections of the gauge concept to issues of determinism and what counts as a genuine "observable"; and it calls attention to problems which arise in attempting to quantize gauge theories. Some of the limitations and problematic aspects of the formalism are also discussed.
The philosophical literature on time and change is fixated on the issue of whether the B-series account of change is adequate or whether real change requires Becoming of either the property-based variety of McTaggart's A-series or the non-property-based form embodied in C. D. Broad's idea of the piling up of successive layers of existence. For present purposes it is assumed that the B-series suffices to ground real change. But then it is noted that modern science in the guise of Einstein's general theory poses a threat to real change by implying that none of the genuine physical magnitudes countenanced by the theory changes its value with time. The aims of this paper are to explain how this seemingly paradoxical conclusion arises and to assess the merits and demerits of possible reactions to it.
David Albert's Time and Chance (2000) provides a fresh and interesting perspective on the problem of the direction of time. Unfortunately, the book opens with a highly non-standard exposition of time reversal invariance that distorts the subsequent discussion. The present article not only has the remedial goal of setting the record straight about the meaning of time reversal invariance, but it also aims to show how the niceties of this symmetry concept matter to the problem of the direction of time and to related foundation issues in physics.
Many have claimed that ceteris paribus (CP) laws are a quite legitimate feature of scientific theories, some even going so far as to claim that the laws of all scientific theories currently on offer are merely CP. We argue here that one of the common props of such a thesis, that there are numerous examples of CP laws in physics, is false. Moreover, besides the absence of genuine examples from physics, we suggest that otherwise unproblematic claims are rendered untestable by the mere addition of the CP operator. Thus, “CP all Fs are Gs,” when read as a straightforward statement of fact, cannot be the stuff of scientific theory. Rather, we suggest that when “ceteris paribus” appears in scientific works it plays a pragmatic role of pointing to more respectable claims.
Physicists who work on canonical quantum gravity will sometimes remark that the general covariance of general relativity is responsible for many of the thorniest technical and conceptual problems in their field. In particular, it is sometimes alleged that one can trace to this single source a variety of deep puzzles about the nature of time in quantum gravity, deep disagreements surrounding the notion of ‘observable’ in classical and quantum gravity, and deep questions about the nature of the existence of spacetime in general relativity.
This vital study offers a new interpretation of Hume's famous "Of Miracles," which notoriously argues against the possibility of miracles. By situating Hume's popular argument in the context of the 18th century debate on miracles, Earman shows Hume's argument to be largely unoriginal and chiefly without merit where it is original. Yet Earman constructively conceives how progress can be made on the issues that Hume's essay so provocatively posed about the ability of eyewitness testimony to establish the credibility of marvelous and miraculous events.
Inflationary cosmology won a large following on the basis of the claim that it solves various problems that beset the standard big bang model. We argue that these problems concern not the empirical adequacy of the standard model but rather the nature of the explanations it offers. Furthermore, inflationary cosmology has not been able to deliver on its proposed solutions without offering models which are increasingly complicated and contrived, which depart more and more from the standard model it was supposed to improve upon, and which sever the connection between cosmology and particle physics that initially made the inflationary paradigm so attractive. Nevertheless, inflationary cosmology remains a promising research program, not least because it offers an explanation of the origin of the density perturbations that seeded the formation of galaxies and other cosmic structures. Tests of this explanation are underway and may settle the issue of whether inflation played an important role in the early universe.
Much of the literature on "ceteris paribus" laws is based on a misguided egalitarianism about the sciences. For example, it is commonly held that the special sciences are riddled with ceteris paribus laws; from this many commentators conclude that if the special sciences are not to be accorded a second class status, it must be ceteris paribus all the way down to fundamental physics. We argue that the (purported) laws of fundamental physics are not hedged by ceteris paribus clauses and provisos. Furthermore, we show that not only is there no persuasive analysis of the truth conditions for ceteris paribus laws, there is not even an acceptable account of how they are to be saved from triviality or how they are to be melded with standard scientific methodology. Our way out of this unsatisfactory situation is to reject the widespread notion that the achievements and the scientific status of the special sciences must be understood in terms of ceteris paribus laws.
We discuss two supertasks invented recently by Laraudogoitia [1996, 1997]. Both involve an infinite number of particle collisions within a finite amount of time and both compromise determinism. We point out that the sources of the indeterminism are rather different in the two cases—one involves unbounded particle velocities, the other involves particles with no lower bound to their sizes—and consequently that the implications for determinism are rather different—one form of indeterminism affects Newtonian but not relativistic physics, while the other form is insensitive to the classical vs relativistic distinction. We also note some interesting linkages among supertasks, indeterminism and foundations problems in the general theory of relativity.
We argue that, contrary to some analyses in the philosophy of science literature, ergodic theory falls short in explaining the success of classical equilibrium statistical mechanics. Our claim is based on the observations that dynamical systems for which statistical mechanics works are most likely not ergodic, and that ergodicity is both too strong and too weak a condition for the required explanation: one needs only ergodic-like behaviour for the finite set of observables that matter, but the behaviour must ensure that the approach to equilibrium for these observables is on the appropriate time-scale.
Recent attempts to cast Hume’s argument against miracles in a Bayesian form are examined. It is shown how the Bayesian apparatus does serve to clarify the structure and substance of Hume’s argument. But the apparatus does not underwrite Hume’s various claims, such as that no testimony serves to establish the credibility of a miracle; indeed, the Bayesian analysis reveals various conditions under which it would be reasonable to reject the more interesting of Hume’s claims.
The standard theory of computation excludes computations whose completion requires an infinite number of steps. Malament-Hogarth spacetimes admit observers whose pasts contain entire future-directed, timelike half-curves of infinite proper length. We investigate the physical properties of these spacetimes and ask whether they and other spacetimes allow the observer to know the outcome of a computation with infinitely many steps.
There is currently no viable alternative to the Bayesian analysis of scientific inference, yet the available versions of Bayesianism fail to do justice to several aspects of the testing and confirmation of scientific hypotheses. Bayes or Bust? provides the first balanced treatment of the complex set of issues involved in this nagging conundrum in the philosophy of science. Both Bayesians and anti-Bayesians will find a wealth of new insights on topics ranging from Bayes's original paper to contemporary formal learning theory. In a paper published posthumously in 1763, the Reverend Thomas Bayes made a seminal contribution to the understanding of "analogical or inductive reasoning." Building on his insights, modern Bayesians have developed an account of scientific inference that has attracted numerous champions as well as numerous detractors. Earman argues that Bayesianism provides the best hope for a comprehensive and unified account of scientific inference, yet the presently available versions of Bayesianism fail to do justice to several aspects of the testing and confirming of scientific theories and hypotheses. By focusing on the need for a resolution to this impasse, Earman sharpens the issues on which a resolution turns. John Earman is Professor of History and Philosophy of Science at the University of Pittsburgh.
The cosmic censorship hypothesis states that the general theory of relativity has built in mechanisms to prevent the formation of "naked singularities," pathologies in the spacetime structure that lead to a breakdown in predictability and determinism. This paper discusses some attempts to turn the vague hypothesis into a precise conjecture. Evidence in favor of and against the conjecture is briefly reviewed. Finally the possibility of forming naked singularities via black hole evaporation due to Hawking radiation is discussed.
Spacetime substantivalism leads to a radical form of indeterminism within a very broad class of spacetime theories which include our best spacetime theory, general relativity. Extending an argument from Einstein, we show that spacetime substantivalists are committed to very many more distinct physical states than these theories' equations can determine, even with the most extensive boundary conditions.
After reviewing recent literature from physics and philosophy, it is concluded that we are still far from having a satisfying explanation of the nature and origins of irreversibility. It is proposed that the most fruitful approach to this problem is to concentrate on conditions needed for a rigorous derivation of the Boltzmann equation.
Nearly all accounts of the genesis of special relativity unhesitatingly assume that the theory was worked out in a roughly five week period following the discovery of the relativity of simultaneity. Not only is there no direct evidence for this common presupposition, there are numerous considerations which militate against it. The evidence suggests it is far more reasonable that Einstein was already in possession of the Lorentz and field transformations, that he had applied these to the dynamics of the electron, and that portions of the 1905 paper had actually been drafted well before the epistemological analysis of time.
Various senses in which laws of nature are supposed to be "universal" are distinguished. Conditions designed to capture the content of the more important of these senses are proposed and the relations among these conditions are examined. The status of universality requirements is briefly discussed.
It is argued that the main problem with "the problem of the direction of time" is to figure out what the problem is or is supposed to be. Towards this end, an attempt is made to disentangle and to classify some of the many issues which have been discussed under the label of 'the direction of time'. Secondly, some technical apparatus is introduced in the hope of producing a sharper formulation of the issues than they have received in the philosophical literature. Finally, some tentative suggestions about the central issues are offered. In particular, it is suggested that entropy and irreversibility are much less crucial to the central issues than most philosophers would have us believe. This suggestion is not made because of any firm conviction of its correctness but rather because it helps to focus the discussion on some basic but long neglected assumptions which underlie traditional approaches.
This paper presents a critical examination of claims advanced by several philosophers to the effect that 'time travel' represents a physical possibility and that the interpretation of certain actually observed phenomena in terms of 'time travel' is both legitimate and advantageous. It is argued that (a) no convincing motivation for the introduction of the time travel hypothesis has been presented; (b) no coherent and interesting sense of 'going backward in time' has been supplied which makes 'time travel' compatible with Special Relativity; (c) even the conceptual possibility of 'time travel' is an unsettled and somewhat nebulous question.