Many have claimed that ceteris paribus laws are a quite legitimate feature of scientific theories, some even going so far as to claim that laws of all scientific theories currently on offer are merely CP. We argue here that one of the common props of such a thesis, that there are numerous examples of CP laws in physics, is false. Moreover, besides the absence of genuine examples from physics, we suggest that otherwise unproblematic claims are rendered untestable by the mere addition of the CP operator. Thus, “CP all Fs are Gs”, when read as a straightforward statement of fact, cannot be the stuff of scientific theory. Rather, we suggest that when “ceteris paribus” appears in scientific works it plays a pragmatic role of pointing to more respectable claims.
There is currently no viable alternative to the Bayesian analysis of scientific inference, yet the available versions of Bayesianism fail to do justice to several aspects of the testing and confirmation of scientific hypotheses. Bayes or Bust? provides the first balanced treatment of the complex set of issues involved in this nagging conundrum in the philosophy of science. Both Bayesians and anti-Bayesians will find a wealth of new insights on topics ranging from Bayes’s original paper to contemporary formal learning theory. In a paper published posthumously in 1763, the Reverend Thomas Bayes made a seminal contribution to the understanding of "analogical or inductive reasoning." Building on his insights, modern Bayesians have developed an account of scientific inference that has attracted numerous champions as well as numerous detractors. Earman argues that Bayesianism provides the best hope for a comprehensive and unified account of scientific inference, yet the presently available versions of Bayesianism fail to do justice to several aspects of the testing and confirming of scientific theories and hypotheses. By focusing on the need for a resolution to this impasse, Earman sharpens the issues on which a resolution turns. John Earman is Professor of History and Philosophy of Science at the University of Pittsburgh.
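Since the abstract turns on Bayesian confirmation, it may help to display the core apparatus. The following is the standard statement of Bayes's theorem and the conditionalization rule assumed in most versions of Bayesianism; the notation (H for hypothesis, E for evidence) is ours, not a quotation from the book.

```latex
% Bayes's theorem: posterior probability of hypothesis H given evidence E,
% with the denominator expanded over a partition of hypotheses H_i
\[
  P(H \mid E) \;=\; \frac{P(E \mid H)\, P(H)}{P(E)},
  \qquad
  P(E) \;=\; \sum_i P(E \mid H_i)\, P(H_i).
\]
% Strict conditionalization: on learning E (and nothing stronger),
% the agent's new credence function is the old one conditioned on E
\[
  P_{\mathrm{new}}(\,\cdot\,) \;=\; P_{\mathrm{old}}(\,\cdot \mid E\,).
\]
```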
It has become something of a dogma in the philosophy of science that modern cosmology has completed Boltzmann's program for explaining the statistical validity of the Second Law of thermodynamics by providing the low entropy initial state needed to ground the asymmetry in entropic behavior that underwrites our inference about the past. This dogma is challenged on several grounds. In particular, it is argued that it is likely that the Boltzmann entropy of the initial state of the universe is an ill-defined or severely hobbled concept. It is also argued that even if the entropy of the initial state of the universe had a well-defined, low value, this would not suffice to explain why thermodynamics works as well as it does for the kinds of systems we care about. Because the role of Boltzmann entropy in our inferences to the past has been vastly overrated, the failure of the Boltzmann program does not pose a serious problem for our knowledge of the past. But it does call for a different explanation of why thermodynamics works as well as it does. A suggestion is offered for a different approach.
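For readers unfamiliar with the quantity at issue, the Boltzmann entropy of a macrostate is standardly defined as follows; the abstract's worry is that this quantity may be ill defined when applied to the initial state of the universe. The notation is ours, a sketch of the textbook definition rather than a formula from the paper.

```latex
% Boltzmann entropy of a macrostate M: Boltzmann's constant times the
% logarithm of the phase-space volume |\Gamma_M| of the region of
% microstates that realize M
\[
  S_B(M) \;=\; k_B \,\ln \lvert \Gamma_M \rvert .
\]
```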
Spacetime substantivalism leads to a radical form of indeterminism within a very broad class of spacetime theories which include our best spacetime theory, general relativity. Extending an argument from Einstein, we show that spacetime substantivalists are committed to very many more distinct physical states than these theories' equations can determine, even with the most extensive boundary conditions.
Much of the literature on "ceteris paribus" laws is based on a misguided egalitarianism about the sciences. For example, it is commonly held that the special sciences are riddled with ceteris paribus laws; from this many commentators conclude that if the special sciences are not to be accorded a second class status, it must be ceteris paribus all the way down to fundamental physics. We argue that the (purported) laws of fundamental physics are not hedged by ceteris paribus clauses and provisos. Furthermore, we show that not only is there no persuasive analysis of the truth conditions for ceteris paribus laws, there is not even an acceptable account of how they are to be saved from triviality or how they are to be melded with standard scientific methodology. Our way out of this unsatisfactory situation is to reject the widespread notion that the achievements and the scientific status of the special sciences must be understood in terms of ceteris paribus laws.
Given its importance in modern physics, philosophers of science have paid surprisingly little attention to the subject of symmetries and invariances, and they have largely neglected the subtopic of symmetry breaking. I illustrate how the topic of laws and symmetries brings into fruitful interaction technical issues in physics and mathematics with both methodological issues in philosophy of science, such as the status of laws of physics, and metaphysical issues, such as the nature of objectivity.
In 1894 Pierre Curie announced what has come to be known as Curie's Principle: the asymmetry of effects must be found in their causes. In the same publication Curie discussed a key feature of what later came to be known as spontaneous symmetry breaking: the phenomena generally do not exhibit the symmetries of the laws that govern them. Philosophers have long been interested in the meaning and status of Curie's Principle. Only comparatively recently have they begun to delve into the mysteries of spontaneous symmetry breaking. The present paper aims to advance the discussion of both of these twin topics by tracing their interaction in classical physics, ordinary quantum mechanics and quantum field theory. The features of spontaneous symmetry breaking that are peculiar to quantum field theory have received scant attention in the philosophical literature. These features are highlighted here, along with an explanation of why Curie's Principle, though valid in quantum field theory, is nearly vacuous in that context.
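Curie's Principle is often given a group-theoretic gloss, which we sketch here in our own notation (not a quotation from the paper) to make the contrast with spontaneous symmetry breaking visible at a glance.

```latex
% Curie's Principle (group-theoretic gloss): every symmetry of the cause
% is a symmetry of the effect, so any asymmetry of the effect must
% already be present in the cause:
\[
  G_{\mathrm{cause}} \;\subseteq\; G_{\mathrm{effect}},
\]
% where G_X denotes the symmetry group of X. Spontaneous symmetry
% breaking, by contrast, concerns the gap between the symmetries of the
% laws and those of the phenomena they govern:
\[
  G_{\mathrm{phenomena}} \;\subsetneq\; G_{\mathrm{laws}}.
\]
```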
Although the philosophical literature on the foundations of quantum field theory recognizes the importance of Haag’s theorem, it does not provide a clear discussion of the meaning of this theorem. The goal of this paper is to make up for this deficit. In particular, it aims to set out the implications of Haag’s theorem for scattering theory, the interaction picture, the use of non-Fock representations in describing interacting fields, and the choice among the plethora of the unitarily inequivalent representations of the canonical commutation relations for free and interacting fields.
The philosophical literature on time and change is fixated on the issue of whether the B-series account of change is adequate or whether real change requires Becoming of either the property-based variety of McTaggart's A-series or the non-property-based form embodied in C. D. Broad's idea of the piling up of successive layers of existence. For present purposes it is assumed that the B-series suffices to ground real change. But then it is noted that modern science in the guise of Einstein's general theory poses a threat to real change by implying that none of the genuine physical magnitudes countenanced by the theory changes its value with time. The aims of this paper are to explain how this seemingly paradoxical conclusion arises and to assess the merits and demerits of possible reactions to it.
We address the question of whether it is possible to operate a time machine by manipulating matter and energy so as to manufacture closed timelike curves. This question has received a great deal of attention in the physics literature, with attempts to prove no-go theorems based on classical general relativity and various hybrid theories serving as steps along the way towards quantum gravity. Despite the effort put into these no-go theorems, there is no widely accepted definition of a time machine. We explain the conundrum that must be faced in providing a satisfactory definition and propose a resolution. Roughly, we require that all extensions of the time machine region contain closed timelike curves; the actions of the time machine operator are then sufficiently "potent" to guarantee that closed timelike curves appear. We then review no-go theorems based on classical general relativity, semi-classical quantum gravity, quantum field theory on curved spacetime, and Euclidean quantum gravity. Our verdict on the question of our title is that no result of sufficient generality to underwrite a confident "yes" has been proven. Our review of the no-go results does, however, highlight several foundational problems at the intersection of general relativity and quantum physics that lend substance to the search for an answer.
This vital study offers a new interpretation of Hume's famous "Of Miracles," which notoriously argues against the possibility of miracles. By situating Hume's popular argument in the context of the 18th century debate on miracles, Earman shows Hume's argument to be largely unoriginal and chiefly without merit where it is original. Yet Earman constructively conceives how progress can be made on the issues that Hume's essay so provocatively posed about the ability of eyewitness testimony to establish the credibility of marvelous and miraculous events.
In this first part of a two-part paper, we describe efforts in the early decades of this century to restrict the extent of violations of the Second Law of thermodynamics that were brought to light by the rise of the kinetic theory and the identification of fluctuation phenomena. We show how these efforts mutated into Szilard’s proposal that Maxwell’s Demon is exorcised by proper attention to the entropy costs associated with the Demon’s memory and information acquisition. In the second part we will argue that the information theoretic exorcisms of the Demon provide largely illusory benefits. Depending on the case, they either return a presupposition that can be had without information theoretic consideration or they postulate a broader connection between information and entropy than can be sustained.
Physicists who work on canonical quantum gravity will sometimes remark that the general covariance of general relativity is responsible for many of the thorniest technical and conceptual problems in their field. In particular, it is sometimes alleged that one can trace to this single source a variety of deep puzzles about the nature of time in quantum gravity, deep disagreements surrounding the notion of ‘observable’ in classical and quantum gravity, and deep questions about the nature of the existence of spacetime in general relativity.
Inflationary cosmology won a large following on the basis of the claim that it solves various problems that beset the standard big bang model. We argue that these problems concern not the empirical adequacy of the standard model but rather the nature of the explanations it offers. Furthermore, inflationary cosmology has not been able to deliver on its proposed solutions without offering models which are increasingly complicated and contrived, which depart more and more from the standard model it was supposed to improve upon, and which sever the connection between cosmology and particle physics that initially made the inflationary paradigm so attractive. Nevertheless, inflationary cosmology remains a promising research program, not least because it offers an explanation of the origin of the density perturbations that seeded the formation of galaxies and other cosmic structures. Tests of this explanation are underway and may settle the issue of whether inflation played an important role in the early universe.
David Albert's Time and Chance (2000) provides a fresh and interesting perspective on the problem of the direction of time. Unfortunately, the book opens with a highly non-standard exposition of time reversal invariance that distorts the subsequent discussion. The present article not only has the remedial goal of setting the record straight about the meaning of time reversal invariance, but it also aims to show how the niceties of this symmetry concept matter to the problem of the direction of time and to related foundation issues in physics.
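The standard notion of time reversal invariance at issue here can be stated compactly; this is our sketch of the textbook definition, not Earman's own formulation from the article.

```latex
% A theory is time reversal invariant iff the time reverse of every
% solution is also a solution:
\[
  s(t) \ \text{a solution}
  \;\Longrightarrow\;
  T\,s(-t) \ \text{a solution},
\]
% where T is the (theory-dependent) time-reversal operation on
% instantaneous states; e.g. in classical particle mechanics
% T : (q, p) \mapsto (q, -p), flipping momenta but not positions.
```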
This is the first part of a two-part article in which we defend the thesis of Humean Supervenience about Laws of Nature (HS). According to this thesis, two possible worlds cannot differ on what is a law of nature unless they also differ on the Humean base. The Humean base is easy to characterize intuitively, but there is no consensus on how, precisely, it should be defined. Here in Part I, we present and motivate a characterization of the Humean base that, we argue, enables HS to capture what is really at stake in the debate, without taking on extraneous commitments.
The simplest case of quantum field theory on curved spacetime—that of the Klein–Gordon field on a globally hyperbolic spacetime—reveals a dilemma: In generic circumstances, either there is no dynamics for this quantum field, or else there is a dynamics that is not unitarily implementable. We do not try to resolve the dilemma here, but endeavour to spell out the consequences of seizing one or the other horn of the dilemma.
Although C. D. Broad's notion of Becoming has received a fair amount of attention in the philosophy-of-time literature, there are no serious attempts to show how to replace the standard 'block' spacetime models by models that are more congenial to Broad's idea that the sum total of existence is continuously increased by Becoming or the coming into existence of events. In the Newtonian setting Broad-type models can be constructed in a cheating fashion by starting with a Newtonian block model, carving chips off the block, and assembling the chips in an appropriately structured way. However, attempts to construct Broad-type models in a non-cheating fashion reveal a number of problematic aspects of Becoming that have not received adequate attention in the literature. The paper then turns to an assessment of the problem and prospects of adapting Becoming models to relativistic spacetimes. The results of the assessment differ in both minor and major ways from the ones in the extant literature. Finally, the paper describes how the causal set approach to quantum gravity promises to provide a mechanism for realizing Becoming, though the form of Becoming that emerges may not conform to any of the versions discussed in the philosophical literature.
Like moths attracted to a bright light, philosophers are drawn to glitz. So in discussing the notions of ‘gauge’, ‘gauge freedom’, and ‘gauge theories’, they have tended to focus on examples such as Yang–Mills theories and on the mathematical apparatus of fibre bundles. But while Yang–Mills theories are crucial to modern elementary particle physics, they are only a special case of a much broader class of gauge theories. And while the fibre bundle apparatus turned out, in retrospect, to be the right formalism to illuminate the structure of Yang–Mills theories, the strength of this apparatus is also its weakness: the fibre bundle formalism is very flexible and general, and, as such, fibre bundles can be seen lurking under, over, and around every bush. What is needed is an explanation of what the relevant bundle structure is and how it arises, especially for theories that are not initially formulated in fibre bundle language. Here I will describe an approach that grows out of the conviction that, at least for theories that can be written in Lagrangian/Hamiltonian form, gauge freedom arises precisely when there are Lagrangian/Hamiltonian constraints of an appropriate character. This conviction is shared, if only tacitly, by that segment of the physics community that works on constrained Hamiltonian systems.
We argue that, contrary to some analyses in the philosophy of science literature, ergodic theory falls short in explaining the success of classical equilibrium statistical mechanics. Our claim is based on the observations that dynamical systems for which statistical mechanics works are most likely not ergodic, and that ergodicity is both too strong and too weak a condition for the required explanation: one needs only ergodic-like behaviour for the finite set of observables that matter, but the behaviour must ensure that the approach to equilibrium for these observables is on the appropriate time-scale.
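The sense of ergodicity the abstract has in mind is the standard one from the Birkhoff ergodic theorem: infinite time averages coincide with phase averages. The notation below is ours, a sketch of the textbook statement.

```latex
% Ergodicity: for a measure-preserving flow \phi_t on a phase space
% (\Gamma, \mu), the infinite time average of an observable f equals
% its phase (ensemble) average, for \mu-almost every initial point x:
\[
  \lim_{T \to \infty} \frac{1}{T}\int_0^T f(\phi_t x)\, dt
  \;=\;
  \int_\Gamma f \, d\mu
  \qquad \text{for } \mu\text{-a.e. } x \in \Gamma .
\]
% The abstract's complaint: this is too strong (real systems are likely
% not ergodic) and too weak (the limit says nothing about the
% time-scale on which equilibrium is approached).
```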
This paper surveys the issue of relativistic causality within the framework of algebraic quantum field theory (AQFT). In doing so, we distinguish various notions of causality formulated in the literature and study their relationships, and thereby we offer what we hope to be a useful taxonomy. We propose that the most direct expression of relativistic causality in AQFT is captured not by the spectrum condition but rather by the axiom of local primitive causality, in that it entails a form of local determinism for quantum fields which generalizes the constraint of no superluminal propagation of classical field theories to relativistic quantum field theory. We discuss the status of the axiom of micro-causality by locating its place within a large family of separability/independence/locality conditions developed for AQFT and also by relating it to so-called no-signalling theorems. And we also provide a critical survey of attempts to understand the implications for relativistic causality of the distant correlations endemic to the states in models of AQFT satisfying the standard axioms, and we provide an assessment of attempts to employ Reichenbach's common cause principle in AQFT to defuse worries that these distant correlations implicate direct causal connections between relatively spacelike events.
The constrained Hamiltonian formalism is recommended as a means for getting a grip on the concepts of gauge and gauge transformation. This formalism makes it clear how the gauge concept is relevant to understanding Newtonian and classical relativistic theories as well as the theories of elementary particle physics; it provides an explication of the vague notions of "local" and "global" gauge transformations; it explains how and why a fibre bundle structure emerges for theories which do not wear their bundle structure on their sleeves; it illuminates the connections of the gauge concept to issues of determinism and what counts as a genuine "observable"; and it calls attention to problems which arise in attempting to quantize gauge theories. Some of the limitations and problematic aspects of the formalism are also discussed.
A criterion is proposed to ensure that classical relativistic fields do not propagate superluminally. If this criterion does indeed serve as a sufficient condition for no superluminal propagation it follows that various other criteria found in the physics literature cannot serve as necessary conditions since they can fail although the proffered condition holds. The rejected criteria rely on energy conditions that are believed to hold for most classical fields used in actual applications. But these energy conditions are known to fail at small scales for quantum fields. It is argued that such a failure is not necessarily a cause for concern about superluminal propagation in the quantum regime since the proffered criterion of no superluminal propagation for classical fields has a natural analog for quantum fields and, further, this quantum analog condition provably holds for some quantum fields despite the violation of energy conditions. The apparatus developed here also offers a different approach to treating the Reichenbach-Salmon cases of "pseudo-causal processes" and helps to clarify the issue of whether relativity theory is consistent with superluminal propagation.
David Lewis' "Principal Principle" is a purported principle of rationality connecting credence and objective chance. Almost all of the discussion of the Principal Principle in the philosophical literature assumes classical probability theory, which is unfortunate since the theory of modern physics that, arguably, speaks most clearly of objective chance is the quantum theory, and quantum probabilities are not classical probabilities. This paper develops an account of how chance works in quantum theory that reveals a connection between credence and quantum chance quite unlike what is envisioned in the philosophical literature: as a theorem of quantum probability, updating a completely additive chance function on a knowledge of chance brings credence into line with chance. The account also suggests a way of construing the Humean supervenience of chance that has the virtue of dissolving some puzzles about the "undermining" of chances. A number of interpretative moves in quantum theory are needed to generate the account of quantum chance on offer here, and they can all be disputed. But engaging in these disputes is part and parcel of naturalized metaphysics, and as such it can be more productive than engaging in the battle of intuitions among analytical metaphysicians about how chance ought to work in this and other possible worlds.
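For reference, Lewis's Principal Principle in its classical form is the following constraint; the notation (Cr for credence, ch for chance) is the standard one from the philosophical literature, not a quotation from this paper.

```latex
% Principal Principle: a rational agent's initial credence Cr in a
% proposition A, given that the objective chance of A is x together
% with any admissible evidence E, should equal x:
\[
  Cr\bigl(A \,\big|\, ch(A) = x \;\&\; E\bigr) \;=\; x .
\]
% The paper's point is that when ch is a quantum (non-classical)
% probability, the analogous alignment of credence with chance falls
% out as a theorem rather than a freestanding rationality postulate.
```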
The overarching goal of this paper is to elucidate the nature of superselection rules in a manner that is accessible to philosophers of science and that brings out the connections between superselection and some of the most fundamental interpretational issues in quantum physics. The formalism of von Neumann algebras is used to characterize three different senses of superselection rules and to provide useful necessary and sufficient conditions for each sense. It is then shown how the Haag–Kastler algebraic approach to quantum physics holds the promise of a uniform and comprehensive account of the origin of superselection rules. Some of the challenges that must be met before this promise can be kept are discussed. The focus then turns to the role of superselection rules in solutions to the measurement problem and the emergence of classical properties. It is claimed that the role for “hard” superselection rules is limited, but “soft” superselection rules or N. P. Landsman’s situational superselection rules may have a major role to play. Finally, an assessment is given of the recently revived attempts to deconstruct superselection rules.
In Part I, we presented and motivated a new formulation of Humean Supervenience about Laws of Nature (HS). Here in Part II, we present an epistemological argument in defense of HS, thus formulated. Our contention is that one can combine a modest realism about laws of nature with a proper recognition of the importance of empirical testability in the epistemology of science only if one accepts HS.
Except for a few brief periods, Einstein was uninterested in analysing the nature of the spacetime singularities that appeared in solutions to his gravitational field equations for general relativity. The existence of such monstrosities reinforced his conviction that general relativity was an incomplete theory which would be superseded by a singularity-free unified field theory. Nevertheless, on a number of occasions between 1916 and the end of his life, Einstein was forced to confront singularities. His reactions show a strange asymmetry: he tended to be more disturbed by merely apparent singularities and less disturbed by real singularities. Einstein had strong a priori ideas about what results a correct physical theory should deliver. In the process of searching through theoretical possibilities, he tended to push aside technical problems and jump over essential difficulties. Sometimes this method of working produced brilliant new ideas—such as the Einstein–Rosen bridge—and sometimes it led him to miss important implications of his theory of gravity—such as gravitational collapse.
It is argued that seemingly “merely technical” issues about the existence and uniqueness of self-adjoint extensions of symmetric operators in quantum mechanics have interesting implications for foundations problems in classical and quantum physics. For example, pursuing these technical issues reveals a sense in which quantum mechanics can cure some of the forms of indeterminism that crop up in classical mechanics; and at the same time it reveals the possibility of a form of indeterminism in quantum mechanics that is quite distinct from the indeterminism of state vector collapse. More generally, the examples considered indicate that the classical–quantum correspondence is more intricate and delicate than is generally appreciated. The aim of the article is to give a series of examples that reveal why the technical issues about self-adjointness are relevant to the philosophy of science and that help to make the issues accessible to philosophers of science.
We discuss the intertwined topics of Fulling non‐uniqueness and the Unruh effect. The Fulling quantization, which is in some sense the natural one for an observer uniformly accelerated through Minkowski spacetime to adopt, is often heralded as a quantization of the Klein‐Gordon field which is both physically relevant and unitarily inequivalent to the standard Minkowski quantization. We argue that the Fulling and Minkowski quantizations do not constitute a satisfactory example of physically relevant, unitarily inequivalent quantizations, and indicate what it would take to settle the open question of whether a satisfactory example exists. A popular gloss on the Unruh effect has it that an observer uniformly accelerated through the Minkowski vacuum experiences a thermal flux of Rindler quanta. Taking the Unruh effect, so glossed, to establish that the notion of particle must be relativized to a reference frame, some would use it to demote the particle concept from fundamental status. We explain why technical results do not support the popular gloss and why the attempted demotion of the particle concept is both unsuccessful and unnecessary. Fulling non‐uniqueness and the Unruh effect merit attention despite these negative verdicts because they provide excellent vehicles for illustrating key concepts of quantum field theory and for probing foundational issues of considerable philosophical interest.
The standard theory of computation excludes computations whose completion requires an infinite number of steps. Malament-Hogarth spacetimes admit observers whose pasts contain entire future-directed, timelike half-curves of infinite proper length. We investigate the physical properties of these spacetimes and ask whether they and other spacetimes allow the observer to know the outcome of a computation with infinitely many steps.
Stephen Hawking has argued that universes containing evaporating black holes can evolve from pure initial states to mixed final ones. Such evolution is non-unitary and so contravenes fundamental quantum principles on which Hawking's analysis was based. It disables the retrodiction of the universe's initial state from its final one, and portends the time-asymmetry of quantum gravity. Small wonder that Hawking's paradox has met with considerable resistance. Here we use a simple result for C*-algebras to offer an argument for pure-to-mixed state evolution in black hole evaporation, and review responses to the Hawking paradox with respect to how effectively they rebut this argument.
We discuss the relationship between the interpretative problems of quantum gravity and those of general relativity. We argue that classical and quantum theories of gravity resuscitate venerable philosophical questions about the nature of space, time, and change; and that the resolution of some of the difficulties facing physicists working on quantum theories of gravity would appear to require philosophical as well as scientific creativity.
A vast amount of ink has been spilled in both the physics and the philosophy literature on the measurement problem in quantum mechanics. Important as it is, this problem is but one aspect of the more general issue of how, if at all, classical properties can emerge from the quantum descriptions of physical systems. In this paper we will study another aspect of the more general issue: the emergence of classical chaos, which has been receiving increasing attention from physicists but which has largely been neglected by philosophers of science.
The purpose of this paper is to give a brief survey the implications of the theories of modern physics for the doctrine of determinism. The survey will reveal a curious feature of determinism: in some respects it is fragile, requiring a number of enabling assumptions to give it a ﬁghting chance; but in other respects it is quite robust and very diﬃcult to kill. The survey will also aim to show that, apart from its own intrinsic interest, determinism is an (...) excellent device for probing the foundations of classical, relativistic, and quantum physics. The survey is conducted under three major presuppositions. First, I take a realistic attitude towards scientiﬁc theories in that I assume that to give an interpretation of a theory is, at a minimum, to specify what the world would have to be like in order for the theory to be true. But we will see that the demand for a deterministic interpretation of a theory can force us to abandon a naively realistic reading of the theory. Second, I reject the “no laws” view of science and assume that the ﬁeld equations or laws of motion of the most fundamental theories of current physics represent science’s best guesses as to the form of the basic laws of nature. Third, I take determinism to be an ontological doctrine, a doctrine about the temporal evolution of the world. This ontological doctrine must not be confused with predictability, which is an epistemological doctrine, the failure of which need not entail a failure of determinism. From time to time I will comment on ways in which predictability can fail in a deterministic setting. Finally, my survey will concentrate on the Laplacian variety of determinism according to which the instantaneous state of the world at any time uniquely determines the state at any other time. The plan of the survey is as follows. Section 2 illustrates the fragility of determinism by means of a Zeno type example. 
Then sections 3 and 4 survey successively the fortunes of determinism in the Newtonian and the special relativistic settings.
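The Laplacian variety of determinism described above admits a standard model-theoretic gloss; the following formulation is the one customary in this literature, not a quotation from the paper:

```latex
% A theory T is Laplacian deterministic iff any two of its physically
% possible worlds W, W' that agree on the instantaneous state S at one
% time t_0 agree on the state at every time t:
\forall W, W' \in \mathrm{Mod}(T):\quad
  S_{W}(t_{0}) = S_{W'}(t_{0}) \;\Longrightarrow\; \forall t\;\; S_{W}(t) = S_{W'}(t)
```

On this formulation the failure of predictability is compatible with determinism: nothing in the condition requires that the map from $S(t_0)$ to $S(t)$ be computable or knowable.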
In this second part of our two-part paper we review and analyse attempts since 1950 to use information theoretic notions to exorcise Maxwell’s Demon. We argue through a simple dilemma that these attempted exorcisms are ineffective, whether they follow Szilard in seeking a compensating entropy cost in information acquisition or Landauer in seeking that cost in memory erasure. In so far as the Demon is a thermodynamic system already governed by the Second Law, no further supposition about information and entropy is needed to save the Second Law. In so far as the Demon fails to be such a system, no supposition about the entropy cost of information acquisition and processing can save the Second Law from the Demon.
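The Landauer route mentioned above locates the compensating entropy cost in memory erasure; the usual statement of that bound, quoted here for orientation rather than as an endorsement of the exorcism, is:

```latex
% Landauer's principle: erasing one bit of memory in contact with an
% environment at temperature T increases the environment's entropy by at least
\Delta S_{\mathrm{env}} \;\ge\; k_{B} \ln 2,
% equivalently, heat of at least
Q \;\ge\; k_{B} T \ln 2
% is dissipated per bit erased.
```

The dilemma in the paper targets precisely the status of this bound: if the Demon is already a thermodynamic system, the bound is redundant for saving the Second Law; if not, it is unavailing.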
Schrödinger averred that entanglement is the characteristic trait of quantum mechanics. The first part of this paper is simultaneously an exploration of Schrödinger’s claim and an investigation into the distinction between mere entanglement and genuine quantum entanglement. The typical discussion of these matters in the philosophical literature neglects the structure of the algebra of observables, implicitly assuming a tensor product structure of the simple Type I factor algebras used in ordinary Quantum Mechanics (QM). This limitation is overcome by adopting the algebraic approach to quantum physics, which allows a uniform treatment of ordinary QM, relativistic quantum field theory, and quantum statistical mechanics. The algebraic apparatus helps to distinguish several different criteria of quantum entanglement and to prove results about the relation of quantum entanglement to two additional ways of characterizing the classical versus quantum divide, viz. abelian versus non-abelian algebras of observables, and the ability versus inability to interrogate the system without disturbing it. Schrödinger’s claim is reassessed in the light of this discussion. The second part of the paper deals with the relativity-to-ambiguity threat: the entanglement of a state on a system algebra is entanglement of the state relative to a decomposition of the system algebra into subsystem algebras; a state may be entangled with respect to one decomposition but not another; hence, unless there is some principled way to choose a decomposition, entanglement is a radically ambiguous notion. The problem is illustrated in terms of a Realist versus Pragmatist debate, the former claiming that the decomposition must correspond to real as opposed to virtual subsystems, while the latter claims that the real versus virtual distinction is bogus and that practical considerations can steer the choice of decomposition. 
This debate is applied to the fraught problem of measuring entanglement for indistinguishable particles. The paper ends with some remarks about claims in the philosophical literature that entanglement undermines the separability or independence of subsystems while promoting holism.
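For orientation, the criterion of entanglement implicitly assumed in the tensor-product setting that the paper criticizes is Werner's: a state $\rho$ on $\mathcal{H}_A \otimes \mathcal{H}_B$ is separable iff it is a convex mixture of product states, and entangled otherwise:

```latex
\rho \ \text{separable} \iff
  \rho \;=\; \sum_{i} p_{i}\, \rho_{i}^{A} \otimes \rho_{i}^{B},
  \qquad p_{i} \ge 0,\;\; \sum_{i} p_{i} = 1
```

The algebraic criteria discussed in the paper generalize this condition beyond Type I factors, where a tensor-product decomposition of the Hilbert space need not be available.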
It is argued that the main problem with "the problem of the direction of time" is to figure out what the problem is or is supposed to be. Towards this end, an attempt is first made to disentangle and to classify some of the many issues which have been discussed under the label of 'the direction of time'. Second, some technical apparatus is introduced in the hope of producing a sharper formulation of the issues than they have received in the philosophical literature. Finally, some tentative suggestions about the central issues are offered. In particular, it is suggested that entropy and irreversibility are much less crucial to the central issues than most philosophers would have us believe. This suggestion is not made because of any firm conviction of its correctness but rather because it helps to focus the discussion on some basic but long neglected assumptions which underlie traditional approaches.
The cosmological constant is back. Several lines of evidence point to the conclusion that either there is a positive cosmological constant or else the universe is filled with a strange form of matter (“quintessence”) that mimics some of the effects of a positive lambda. This paper investigates the implications of the former possibility. Two senses in which the cosmological constant can be a constant are distinguished: the capital Λ sense in which lambda is a universal constant on a par with the charge of the electron, and the lower case λ sense in which lambda is a humble constant of integration. The latter interpretation has been touted as the means to a solution to various problems in physics. These claims are critically examined with an eye to discerning the implications for philosophy of science and foundations of physics.
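The two senses contrasted here correspond to two placements of lambda in the field equations. The sketch below follows the standard unimodular-gravity route to the lower case λ reading; it is offered as background, not as the paper's own derivation:

```latex
% Capital \Lambda: a fixed term in the law itself
R_{\mu\nu} - \tfrac{1}{2} R\, g_{\mu\nu} + \Lambda\, g_{\mu\nu} = 8\pi G\, T_{\mu\nu}
% Lower case \lambda (unimodular route): postulate only the trace-free part
R_{\mu\nu} - \tfrac{1}{4} R\, g_{\mu\nu} = 8\pi G \left( T_{\mu\nu} - \tfrac{1}{4} T\, g_{\mu\nu} \right)
% The Bianchi identity together with \nabla^{\mu} T_{\mu\nu} = 0 then forces
% R + 8\pi G\, T = \text{const} \equiv 4\lambda,
% which recovers the first equation with \lambda a mere constant of integration.
```

On the first reading lambda is a universal constant of nature fixed by the law; on the second it is a contingent feature of the particular solution, on a par with other initial-value data.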
The central problem with Bayesian philosophy of science is that it cannot take account of the relevance of simplicity and unification to confirmation, induction, and scientific inference. The standard Bayesian folklore of factoring simplicity into the prior probabilities, and of appealing to convergence theorems to ground their objectivity, supplies some of the myths that Earman's book does not address adequately.
The importance of the Unruh effect lies in the fact that, together with the related Hawking effect, it serves to link the three main branches of modern physics: thermal/statistical physics, relativity theory/gravitation, and quantum physics. However, different researchers can have in mind different phenomena when they speak of “the Unruh effect” in flat spacetime and its generalization to curved spacetimes. Three different approaches are reviewed here. They are shown to yield results that are sometimes concordant and sometimes discordant. The discordance is disconcerting only if one insists on taking literally the definite article in “the Unruh effect.” It is argued that the role of linking different branches of physics is better served by taking “the Unruh effect” to designate a family of related phenomena. The relation between the Hawking effect and the generalized Unruh effect for curved spacetimes is briefly discussed.
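One phenomenon commonly gathered under the label "the Unruh effect" in flat spacetime is that a uniformly accelerated detector responds as if immersed in a thermal bath at the Unruh temperature. With units restored, the standard expression is:

```latex
% Unruh temperature for a detector with proper acceleration a:
T_{U} \;=\; \frac{\hbar\, a}{2\pi\, c\, k_{B}}
```

The linking role the abstract describes is visible in the formula itself: $\hbar$ from quantum physics, $c$ and the proper acceleration $a$ from relativity, and $k_B$ from thermal/statistical physics all appear in a single expression.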