Harvey Brown’s Physical Relativity defends a view, the dynamical perspective, on the nature of spacetime that goes beyond the familiar dichotomy of substantivalist/relationist views. A full defense of this view requires attention to the way that our use of spacetime concepts connects with the physical world. Reflection on such matters, I argue, reveals that the dynamical perspective affords the only possible view about the ontological status of spacetime, in that putative rivals fail to express anything, either true or false. I conclude with remarks aimed at clarifying what is and isn’t in dispute with regard to the explanatory priority of spacetime and dynamics, at countering an objection raised by John Norton to views of this sort, and at clarifying the relation between background and effective spacetime structure.
This chapter will review selected aspects of the terrain of discussions about probabilities in statistical mechanics (with no pretensions to exhaustiveness, though the major issues will be touched upon), and will argue for a number of claims. None of the claims to be defended is entirely original, but all deserve emphasis. The first, and least controversial, is that probabilistic notions are needed to make sense of statistical mechanics. The reason for this is the same reason that convinced Maxwell, Gibbs, and Boltzmann that probabilities would be needed, namely, that the second law of thermodynamics, which in its original formulation says that certain processes are impossible, must, on the kinetic theory, be replaced by a weaker formulation according to which what the original version deems impossible is merely improbable. Second is that we ought not take the standard measures invoked in equilibrium statistical mechanics as giving, in any sense, the correct probabilities about microstates of the system. We can settle for a much weaker claim: that the probabilities for outcomes of experiments yielded by the standard distributions are effectively the same as those yielded by any distribution that we should take as representing probabilities over microstates. Lastly (and most controversially): in asking about the status of probabilities in statistical mechanics, the familiar dichotomy between epistemic probabilities (credences, or degrees of belief) and ontic (physical) probabilities is insufficient; the concept of probability that is best suited to the needs of statistical mechanics is one that combines epistemic and physical considerations.
Much of the discussion of the metaphysics of quantum mechanics focusses on the status of wavefunctions. This paper is about how to think about wavefunctions, when we bear in mind that quantum mechanics—that is, the nonrelativistic quantum theory of systems of a fixed, finite number of degrees of freedom—is not a fundamental theory, but arises, in a certain approximation, valid in a limited regime, from a relativistic quantum field theory. We will explicitly show how the wavefunctions of quantum mechanics, and the configuration spaces on which they are defined, are constructed from a relativistic quantum field theory. Two lessons will be drawn from this. The first is that configuration spaces are not fundamental, but rather are derivative of structures defined on ordinary spacetime. The second is that wavefunctions are not as much like classical fields as might first appear, in that, on the most natural way of constructing wavefunctions from quantum field-theoretic quantities, the value assigned to a point in configuration space is not a local fact about that point, but rather, depends on the global state.
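A minimal sketch of the kind of construction at issue (an illustration under simplifying assumptions, for a free scalar field, and not necessarily the precise construction used in the paper): given a field-theoretic state $|\Psi\rangle$, an n-particle wavefunction can be obtained as

$$\psi(\mathbf{x}_1,\dots,\mathbf{x}_n; t) \;=\; \langle 0 |\, \hat{\phi}^{+}(\mathbf{x}_1,t)\cdots\hat{\phi}^{+}(\mathbf{x}_n,t)\, |\Psi\rangle,$$

where $\hat{\phi}^{+}$ is the positive-frequency part of the field operator and $|0\rangle$ is the vacuum. On a construction of this sort, the value of $\psi$ at a point of configuration space is an expectation-value-like quantity involving the global state $|\Psi\rangle$, which is the sense in which it is not a local fact about that point.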
This paper discusses two senses in which a hypothesis may be said to unify evidence. One is the ability of the hypothesis to increase the mutual information of a set of evidence statements; the other is the ability of the hypothesis to explain commonalities in observed phenomena by positing a common origin for them. On Bayesian updating, it is only mutual information unification that contributes to the incremental support of a hypothesis by the evidence unified. This poses a challenge for the view that explanation is a confirmatory virtue that contributes to such incremental support; its advocates must ground it in some relevant difference between humans and a Bayesian agent.
The standard treatment of conditional probability leaves conditional probability undefined when the conditioning proposition has zero probability. Nonetheless, some find the option of extending the scope of conditional probability to include zero-probability conditions attractive or even compelling. This article reviews some of the pitfalls associated with this move, and concludes that, for the most part, probabilities conditional on zero-probability propositions are more trouble than they are worth.
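For reference, the standard treatment referred to is the ratio definition,

$$P(A \mid B) \;=\; \frac{P(A \wedge B)}{P(B)},$$

which is defined only when $P(B) > 0$. The proposals at issue extend conditional probability to cases such as conditioning on a continuous quantity taking a single exact value, where $P(B) = 0$ and the ratio is undefined.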
A Bayesian account of the virtue of unification is given. On this account, the ability of a theory to unify disparate phenomena consists in the ability of the theory to render such phenomena informationally relevant to each other. It is shown that such ability contributes to the evidential support of the theory, and hence that preference for theories that unify the phenomena need not, on a Bayesian account, be built into the prior probabilities of theories.
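To make the informational-relevance idea concrete (this is an illustrative formulation; the paper's own measure may differ in detail), write the mutual information of $E_1$ and $E_2$ as $I(E_1;E_2) = \log \frac{P(E_1 \wedge E_2)}{P(E_1)P(E_2)}$, and similarly $I_H(E_1;E_2)$ with all probabilities conditioned on the hypothesis $H$. A short calculation gives

$$\log \frac{P(E_1 \wedge E_2 \mid H)}{P(E_1 \wedge E_2)} \;=\; \log \frac{P(E_1 \mid H)}{P(E_1)} \;+\; \log \frac{P(E_2 \mid H)}{P(E_2)} \;+\; I_H(E_1;E_2) \;-\; I(E_1;E_2),$$

so the support that the conjoined phenomena lend to $H$ exceeds the sum of the individual supports by exactly the amount by which $H$ increases their mutual information. No prior bias toward unifying theories is needed for unification to matter evidentially.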
One finds, in Maxwell's writings on thermodynamics and statistical physics, a conception of the nature of these subjects that differs in interesting ways from the way that they are usually conceived. In particular, though—in agreement with the currently accepted view—Maxwell maintains that the second law of thermodynamics, as originally conceived, cannot be strictly true, the replacement he proposes is different from the version accepted by most physicists today. The modification of the second law accepted by most physicists is a probabilistic one: although statistical fluctuations will result in occasional spontaneous differences in temperature or pressure, there is no way to predictably and reliably harness these to produce large violations of the original version of the second law. Maxwell advocates a version of the second law that is strictly weaker: the validity of even this probabilistic version is limited to situations in which we are dealing with large numbers of molecules en masse and have no ability to manipulate individual molecules. Connected with this is his conception of the thermodynamic concepts of heat, work, and entropy; on the Maxwellian view, these are concepts that must be relativized to the means we have available for gathering information about and manipulating physical systems. The Maxwellian view is one that deserves serious consideration in discussions of the foundations of statistical mechanics. It has relevance for the project of recovering thermodynamics from statistical mechanics because, in such a project, it matters which version of the second law we are trying to recover.
Andrew Wayne discusses some recent attempts to account, within a Bayesian framework, for the "common methodological adage" that "diverse evidence better confirms a hypothesis than does the same amount of similar evidence". One of the approaches considered by Wayne is that suggested by Howson and Urbach and dubbed the "correlation approach" by Wayne. This approach is, indeed, incomplete, in that it neglects the role of the hypothesis under consideration in determining what diversity in a body of evidence is relevant diversity. In this paper, it is shown how this gap can be filled, resulting in a more satisfactory account of the evidential role of diversity of evidence. In addition, it is argued that Wayne's criticism of the correlation approach does not indicate a serious flaw in the approach.
In this paper, it is argued that the prima facie conflict between special relativity and the quantum-mechanical collapse postulate is only apparent, and that the seemingly incompatible accounts of entangled systems undergoing collapse yielded by different reference frames can be regarded as no more than differing accounts of the same processes and events. Attention to the transformation properties of quantum-mechanical states undergoing unitary, non-collapse evolution points the way to a treatment of collapse evolution consistent with the demands of relativity.
In this paper, a concept of chance is introduced that is compatible with deterministic physical laws, yet does justice to our use of chance-talk in connection with typical games of chance. We take our cue from what Poincaré called "the method of arbitrary functions," and elaborate upon a suggestion made by Savage in connection with this. Comparison is made between this notion of chance and David Lewis' conception.
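A toy illustration of the method of arbitrary functions (a sketch of my own, not drawn from the paper): a wheel with many thin alternating red and black sectors is spun with an initial speed drawn from various smooth probability densities. So long as the density is smooth on the scale set by the sectors, the probability of "red" comes out close to 1/2, whichever density is used.

```python
# Toy model of Poincaré's "method of arbitrary functions" (illustrative sketch).
# A wheel with many thin alternating red/black sectors is spun; the total angle
# turned is proportional to the initial speed. For any reasonably smooth density
# over initial speeds, P(red) comes out close to 1/2.
import numpy as np

rng = np.random.default_rng(0)
N_SECTORS = 200        # number of alternating sectors (even)
SPIN_FACTOR = 50.0     # total angle turned = SPIN_FACTOR * initial speed

def prob_red(speeds):
    """Estimate P(red) for a sample of initial speeds."""
    final_angles = (SPIN_FACTOR * speeds) % (2 * np.pi)
    sector = np.floor(final_angles / (2 * np.pi / N_SECTORS)).astype(int)
    return np.mean(sector % 2 == 0)   # even-numbered sectors count as "red"

n = 1_000_000
for label, speeds in [
    ("uniform on [1, 3]", rng.uniform(1.0, 3.0, n)),
    ("normal(mean=2, sd=0.5)", rng.normal(2.0, 0.5, n)),
    ("exponential(mean=2)", rng.exponential(2.0, n)),
]:
    print(f"{label:25s} P(red) ~ {prob_red(speeds):.3f}")
```

Each density gives an estimate near 0.5, illustrating why the outcome probabilities are insensitive to the "arbitrary function" chosen for the initial condition.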
In this chapter, I will discuss what it takes for a dynamical collapse theory to provide a reasonable description of the actual world. I will start with discussions of what is required, in general, of the ontology of a physical theory, and then apply these considerations to the quantum case. One issue of interest is whether a collapse theory can be a quantum state monist theory, adding nothing to the quantum state and changing only its dynamics. Although this was one of the motivations for advancing such theories, its viability has been questioned, and it has been argued that, in order to provide an account of the world, a collapse theory must supplement the quantum state with additional ontology, making such theories more like hidden-variables theories than would first appear. I will make a case for quantum state monism as an adequate ontology, and, indeed, the only sensible ontology for collapse theories. This will involve taking dynamical variables to possess, not sharp values, as in classical physics, but distributions of values.
In a recent paper, David Albert has suggested that no quantum theory can yield a description of the world unfolding in Minkowski spacetime. This conclusion is premature; a natural extension of Stein's notion of becoming in Minkowski spacetime to accommodate the demands of quantum nonseparability yields such an account, an account that is in accord with a proposal which was made by Aharonov and Albert but which is dismissed by Albert as a ‘mere trick’. The nature of such an account is clarified by an extension to a relativistic quantum context of David Lewis' picture of objective chances evolving in time. 1 Introduction 2 Classical relativistic becoming 3 Relativistic quantum becoming, without collapse 4 Relativistic quantum becoming, with collapse 5 Objective chance, conditional probability, and definite properties 6 The nature of the wave function 7 Conclusion.
This article examines the implications of the holonomy interpretation of classical electromagnetism. As has been argued by Richard Healey and Gordon Belot, classical electromagnetism on this interpretation evinces a form of nonseparability, something that otherwise might have been thought of as confined to nonclassical physics. Consideration of the differences between this classical nonseparability and quantum nonseparability shows that the nonseparability exhibited by classical electromagnetism on the holonomy interpretation is closer to separability than might at first appear.
In addition to purely practical values, cognitive values also figure into scientific deliberations. One way of introducing cognitive values is to consider the cognitive value that accrues to the act of accepting a hypothesis. Although such values may have a role to play, such a role does not exhaust the significance of cognitive values in scientific decision-making. This paper makes a plea for consideration of epistemic value—that is, value attaching to a state of belief—and defends the notion of cognitive epistemic value against some criticisms that have been raised. A stability requirement for epistemic value functions is argued for on the basis of considerations of diachronic coherence. This stability requirement is sufficient to obtain the Value of Learning Theorem, which says that the expected utility of cost-free learning cannot be negative. This holds also for cognitive epistemic values, provided that the stability requirement is met.
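The theorem invoked is, in outline, the familiar result that, for a coherent agent who updates by conditionalization, deciding after a cost-free observation is, in expectation, at least as good as deciding without it:

$$\sum_{e} P(e)\, \max_{a} \sum_{s} P(s \mid e)\, U(a,s) \;\;\ge\;\; \max_{a} \sum_{s} P(s)\, U(a,s),$$

with equality only if the same act is optimal whatever the observation's outcome. The point of the stability requirement argued for here is that the analogous inequality continues to hold when $U$ encodes cognitive epistemic value rather than practical utility.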
A proof is given, at a greater level of generality than previous 'no-go' theorems, of the impossibility of formulating a modal interpretation that exhibits 'serious' Lorentz invariance at the fundamental level. Particular attention is given to modal interpretations of the type proposed by Bub.
This paper addresses the question of how we should regard the probability distributions introduced into statistical mechanics. It will be argued that it is problematic to take them either as purely ontic, or purely epistemic. I will propose a third alternative: they are almost objective probabilities, or epistemic chances. The definition of such probabilities involves an interweaving of epistemic and physical considerations, and thus they cannot be classified as either purely epistemic or purely ontic. This conception, it will be argued, resolves some of the puzzles associated with statistical mechanical probabilities: it explains how probabilistic posits introduced on the basis of incomplete knowledge can yield testable predictions, and it also bypasses the problem of disastrous retrodictions, that is, the fact that the standard equilibrium measures yield high probability of the system being in equilibrium in the recent past, even when we know otherwise. As the problem does not arise on the conception of probabilities considered here, there is no need to invoke a Past Hypothesis as a special posit to avoid it.
Quantum information theory has given rise to a renewed interest in, and a new perspective on, the old issue of understanding the ways in which quantum mechanics differs from classical mechanics. The task of distinguishing between quantum and classical theory is facilitated by neutral frameworks that embrace both classical and quantum theory. In this paper, I discuss two approaches to this endeavour, the algebraic approach, and the convex set approach, with an eye to the strengths of each, and the relations between the two. I end with a discussion of one particular model, the toy theory devised by Rob Spekkens, which, with minor modifications, fits neatly within the convex sets framework, and which displays in an elegant manner some of the similarities and differences between classical and quantum theories. The conclusion suggested by this investigation is that Schrödinger was right to find the essential difference between classical and quantum theory in their handling of composite systems, though Schrödinger's contention that it is entanglement that is the distinctive feature of quantum mechanics needs to be modified.
Recent literature on Bohm's alternative to mainstream quantum mechanics may create the misleading impression that, except for perfunctory dismissals, the theory was ignored by the physics community in the years immediately following its proposal. As a matter of fact, Einstein, Pauli, and Heisenberg all published criticisms of Bohm's theory, explaining their reasons for not accepting the theory. These criticisms will be discussed and evaluated in this article.
The Akaike Information Criterion can be a valuable tool of scientific inference. This statistic, or any other statistical method for that matter, cannot, however, be the whole of scientific methodology. In this paper some of the limitations of Akaikean statistical methods are discussed. It is argued that the full import of empirical evidence is realized only by adopting a richer ideal of empirical success than predictive accuracy, and that the ability of a theory to turn phenomena into accurate, agreeing measurements of causally relevant parameters contributes to the evidential support of the theory. This is illustrated by Newton's argument from orbital phenomena to the inverse-square law of gravitation.
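For reference, the statistic in question is standardly defined as

$$\mathrm{AIC} \;=\; 2k \,-\, 2 \ln \hat{L},$$

where $k$ is the number of adjustable parameters of the model and $\hat{L}$ is its maximized likelihood on the data; lower values are preferred, so the criterion trades goodness of fit against simplicity in the service of estimated predictive accuracy.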
In this paper, the issues of computability and constructivity in the mathematics of physics are discussed. The sorts of questions to be addressed are those which might be expressed, roughly, as: Are the mathematical foundations of our current theories unavoidably non-constructive? Or: Are the laws of physics computable?
Fifty years after the publication of Bell's theorem, there remains some controversy regarding what the theorem is telling us about quantum mechanics, and what the experimental violations of Bell inequalities are telling us about the world. This chapter represents my best attempt to be clear about what I think the lessons are. In brief: there is some sort of nonlocality inherent in any quantum theory, and, moreover, in any theory that reproduces, even approximately, the quantum probabilities for the outcomes of experiments. But not all forms of nonlocality are the same; there is a distinction to be made between action at a distance and other forms of nonlocality, and I will argue that the nonlocality required to violate the Bell inequalities need not involve action at a distance. Furthermore, the distinction between forms of nonlocality makes a difference when it comes to compatibility with relativistic causal structure.
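The canonical example of the inequalities at issue is the CHSH inequality: for any local hidden-variables account of a two-setting, two-outcome experiment on a pair of systems, the correlation functions satisfy

$$\left| E(a,b) + E(a,b') + E(a',b) - E(a',b') \right| \;\le\; 2,$$

whereas quantum mechanics predicts, and experiment confirms, values as large as $2\sqrt{2}$ for suitably chosen settings and an entangled state.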
There has been a long-standing and sometimes passionate debate between physicists over whether a dynamical framework for quantum systems should incorporate not completely positive (NCP) maps in addition to completely positive (CP) maps. Despite the reasonableness of the arguments for complete positivity, we argue that NCP maps should be allowed, with a qualification: these should be understood, not as reflecting ‘not completely positive’ evolution, but as linear extensions, to a system’s entire state space, of CP maps that are only partially defined. Beyond the domain of definition of a partial-CP map, we argue, much may be permitted.
Earman and Ruetsche ([2005]) have cast their gaze upon existing no-go theorems for relativistic modal interpretations, and have found them inconclusive. They suggest that it would be more fruitful to investigate modal interpretations proposed for "really relativistic theories," that is, algebraic relativistic quantum field theories. They investigate the proposal of Clifton ([2000]), and extend Clifton's result that, for a host of states, his proposal yields no definite observables other than multiples of the identity. This leads Earman and Ruetsche to a suspicion that troubles for modal interpretations of such relativistic theories "are due less to the Poincaré invariance of relativistic QFT vs. the Galilean invariance of ordinary nonrelativistic QM than to the infinite number of degrees of freedom of former vs. the finite number of degrees of freedom of the latter" (577-78). I am skeptical of this suggestion. Though there are troubles for modal interpretations of a relativistic quantum field theory that are due to its being a field theory—that is, due to the infinitude of the degrees of freedom—they are not the only troubles faced by modal interpretations of quantum theories set in relativistic spacetime; there are also troubles traceable to relativistic causal structure.
Richard Pettigrew has recently advanced a justification of the Principle of Indifference on the basis of a principle that he calls “cognitive conservatism,” or “extreme epistemic conservatism.” However, the credences based on the Principle of Indifference, as Pettigrew formulates it, violate another desideratum, namely, that learning from experience be possible. If it is accepted that learning from experience should be possible, this provides grounds for rejecting cognitive conservatism. Another set of criteria considered by Pettigrew, which involves a weighted mean of worst-case and best-case accuracy, affords some learning, but not the sort that one would expect. This raises the question of whether accuracy-based considerations can be adapted to justify credence functions that permit induction.
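A standard illustration of the tension with learning (an illustration of the general point, not necessarily Pettigrew's own example): apply the Principle of Indifference directly to the $2^{n}$ possible sequences of $n$ coin tosses. Then, however long a run of heads has been observed,

$$P(H_{n+1} \mid H_1 \wedge \cdots \wedge H_n) \;=\; \frac{2^{-(n+1)}}{2^{-n}} \;=\; \frac{1}{2},$$

so the resulting credences never learn from experience about the coin's bias.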
This dissertation is an investigation into the degree to which the mathematics used in physical theories can be constructivized. The techniques of recursive function theory and classical logic are used to separate out the algorithmic content of mathematical theories rather than attempting to reformulate them in terms of "intuitionistic" logic. The guiding question is: are there experimentally testable predictions in physics which are not computable from the data? The nature of Church's thesis, that the class of effectively calculable functions on the natural numbers is identical to the class of general recursive functions, is discussed. It is argued that this thesis is an example of an explication of the very notion of an effectively calculable function. This is contrary to a view of the thesis as a hypothesis about the limitations of the human mind. The extension to functions of a real variable of the notion of effective calculability is discussed, and it is argued that a function of a real variable must be continuous in order to be considered effectively calculable. The relation between continuity and computability is significant for the problem at hand. The results of a well-designed experiment do not depend critically upon the precise values of the relevant parameters. Accordingly, if the solution to a problem in mathematical physics depends discontinuously upon the data, it cannot be regarded as an experimentally testable prediction of the theory. The principle that the testable predictions of a physical theory cannot be singular is known as the principle of regularity. This principle is significant, because discontinuities generate non-computability, but they also disqualify a prediction from being experimentally testable. A mathematical framework is set up for discussing computability in physical theories. This framework is then applied to the case of quantum mechanics. It is found that, due to the use of unbounded operators in the theory, noncomputable objects appear, but predictions which satisfy the principle of regularity are nevertheless computable functions of the data.
The impossibility of an indeterministic evolution for standard relativistic quantum field theories, that is, theories in which all fields satisfy the condition that the generators of space-time translation have spectra in the forward light-cone, is demonstrated. The demonstration proceeds by arguing that a relativistically invariant theory must have a stable vacuum and then showing that stability of the vacuum, together with the requirements imposed by relativistic causality, entails deterministic evolution, if all degrees of freedom are standard degrees of freedom.
This paper addresses the question of how we should regard the probability distributions introduced into statistical mechanics. It will be argued that it is problematic to take them either as purely subjective credences, or as objective chances. I will propose a third alternative: they are "almost objective" probabilities, or "epistemic chances". The definition of such probabilities involves an interweaving of epistemic and physical considerations, and so cannot be classified as either purely subjective or purely objective. This conception, it will be argued, resolves some of the puzzles associated with statistical mechanical probabilities; it explains how probabilistic posits introduced on the basis of incomplete knowledge can yield testable predictions, and it also bypasses the problem of disastrous retrodictions, that is, the fact that the standard equilibrium measures yield high probability of the system being in equilibrium in the recent past, even when we know otherwise. As the problem does not arise on the conception of probabilities considered here, there is no need to invoke a Past Hypothesis as a special posit to avoid it.
On April 1, 2016, at the Annual Meeting of the Pacific Division of the American Philosophical Association, a book symposium, organized by Alyssa Ney, was held in honor of David Albert’s After Physics. All participants agreed that it was a valuable and enlightening session. We have decided that it would be useful, for those who weren’t present, to make our remarks publicly available. Please bear in mind that what follows are remarks prepared for the session, and that on some points participants may have changed their minds in light of the ensuing discussion.
Part I Introduction
Passion at a Distance (Don Howard)

Part II Philosophy, Methodology and History
Balancing Necessity and Fallibilism: Charles Sanders Peirce on the Status of Mathematics and its Intersection with the Inquiry into Nature (Ronald Anderson)
Newton’s Methodology (William Harper)
Whitehead’s Philosophy and Quantum Mechanics (QM): A Tribute to Abner Shimony (Shimon Malin)
Bohr and the Photon (John Stachel)

Part III Bell’s Theorem and Nonlocality
A. Theory
Extending the Concept of an “Element of Reality” to Work with Inefficient Detectors (Daniel M. Greenberger)
A General Proof of Nonlocality without Inequalities for Bipartite States (GianCarlo Ghirardi and Luca Marinatto)
On the Separability of Physical Systems (Jon P. Jarrett)
Bell Inequalities: Many Questions, a Few Answers (Nicolas Gisin)
B. Experiment
Do Experimental Violations of Bell Inequalities Require a Nonlocal Interpretation of Quantum Mechanics? II: Analysis a la Bell (Edward S. Fry, Xinmei Qu, and Marlan O. Scully)
The Physics of 2 = 1 + 1 (Yanhua Shih)

Part IV Probability, Uncertainty, and Stochastic Modifications of Quantum Mechanics
Interpretations of Probability in Quantum Mechanics: A Case of “Experimental Metaphysics” (Geoffrey Hellman)
“No Information Without Disturbance”: Quantum Limitations of Measurement (Paul Busch)
How Stands Collapse II (Philip Pearle)
Is There a Relation Between the Breakdown of the Superposition Principle and an Indeterminacy in the Structure of the Einsteinian Space-Time? (Andor Frenkel)
Indistinguishability or Stochastic Dependence? (D. Costantini and U. Garibaldi)

Part V Relativity
Plane Geometry in Spacetime (N. David Mermin)
The Transient nows (Steven F. Savitt)
Quantum in Gravity? (Michael Horne)
A Proposed Test of the Local Causality of Spacetime (Adrian Kent)
Quantum Gravity Computers: On the Theory of Computation with Indefinite Causal Structure (Lucien Hardy)
“Definability,” “Conventionality,” and Simultaneity in Einstein–Minkowski Space-Time (Howard Stein)

Part VI Concluding Words
Bistro Banter: A Dialogue with Abner Shimony and Lee Smolin
Unfinished Work: A Bequest (Abner Shimony)
Bibliography of Abner Shimony.
There is a long tradition of thinking of thermodynamics, not as a theory of fundamental physics, but as a theory of how manipulations of a physical system may be used to obtain desired effects, such as mechanical work. On this view, the basic concepts of thermodynamics, heat and work, and with them, the concept of entropy, are relative to a class of envisaged manipulations. This article is a sketch and defense of a science of manipulations and their effects on physical systems. I call this science thermo-dynamics, or $\Theta\Delta^{\text{cs}}$ for short, to highlight that it may be different from the science of thermodynamics, as the reader conceives it. An upshot of the discussion is a clarification of the roles of the Gibbs and von Neumann entropies. Light is also shed on the use of coarse-grained entropies.
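For reference, the two entropies whose roles are clarified are

$$S_{G}[\rho] \;=\; -k_{B} \int \rho \ln \rho \; d\Gamma, \qquad S_{vN}[\hat{\rho}] \;=\; -k_{B}\, \mathrm{Tr}\!\left(\hat{\rho} \ln \hat{\rho}\right),$$

the Gibbs entropy of a classical probability density $\rho$ on phase space and the von Neumann entropy of a quantum density operator $\hat{\rho}$. On the manipulability-relative view sketched in the article, which density (or density operator, or coarse-graining of it) is the appropriate one depends on the class of manipulations envisaged.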
If some sort of dynamical collapse theory is correct, what might the world be like? Can a theory of that sort be a quantum state monist theory, or must such theories supplement the quantum state ontology with additional beables? In a previous work, I defended quantum state monism, with a distributional ontology along the lines advocated by Philip Pearle. In this chapter the account is extended to collapse theories in relativistic spacetimes.
As has been noted by several authors, in a relativistic context, there is an interesting difference between classical and quantum state evolution. For a classical system, a state history given along one foliation uniquely determines, without any consideration of the system’s dynamics, a state history along any other foliation. This is not true for quantum state evolution; there are cases in which a state history along one foliation is compatible with multiple distinct state histories along some other, a phenomenon that David Albert has dubbed “non-narratability.” In this article, we address the question of whether non-narratability is restricted to the sorts of special states that so far have been used to illustrate it. The results of the investigation suggest that there has been a misplaced emphasis on underdetermination of state histories; though this is generic for the special cases that have up until now been considered, involving bipartite systems in pure entangled states, it fails generically in cases in which more component systems are taken into account, and for bipartite systems that have some entanglement with their environment. For such cases, if we impose relativistic causality constraints on the evolution, then, except for very special states, a state history along one foliation uniquely determines a state history along any other. But this in itself is a marked difference between classical and quantum state evolution, because, in a classical setting, no considerations of dynamics at all are needed to go from a state history along one foliation to a state history along another.
There seems to be a growing consensus that any interpretation of quantum mechanics other than an instrumentalist interpretation will have to abandon the requirement of Lorentz invariance, at least at the fundamental level, preserving at best Lorentz invariance of phenomena. In particular, it is often said that the collapse postulate is incompatible with the demands of relativity. It is the purpose of this paper to argue that such a conclusion is premature, and to defend the view that a covariant account of collapse can be given according to which the state histories yielded by different reference frames are the Lorentz transforms of each other. Objections that have been raised to such a view are considered.
Once an experiment is done, the observations have been made and the data have been analyzed, what should scientists communicate to the world at large, and how should they do it? This, I will argue, is an intricate question, and one that philosophers can make a contribution to. I will illustrate these points by reference to the debate between Fisher and Neyman & Pearson in the 1950s, which I take to be, at heart, a debate about norms of scientific communication. I will argue that scientists need a richer set of tools for communicating epistemic states that may be very nuanced, and will point to ways in which philosophers can contribute.
The hidden-variables model constructed by Karl Hess and Walter Philipp is claimed by its authors to exploit a "loophole" in Bell's theorem; according to Hess and Philipp, the parameters employed in their model extend beyond those considered by Bell. Furthermore, they claim that their model satisfies Einstein locality and is free of any "suspicion of spooky action at a distance." Both of these claims are false; the Hess-Philipp model achieves agreement with the quantum-mechanical predictions, not by circumventing Bell's theorem, but via Parameter Dependence.
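In the standard factorization framework for Bell-type experiments (with hidden state $\lambda$, local settings $a, b$, and outcomes $x, y$), Parameter Independence is the condition

$$P_{\lambda}(x \mid a, b) \;=\; P_{\lambda}(x \mid a), \qquad P_{\lambda}(y \mid a, b) \;=\; P_{\lambda}(y \mid b).$$

Parameter Dependence is the failure of this condition: the distant setting itself, not merely the distant outcome, makes a difference to local outcome probabilities, which is the feature attributed here to the Hess-Philipp model.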
A shift in focus, of the sort recently advocated by David Wallace, towards consideration of work in nonequilibrium statistical mechanics has the potential for far-reaching consequences in the way we think about the foundations of statistical mechanics. In particular, consideration of the approach to equilibrium helps to pick out appropriate equilibrium measures, measures that are picked out by the dynamics as "natural" measures for systems in equilibrium. Consideration of the rationale for using such measures reveals that the scope of their legitimate employment is much more limited than an appeal to a Principle of Indifference would suggest. These points are illustrated by use of a toy model that I call the parabola gadget.
The ambition of this book is a noble one: to provide a counter to the assumption, taken for granted by many postmodernists, that quantum mechanics lends support to the view that scientific realism is nothing more than an outmoded fad. It is especially gratifying that this book comes from a literary theorist, author of a well-respected book on Derrida (Norris, 1987), who, by his own admission, has “previously published several books on literary theory that might be construed … as going along with the emergent trend towards anti-realism and cultural relativism in various quarters of ‘advanced’ theoretical debate” (Introduction, p. 1). One wishes, however, that Norris had taken more time to familiarize himself with the issues that he writes about, and that he had taken more care in constructing his arguments. Although “there will be more joy in heaven over one sinner who repents than over ninety-nine righteous persons who need no repentance” (Luke 15:7, RSV), we should not let jubilation blind us to the book’s shortcomings. Among these is a lack of clarity in its central notion, that of realism. Early on, Norris quotes with approval William Alston’s characterization of the alethic conception of realism, which is the conception advocated by Norris; the alethic conception “implies that (almost always) what confers a truth-value on a statement is something independent of the cognitive-linguistic goings-on that issued in that statement, including any epistemic status of those goings-on” (p. 41). As the book progresses, however, additional conditions on what is to count as a realist interpretation of quantum mechanics emerge. Realism apparently becomes synonymous with “causal-explanatory” theories, and in one passage, Norris goes so far as to suggest that realism entails a commitment to synthetic a priori knowledge of the physical world: Bell’s calculations and those applied in interpreting the Aspect results are themselves dependent—no less than EPR—on a range of distinctly “classical” assumptions, among them the existence of a physical object-domain which, however puzzling its details, permits such experiments to be carried out and conclusions to be drawn from them.
The Bell–Kochen–Specker theorem shows that, in any Hilbert space of dimension at least 3, it is impossible to assign noncontextual definite values to all observables in such a way that the quantum-mechanical predictions are reproduced. This leaves open the issue of what subsets of observables may be assigned definite values. Clifton has shown that, for a system of at least two continuous degrees of freedom, it is not possible to assign simultaneous noncontextual values to two coordinates and their conjugate momenta. In this Letter, it is shown that, for a system of a single continuous degree of freedom, it is not possible to assign noncontextual values to the coordinate and its conjugate momentum that satisfy a continuity assumption herein called the ‘ε-Product Rule’.