In this paper, I critically assess interpretations of Bohmian mechanics that are not committed to an ontology on which the wave function is an actual physical object inhabiting configuration space. More specifically, my aim is to explore the connection between the denial of configuration space realism and another interpretive debate that is specific to Bohmian mechanics: the quantum potential versus guidance approaches. Whereas defenders of the quantum potential approach claim that Bohmian mechanics is better formulated as quasi-Newtonian, via the postulation of forces proportional to acceleration, advocates of the guidance approach hold that the theory is essentially first-order and incorporates some concepts akin to those of Aristotelian physics. Here I analyze whether the desideratum of an interpretation of Bohmian mechanics that is both explanatorily adequate and not committed to configuration space realism favors one of these two approaches over the other. Contrary to some recent claims in the literature, I argue that the quasi-Newtonian approach based on the idea of a quantum potential does not come out the winner.
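For reference, the contrast at issue can be made explicit in standard notation (a conventional sketch, not notation drawn from the paper itself). On the guidance approach, the first-order velocity law is fundamental; on the quantum potential approach, the dynamics is recast in second-order, quasi-Newtonian form:

\[
\frac{dQ_k}{dt} = \frac{\hbar}{m_k}\,\operatorname{Im}\!\left(\frac{\nabla_k \psi}{\psi}\right)(Q_1,\dots,Q_N)
\qquad \text{(guidance equation, first-order)}
\]
\[
m_k \ddot{Q}_k = -\nabla_k\,(V + U), \qquad
U = -\sum_j \frac{\hbar^2}{2m_j}\,\frac{\nabla_j^2 |\psi|}{|\psi|}
\qquad \text{(quantum potential form, second-order)}
\]

Here U is the quantum potential; the two formulations trace out the same trajectories once the guidance equation is imposed as a constraint on the initial velocities.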
In a quantum universe with a strong arrow of time, we postulate a low-entropy boundary condition to account for the temporal asymmetry. In this paper, I show that the Past Hypothesis also contains enough information to simplify the quantum ontology and define a unique initial condition in such a world. First, I introduce Density Matrix Realism, the thesis that the quantum universe is described by a fundamental density matrix that represents something objective. This stands in sharp contrast to Wave Function Realism, the thesis that the quantum universe is described by a wave function that represents something objective. Second, I suggest that the Past Hypothesis is sufficient to determine a unique and simple density matrix. This is achieved by what I call the Initial Projection Hypothesis: the initial density matrix of the universe is the normalized projection onto the special low-dimensional Hilbert space. Third, because the initial quantum state is unique and simple, we have a strong case for the Nomological Thesis: the initial quantum state of the universe is on a par with laws of nature. This new package of ideas has several interesting implications, including for the harmony between statistical mechanics and quantum mechanics, the dynamic unity of the universe and its subsystems, and the alleged conflict between Humean supervenience and quantum entanglement.
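A minimal rendering of the Initial Projection Hypothesis (the symbols below are a gloss, not notation fixed by the paper): if the Past Hypothesis picks out a special low-dimensional subspace \(\mathcal{H}_{PH}\) of the total Hilbert space, the unique initial state is the normalized projection onto it,

\[
\hat{\rho}(t_0) \;=\; \frac{I_{PH}}{\dim \mathcal{H}_{PH}},
\]

where \(I_{PH}\) is the projection operator onto \(\mathcal{H}_{PH}\). Uniqueness and simplicity then come together: the state is fixed by the subspace alone, with no further choice of a particular wave function within it.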
One can (for the most part) formulate a model of a classical system in either the Lagrangian or the Hamiltonian framework. Though it is often thought that those two formulations are equivalent in all important ways, this is not true: the underlying geometrical structures one uses to formulate each theory are not isomorphic. This raises the question of whether one of the two is a more natural framework for the representation of classical systems. In the event, the answer is yes: I state and sketch proofs of two technical results—inspired by simple physical arguments about the generic properties of classical systems—to the effect that, in a precise sense, classical systems evince exactly the geometric structure Lagrangian mechanics provides for the representation of systems, and none of the structure Hamiltonian mechanics provides. The argument not only clarifies the conceptual structure of the two systems of mechanics, but also their relations to each other and their respective mechanisms for representing physical systems. It also shows why naïvely structural approaches to the representational content of physical theories cannot work. "[Lagrange] grasped that he had gained a method of stating dynamical truths in a way, which is perfectly indifferent to the particular methods of measurement employed in fixing the positions of the various parts of the system. Accordingly, he went on to deduce equations of motion, which are equally applicable whatever quantitative measurements have been made, provided that they are adequate to fix positions. The beauty and almost divine simplicity of these equations is such that these formulae are worthy to rank with those mysterious symbols which in ancient times were held directly to indicate the Supreme Reason at the base of all things." (Whitehead [1948], p. 63) Contents: 1. Introduction; 2. Classical Systems; 3. The Possible Interactions of a Classical System and the Structure of Its Space of States; 4. Classical Systems Are Lagrangian; 5. Classical Systems Are Not Hamiltonian; 6. How Lagrangian and Hamiltonian Mechanics Represent Classical Systems; 7. The Conceptual Structure of Classical Mechanics.
THE PRINCIPLE OF SUPERPOSITION. The need for a quantum theory. Classical mechanics has been developed continuously from the time of Newton and applied to an ...
Bohmian mechanics, which is also called the de Broglie-Bohm theory, the pilot-wave model, and the causal interpretation of quantum mechanics, is a version of quantum theory discovered by Louis de Broglie in 1927 and rediscovered by David Bohm in 1952. It is the simplest example of what is often called a hidden variables interpretation of quantum mechanics. In Bohmian mechanics a system of particles is described in part by its wave function, evolving, as usual, according to Schrödinger's equation. However, the wave function provides only a partial description of the system. This description is completed by the specification of the actual positions of the particles. The latter evolve according to the..
Bohmian mechanics is a theory about point particles moving along trajectories. It has the property that in a world governed by Bohmian mechanics, observers see the same statistics for experimental results as predicted by quantum mechanics. Bohmian mechanics thus provides an explanation of quantum mechanics. Moreover, the Bohmian trajectories are defined in a non-conspiratorial way by a few simple laws.
The paper addresses the question of whether quantum mechanics (QM) favors Priority Monism, the view according to which the Universe is the only fundamental object. It develops formal frameworks to frame rigorously the question of fundamental mereology and its answers, namely (Priority) Pluralism and Monism. It then reconstructs the quantum mechanical argument in favor of the latter and provides a detailed and thorough criticism of it that, furthermore, sheds new light on the relation between parthood, composition, and fundamentality in QM.
What ontology does realism about the quantum state suggest? The main extant view in contemporary philosophy of physics is wave-function realism. We elaborate the sense in which wave-function realism does provide an ontological picture, and defend it from certain objections that have been raised against it. However, there are good reasons to be dissatisfied with wave-function realism, as we go on to elaborate. This motivates the development of an opposing picture: what we call spacetime state realism, a view which takes the states associated with spacetime regions as fundamental. This approach enjoys a number of beneficial features, although, unlike wave-function realism, it involves non-separability at the level of fundamental ontology. We investigate the pros and cons of this non-separability, arguing that it is a quite acceptable feature, even one which proves fruitful in the context of relativistic covariance. A companion paper discusses the prospects for combining a spacetime-based ontology with separability, along lines suggested by Deutsch and Hayden.
I argue that quantum mechanics is fundamentally a theory about the representation and manipulation of information, not a theory about the mechanics of nonclassical waves or particles. The notion of quantum information is to be understood as a new physical primitive—just as, following Einstein’s special theory of relativity, a field is no longer regarded as the physical manifestation of vibrations in a mechanical medium, but recognized as a new physical primitive in its own right.
In quantum mechanics it is usually assumed that mutually exclusive states of affairs must be represented by orthogonal vectors. Recent attempts to solve the measurement problem, most notably the GRW theory, require the relaxation of this assumption. It is shown that a consequence of relaxing this assumption is that arithmetic does not apply to ordinary macroscopic objects. It is argued that such a radical move is unwarranted given the current state of understanding of the foundations of quantum mechanics.
I maintain that quantum mechanics is fundamentally about a system of N particles evolving in three-dimensional space, not the wave function evolving in 3N-dimensional space.
After introducing the empiricist point of view in philosophy of science, and the concepts and methods of the semantic approach to scientific theories, van Fraassen discusses quantum theory in three stages. He first examines the question of whether and how empirical phenomena require a non-classical theory, and what sort of theory they require. He then discusses the mathematical foundations of quantum theory with special reference to developments in the modelling of interaction, composite systems, and measurement. Finally, the author broaches the main questions of interpretation. After offering a critique of earlier interpretations, he develops a new one, the modal interpretation, which attempts to stay close to the original Copenhagen ideas without implying a radical incompleteness in quantum theory. He again gives special attention to the character of composite, many-body systems and especially to the peculiar character of assemblies of identical particles in quantum statistics.
This book explores the prospects of rival ontological and epistemic interpretations of quantum mechanics (QM). It concludes with a suggestion for how to interpret QM from an epistemological point of view and with a Kantian touch. It thus refines, extends, and combines existing approaches in a similar direction. The author first looks at current, hotly debated ontological interpretations. These include hidden-variables approaches, Bohmian mechanics, collapse interpretations, and the many-worlds interpretation. He demonstrates why none of these ontological interpretations can claim to be the clear winner amongst its rivals. Next, coverage explores the possibility of interpreting QM in terms of knowledge but without the assumption of hidden variables. It examines QBism as well as Healey's pragmatist view. The author finds both interpretations or programs wanting in certain respects. As a result, he then goes on to advance a genuine proposal as to how to interpret QM from the perspective of an internal realism in the sense of Putnam and Kant. The book also includes two philosophical interludes. One details the notions of probability and realism. The other highlights the connections between the notions of locality, causality, and reality in the context of violations of Bell-type inequalities.
Here I explore a novel no-collapse interpretation of quantum mechanics that combines aspects of two familiar and well-developed alternatives, Bohmian mechanics and the many-worlds interpretation. Despite reproducing the empirical predictions of quantum mechanics, the theory looks surprisingly classical. All there is at the fundamental level are particles interacting via Newtonian forces. There is no wave function. However, there are many worlds.
A common understanding of quantum mechanics (QM) among students and practical users is often plagued by a number of "myths", that is, widely accepted claims on which there is not really a general consensus among experts in foundations of QM. These myths include wave-particle duality, time-energy uncertainty relation, fundamental randomness, the absence of measurement-independent reality, locality of QM, nonlocality of QM, the existence of well-defined relativistic QM, the claims that quantum field theory (QFT) solves the problems of relativistic QM or that QFT is a theory of particles, as well as myths on black-hole entropy. The fact is that the existence of various theoretical and interpretational ambiguities underlying these myths does not yet allow us to accept them as proven facts. I review the main arguments and counterarguments lying behind these myths and conclude that QM is still a not-yet-completely-understood theory open to further fundamental research.
In this book, which contains several of his key papers as well as new material, he focuses on the problem of consciousness and explains how quantum mechanics...
This paper shows how the classical finite probability theory (with equiprobable outcomes) can be reinterpreted and recast as the quantum probability calculus of a pedagogical or toy model of quantum mechanics over sets (QM/sets). There have been several previous attempts to develop a quantum-like model with the base field of ℂ replaced by ℤ₂. Since there are no inner products on vector spaces over finite fields, the problem is to define the Dirac brackets and the probability calculus. The previous attempts all required the brackets to take values in ℤ₂. But the usual QM brackets <ψ|ϕ> give the "overlap" between states ψ and ϕ, so for subsets S,T⊆U, the natural definition is <S|T>=|S∩T| (taking values in the natural numbers). This allows QM/sets to be developed with a full probability calculus that turns out to be a non-commutative extension of classical Laplace-Boole finite probability theory. The pedagogical model is illustrated by giving simple treatments of the indeterminacy principle, the double-slit experiment, Bell's Theorem, and identical particles in QM/Sets. A more technical appendix explains the mathematics behind carrying some vector space structures between QM over ℂ and QM/Sets over ℤ₂.
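The bracket and the resulting probability calculus are simple enough to compute directly. Below is a minimal sketch in Python (the function names and the example universe are purely illustrative, not taken from the paper); it implements the set-theoretic bracket <S|T> = |S∩T| and the induced probabilities for a state S ⊆ U measured in the U-basis.

```python
# A toy rendering of the QM/sets probability calculus sketched above.
# The universe set U plays the role of a basis; "states" are subsets of U.

def bracket(S, T):
    """The set-theoretic bracket <S|T> = |S ∩ T|, a natural number."""
    return len(S & T)

def probability(B, S):
    """Pr(B | S) = |B ∩ S| / |S|: the probability that measuring the
    state S in the U-basis yields an outcome lying in the event B ⊆ U."""
    return bracket(B, S) / bracket(S, S)  # note <S|S> = |S|

U = {0, 1, 2, 3, 4, 5}
S = {0, 2, 4}  # a state: three equiprobable outcomes

print(probability({0}, S))     # 1/3: the point outcome 0
print(probability({0, 1}, S))  # still 1/3: only 0 lies in S
print(probability(U, S))       # 1.0: normalization
```

The non-commutative character the paper describes enters only when one measures with respect to a different basis of the vector space over ℤ₂; the sketch above covers just the U-basis case.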
Statistical mechanics is a strange theory. Its aims are debated, its methods are contested, its main claims have never been fully proven, and their very truth is challenged, yet at the same time, it enjoys huge empirical success and gives us the feeling that we understand important phenomena. What is this weird theory, exactly? Statistical mechanics is the name of the ongoing attempt to apply mechanics, together with some auxiliary hypotheses, to explain and predict certain phenomena, above all those described by thermodynamics. This paper shows what parts of this objective can be achieved with mechanics by itself. It thus clarifies what roles remain for the auxiliary assumptions that are needed to achieve the rest of the desiderata. Those auxiliary hypotheses are described in another paper in this journal, Foundations of statistical mechanics: The auxiliary hypotheses.
To the best of our current understanding, quantum mechanics is part of the most fundamental picture of the universe. It is natural to ask how pure and minimal this fundamental quantum description can be. The simplest quantum ontology is that of the Everett or Many-Worlds interpretation, based on a vector in Hilbert space and a Hamiltonian. Typically one also relies on some classical structure, such as space and local configuration variables within it, which then gets promoted to an algebra of preferred observables. We argue that even such an algebra is unnecessary, and the most basic description of the world is given by the spectrum of the Hamiltonian and the components of some particular vector in Hilbert space. Everything else—including space and fields propagating on it—is emergent from these minimal elements.
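To see why the spectrum plus one vector can suffice (a standard textbook fact, not a claim specific to this paper's argument): writing the Hamiltonian's eigenbasis as H|n⟩ = E_n|n⟩, any state evolves as

\[
|\psi(t)\rangle \;=\; \sum_n c_n\, e^{-iE_n t/\hbar}\, |n\rangle,
\]

so the eigenvalue list {E_n} together with the components {c_n} of the initial vector fixes the entire history of the quantum state; the authors' claim is that all further structure, including space itself, must be read off from these data alone.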
In Process and Reality and other works, Alfred North Whitehead struggled to come to terms with the impact the new science of quantum mechanics would have on metaphysics. This ambitious book is the first extended analysis of the intricate relationships between relativity theory, quantum mechanics, and Whitehead's cosmology. Michael Epperson illuminates the intersection of science and philosophy in Whitehead's work, and details Whitehead's attempts to fashion an ontology coherent with quantum anomalies. Including a nonspecialist introduction to quantum mechanics, Epperson adds an essential new dimension to our understanding of Whitehead, and of the constantly enriching encounter between science and philosophy in our century.
We develop and defend the thesis that the Hilbert space formalism of quantum mechanics is a new theory of probability. The theory, like its classical counterpart, consists of an algebra of events and the probability measures defined on it. The construction proceeds in the following steps: (a) Axioms for the algebra of events are introduced following Birkhoff and von Neumann. All axioms, except the one that expresses the uncertainty principle, are shared with the classical event space. The only models for the set of axioms are lattices of subspaces of inner product spaces over a field K. (b) Another axiom, due to Soler, forces K to be the field of real or complex numbers, or the quaternions. We suggest a probabilistic reading of Soler's axiom. (c) Gleason's theorem fully characterizes the probability measures on the algebra of events, so that Born's rule is derived. (d) Gleason's theorem is equivalent to the existence of a certain finite set of rays with a particular orthogonality graph (Wondergraph). Consequently, all aspects of quantum probability can be derived from rational probability assignments to finite "quantum gambles". (e) All experimental aspects of entanglement, the violation of Bell's inequality in particular, are explained as natural outcomes of the probabilistic structure. (f) We hypothesize that, even in the absence of decoherence, macroscopic entanglement can very rarely be observed, and provide a precise conjecture to that effect. We also discuss the relation of the present approach to quantum logic, realism and truth, and the measurement problem.
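For step (c), the content of Gleason's theorem can be stated compactly (in standard notation, not the paper's own): for a separable Hilbert space H with dim H ≥ 3, every countably additive probability measure μ on the lattice of closed subspaces (equivalently, on the projections P) has the form

\[
\mu(P) \;=\; \operatorname{Tr}(\rho P)
\]

for a unique density operator ρ. Born's rule is the special case ρ = |ψ⟩⟨ψ|, for which μ(P) = ⟨ψ|P|ψ⟩.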
Statistical mechanics is the name of the ongoing attempt to explain and predict certain phenomena, above all those described by thermodynamics, on the basis of the fundamental theories of physics, in particular mechanics, together with certain auxiliary assumptions. In another paper in this journal, Foundations of statistical mechanics: Mechanics by itself, I have shown that some of the thermodynamic regularities, including the probabilistic ones, can be described in terms of mechanics by itself. But in order to prove those regularities, in particular the time-asymmetric ones, it is necessary to add to mechanics assumptions of three kinds, all of which are under debate in the contemporary literature. One kind of assumption concerns measure and probability, and here a major debate is around the notion of "typicality." A second assumption concerns initial conditions, and here the debate is about the nature and status of the so-called past hypothesis. The third kind of assumption concerns the dynamics, and here the possibility and significance of "Maxwell's Demon" is the main topic of discussion. This article describes these assumptions and examines the justification for introducing them, emphasizing the contemporary debates around them.
One finds, in Maxwell's writings on thermodynamics and statistical physics, a conception of the nature of these subjects that differs in interesting ways from the way that they are usually conceived. In particular, though—in agreement with the currently accepted view—Maxwell maintains that the second law of thermodynamics, as originally conceived, cannot be strictly true, the replacement he proposes is different from the version accepted by most physicists today. The modification of the second law accepted by most physicists is a probabilistic one: although statistical fluctuations will result in occasional spontaneous differences in temperature or pressure, there is no way to predictably and reliably harness these to produce large violations of the original version of the second law. Maxwell advocates a version of the second law that is strictly weaker; the validity of even this probabilistic version is of limited scope, limited to situations in which we are dealing with large numbers of molecules en masse and have no ability to manipulate individual molecules. Connected with this is his understanding of the thermodynamic concepts of heat, work, and entropy: on the Maxwellian view, these are concepts that must be relativized to the means we have available for gathering information about and manipulating physical systems. The Maxwellian view is one that deserves serious consideration in discussions of the foundations of statistical mechanics. It has relevance for the project of recovering thermodynamics from statistical mechanics because, in such a project, it matters which version of the second law we are trying to recover.
The paper points out that the modern formulation of Bohm's quantum theory, known as Bohmian mechanics, is committed only to particles' positions and a law of motion. We explain how this view can avoid the open questions faced by the traditional view, according to which Bohm's theory is committed to a wave function that is a physical entity over and above the particles, although it is defined on configuration space instead of three-dimensional space. We then enquire into the status of the law of motion, elaborating on how the main philosophical options for grounding a law of motion, namely Humeanism and dispositionalism, can be applied to Bohmian mechanics. In conclusion, we sketch out how these options apply to primitive ontology approaches to quantum mechanics in general.
We show that a new interpretation of quantum mechanics, in which the notion of event is defined without reference to measurement or observers, allows us to construct a quantum general ontology based on systems, states, and events. Unlike the Copenhagen interpretation, it does not resort to elements of a classical ontology. The quantum ontology in turn allows us to recognize that a typical behavior of quantum systems exhibits strong emergence and ontological non-reducibility. Such phenomena are not exceptional but natural, and are rooted in the basic mathematical structure of quantum mechanics.
In "Counterfactual Dependence and Time's Arrow", David Lewis defends an analysis of counterfactuals intended to yield the asymmetry of counterfactual dependence: that later affairs depend counterfactually on earlier ones, and not the other way around. I argue that careful attention to the dynamical properties of thermodynamically irreversible processes shows that in many ordinary cases, Lewis's analysis fails to yield this asymmetry. Furthermore, the analysis fails in an instructive way: it teaches us something about the connection between the asymmetry of overdetermination (...) and the asymmetry of entropy. (shrink)
Written by an internationally renowned philosopher, this volume offers a three-part philosophical interpretation of quantum physics. The first part reviews the basics of quantum mechanics, outlining their philosophical interpretation and summarizing their results; the second outlines the mathematical methods of quantum mechanics; and the third section blends the philosophical ideas of the first part and the mathematical formulations of the second part to develop a variety of interpretations of quantum mechanics. 1944 edition.
This chapter will review selected aspects of the terrain of discussions about probabilities in statistical mechanics (with no pretensions to exhaustiveness, though the major issues will be touched upon), and will argue for a number of claims. None of the claims to be defended is entirely original, but all deserve emphasis. The first, and least controversial, is that probabilistic notions are needed to make sense of statistical mechanics. The reason for this is the same reason that convinced Maxwell, Gibbs, and Boltzmann that probabilities would be needed, namely, that the second law of thermodynamics, which in its original formulation says that certain processes are impossible, must, on the kinetic theory, be replaced by a weaker formulation according to which what the original version deems impossible is merely improbable. Second is that we ought not take the standard measures invoked in equilibrium statistical mechanics as giving, in any sense, the correct probabilities about microstates of the system. We can settle for a much weaker claim: that the probabilities for outcomes of experiments yielded by the standard distributions are effectively the same as those yielded by any distribution that we should take as representing probabilities over microstates. Lastly (and most controversially): in asking about the status of probabilities in statistical mechanics, the familiar dichotomy between epistemic probabilities (credences, or degrees of belief) and ontic (physical) probabilities is insufficient; the concept of probability that is best suited to the needs of statistical mechanics is one that combines epistemic and physical considerations.
We expound an alternative to the Copenhagen interpretation of the formalism of nonrelativistic quantum mechanics. The basic difference is that the new interpretation is formulated in the language of epistemological realism. It involves a change in some basic physical concepts. The ψ function is no longer interpreted as a probability amplitude of the observed behaviour of elementary particles but as an objective physical field representing the particles themselves. The particles are thus extended objects whose extension varies in time according to the variation of ψ. They are considered as fundamental regions of space with some kind of nonlocality. Special consideration is given to the Heisenberg relations, the Einstein-Podolsky-Rosen correlations, the reduction process, the problem of measurement, and the quantum-statistical distributions.
Experimental evidence of the last decades has made the status of "collapses of the wave function" even more shaky than it already was on conceptual grounds: interference effects turn out to be detectable even when collapses are typically expected to occur. Non-collapse interpretations should consequently be taken seriously. In this paper we argue that such interpretations suggest a perspectivalism according to which quantum objects are not characterized by monadic properties, but by relations to other systems. Accordingly, physical systems may possess different properties with respect to different "reference systems". We discuss some of the relevant arguments, and argue that perspectivalism both evades recent arguments that single-world interpretations are inconsistent and eliminates the need for a privileged rest frame in the relativistic case.
The present paper reveals (non-relativistic) quantum mechanics as an emergent property of otherwise classical ergodic systems embedded in a stochastic vacuum or zero-point radiation field (zpf). This result provides a theoretical basis for understanding recent numerical experiments in which a statistical analysis of an atomic electron interacting with the zpf furnishes the quantum distribution for the ground state of the H atom. The action of the zpf on matter is essential within the present approach, but it is the ergodic demand that ultimately leads to the matrix formulation of quantum mechanics. The paper thus represents a step forward in the quest for an elucidation of the fundamentals of quantum mechanics.
It has been argued that the transition from classical to quantum mechanics is an example of a Kuhnian scientific revolution, in which there is a shift from the simple, intuitive, straightforward classical paradigm to the convoluted, counterintuitive, amazing new quantum paradigm. In this paper, after having clarified what these quantum paradigms are supposed to be, I analyze whether they constitute a radical departure from the classical paradigm. Contrary to what is commonly maintained, I argue that, in addition to radical quantum paradigms, there are also legitimate ways of understanding the quantum world that do not require any substantial change to the classical paradigm.
Quantum mechanics is derived from the principle that the universe contain as much variety as possible, in the sense of maximizing the distinctiveness of each subsystem. The quantum state of a microscopic system is defined to correspond to an ensemble of subsystems of the universe with identical constituents and similar preparations and environments. A new kind of interaction is posited amongst such similar subsystems, which acts to increase their distinctiveness by extremizing the variety. In the limit of large numbers of similar subsystems, this interaction is shown to give rise to Bohm's quantum potential. As a result the probability distribution for the ensemble is governed by the Schrödinger equation. The measurement problem is naturally and simply solved. Microscopic systems appear statistical because they are members of large ensembles of similar systems which interact non-locally. Macroscopic systems are unique, and are not members of any ensembles of similar systems. Consequently their collective coordinates may evolve deterministically. This proposal could be tested by constructing quantum devices from entangled states of a modest number of qubits which, by their combinatorial complexity, can be expected to have no natural copies.
The aim of this paper is to consider in what sense the modal-Hamiltonian interpretation of quantum mechanics satisfies the physical constraints imposed by the Galilean group. In particular, we show that the only apparent conflict, which follows from boost-transformations, can be overcome when the definition of quantum systems and subsystems is taken into account. On this basis, we apply the interpretation to different well-known models, in order to obtain concrete examples of the previous conceptual conclusions. Finally, we consider the role played by the Casimir operators of the Galilean group in the interpretation.
There has been recent interest in formulating theories of non-representational indeterminacy. The aim of this paper is to clarify the relevance of quantum mechanics to this project. Quantum-mechanical examples of vague objects have been offered by various authors, displaying indeterminate identity, in the face of the famous Evans argument that such an idea is incoherent. It has also been suggested that the quantum-mechanical treatment of state-dependent properties exhibits metaphysical indeterminacy. In both cases it is important to consider the details of the metaphysical account and the way in which the quantum phenomenon is captured within it. Indeed if we adopt a familiar way of thinking about indeterminacy and apply it in a natural way to quantum mechanics, we run into illuminating difficulties and see that the case is far less straightforward than might be hoped.
First steps are taken toward a formulation of quantum mechanics which avoids the use of probability amplitudes and is expressed entirely in terms of observable probabilities. Quantum states are represented not by state vectors or density matrices but by "probability tables," which contain only the probabilities of the outcomes of certain special measurements. The rule for computing transition probabilities, normally given by the squared modulus of the inner product of two state vectors, is re-expressed in terms of probability tables. The new version of the rule is surprisingly simple, especially when one considers that the notion of complex phases, so crucial in the evaluation of inner products, is entirely absent from the representation of states used here.
It is widely held that quantum mechanics is the first scientific theory to present scientifically internal, fundamental difficulties for a realistic interpretation (in the philosophical sense). The standard (Copenhagen) interpretation of the quantum theory is often described as the inevitable instrumentalistic response. It is the purpose of the present article to argue that quantum theory does not present fundamental new problems to a realistic interpretation. The formalism of quantum theory has the same status—it will be argued—as the formalisms of older physical theories and is capable of the same kinds of philosophical interpretation. This result is reached via an analysis of what it means to give a realistic interpretation to a theory. The main point of difference between quantum mechanics and other theories—as far as the possibilities of interpretation are concerned—is the special treatment given to measurement by the "projection postulate." But it is possible to do without this postulate. Moreover, rejection of the projection postulate does not, in spite of what is often maintained in the literature, automatically lead to the many-worlds interpretation of quantum mechanics. A realistic interpretation is possible in which only the reality of one (our) world is recognized. It is argued that the Copenhagen interpretation as expounded by Bohr is not in conflict with the realistic interpretation of quantum theory proposed here.
Statistical mechanics is one of the crucial fundamental theories of physics, and in his new book Lawrence Sklar, one of the pre-eminent philosophers of physics, offers a comprehensive, non-technical introduction to that theory and to attempts to understand its foundational elements. Among the topics treated in detail are: probability and statistical explanation, the basic issues in both equilibrium and non-equilibrium statistical mechanics, the role of cosmology, the reduction of thermodynamics to statistical mechanics, and the alleged foundation of the very notion of time asymmetry in the entropic asymmetry of systems in time. The book emphasises the interaction of scientific and philosophical modes of reasoning, and in this way will interest all philosophers of science as well as those in physics and chemistry concerned with philosophical questions. The book could also be read by an informed general reader interested in the foundations of modern science.
David Lewis is a natural target for those who believe that findings in quantum physics threaten the tenability of traditional metaphysical reductionism. Such philosophers point to allegedly holistic entities they take both to be the subjects of some claims of quantum mechanics and to be incompatible with Lewisian metaphysics. According to one popular argument, the non-separability argument from quantum entanglement, any realist interpretation of quantum theory is straightforwardly inconsistent with the reductive conviction that the complete physical state of the world supervenes on the intrinsic properties of and spatio-temporal relations between its point-sized constituents. Here I defend Lewis's metaphysical doctrine, and traditional reductionism more generally, against this alleged threat from quantum holism. After presenting the non-separability argument from entanglement, I show that Bohmian mechanics, an interpretation of quantum mechanics explicitly recognized as a realist one by proponents of the non-separability argument, plausibly rejects a key premise of that argument. Another holistic worry for Humeanism persists, however, the trouble being the apparently holistic character of the Bohmian pilot wave. I present a Humean strategy for addressing the holistic threat from the pilot wave by drawing on resources from the Humean best system account of laws.
If the block universe view is correct, the future and the past have similar status and one would expect physical theories to involve final as well as initial boundary conditions. A plausible consistency condition between the initial and final boundary conditions in non-relativistic quantum mechanics leads to the idea that the properties of macroscopic quantum systems, most relevantly measuring instruments, are uniquely determined by the boundary conditions. An important element in reaching that conclusion is that preparations and measurements belong in a special class because they involve many subsystems, at least some of which do not form superpositions of their physical properties before the boundary conditions are imposed. It is suggested that the primary role of the formalism of standard quantum mechanics is to provide the consistency condition on the boundary conditions rather than the properties of quantum systems. Expressions are proposed for assigning a set of (unmeasured) physical properties to a quantum system at all times. The physical properties avoid the logical inconsistencies implied by the no-go theorems because they are assigned differently from standard quantum mechanics. Since measurement outcomes are determined by the boundary conditions, they help determine, rather than are determined by, the physical properties of quantum systems.
Quantum mechanics is, at least at first glance and at least in part, a mathematical machine for predicting the behaviors of microscopic particles — or, at least, of the measuring instruments we use to explore those behaviors — and in that capacity, it is spectacularly successful: in terms of power and precision, head and shoulders above any theory we have ever had. Mathematically, the theory is well understood; we know what its parts are, how they are put together, and why, in the mechanical sense (i.e., in a sense that can be answered by describing the internal grinding of gear against gear), the whole thing performs the way it does, how the information that gets fed in at one end is converted into what comes out the other. The question of what kind of a world it describes, however, is controversial; there is very little agreement, among physicists and among philosophers, about what the world is like according to quantum mechanics. Minimally interpreted, the theory describes a set of facts about the way the microscopic world impinges on the macroscopic one, how it affects our measuring instruments, described in everyday language or the language of classical mechanics. Disagreement centers on the question of what a microscopic world, which affects our apparatuses in the prescribed manner, is, or even could be, like intrinsically; or how those apparatuses could themselves be built out of microscopic parts of the sort the theory describes...
A variety of ideas arising in decoherence theory, and in the ongoing debate over Everett's relative-state theory, can be linked to issues in relativity theory and the philosophy of time, specifically the relational theory of tense and of identity over time. These have been systematically presented in companion papers (Saunders 1995; 1996a); in what follows we shall consider the same circle of ideas, but specifically in relation to the interpretation of probability, and its identification with relations in the Hilbert Space norm. The familiar objection that Everett's approach yields probabilities different from quantum mechanics is easily dealt with. The more fundamental question is how to interpret these probabilities consistent with the relational theory of change, and the relational theory of identity over time. I shall show that the relational theory needs nothing more than the physical, minimal criterion of identity as defined by Everett's theory, and that this can be transparently interpreted in terms of the ordinary notion of the chance occurrence of an event, as witnessed in the present. It is in this sense that the theory has empirical content.