Bohmian mechanics and the Ghirardi-Rimini-Weber theory provide opposite resolutions of the quantum measurement problem: the former postulates additional variables (the particle positions) besides the wave function, whereas the latter implements spontaneous collapses of the wave function by a nonlinear and stochastic modification of Schrödinger's equation. Still, both theories, when understood appropriately, share the following structure: They are ultimately not about wave functions but about 'matter' moving in space, represented by either particle trajectories, fields on space-time, or a discrete set of space-time points. The role of the wave function then is to govern the motion of the matter.
A major disagreement between different views about the foundations of quantum mechanics concerns whether for a theory to be intelligible as a fundamental physical theory it must involve a ‘primitive ontology’ (PO), i.e. variables describing the distribution of matter in four-dimensional space–time. In this article, we illustrate the value of having a PO. We do so by focusing on the role that the PO plays for extracting predictions from a given theory and discuss valid and invalid derivations of predictions. To this end, we investigate a number of examples based on toy models built from the elements of familiar interpretations of quantum theory.
Schrödinger’s first proposal for the interpretation of quantum mechanics was based on a postulate relating the wave function on configuration space to charge density in physical space. Schrödinger apparently later thought that his proposal was empirically wrong. We argue here that this is not the case, at least for a very similar proposal with charge density replaced by mass density. We argue that when analyzed carefully, this theory is seen to be an empirically adequate many-worlds theory and not an empirically inadequate theory describing a single world. Moreover, this formulation—Schrödinger’s first quantum theory—can be regarded as a formulation of the many-worlds view of quantum mechanics that is ontologically clearer than Everett’s.
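For concreteness, the mass-density ontology alluded to here has a standard form (the usual formula from this literature, stated for reference rather than quoted from the abstract): for N particles with masses m_1, ..., m_N and wave function ψ on configuration space,

\[ m(x,t) \;=\; \sum_{i=1}^{N} m_i \int d^3q_1 \cdots d^3q_N \;\delta^3(x - q_i)\, \bigl|\psi(q_1,\ldots,q_N,t)\bigr|^2 , \]

i.e., the matter density in physical space is the marginal of |ψ|², weighted by mass, over each particle's position.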
Bohmian mechanics is a theory about point particles moving along trajectories. It has the property that in a world governed by Bohmian mechanics, observers see the same statistics for experimental results as predicted by quantum mechanics. Bohmian mechanics thus provides an explanation of quantum mechanics. Moreover, the Bohmian trajectories are defined in a non-conspiratorial way by a few simple laws.
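The "few simple laws" are the Schrödinger equation together with the guiding equation; in the standard non-relativistic form (stated here for the reader's convenience, not taken from the abstract):

\[ i\hbar \frac{\partial \psi}{\partial t} \;=\; -\sum_{k=1}^{N} \frac{\hbar^2}{2m_k} \nabla_k^2 \psi + V\psi, \qquad \frac{dQ_k}{dt} \;=\; \frac{\hbar}{m_k}\, \mathrm{Im}\, \frac{\nabla_k \psi}{\psi}\bigl(Q_1,\ldots,Q_N\bigr). \]

The quantum statistics then follow from the equivariance of the |ψ|² distribution under this motion.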
In Bohmian mechanics elementary particles exist objectively, as point particles moving according to a law determined by a wavefunction. In this context, questions as to whether the particles of a certain species are real---questions such as, Do photons exist? Electrons? Or just the quarks?---have a clear meaning. We explain that, whatever the answer, there is a corresponding Bohm-type theory, and no experiment can ever decide between these theories. Another question that has a clear meaning is whether particles are intrinsically distinguishable, i.e., whether particle world lines have labels indicating the species. We discuss the intriguing possibility that the answer is no, and particles are points---just points.
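For unlabeled particles, the natural configuration space (as standardly used in this literature; sketched here for orientation) is the set of N-point subsets of physical space rather than the space of ordered N-tuples:

\[ {}^{N}\mathbb{R}^3 \;:=\; \bigl\{ S \subset \mathbb{R}^3 : |S| = N \bigr\}, \]

as opposed to \((\mathbb{R}^3)^N\). A Bohmian history is then a curve in this space: world lines without labels.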
We consider an isolated, macroscopic quantum system. Let H be a microcanonical “energy shell,” i.e., a subspace of the system’s Hilbert space spanned by the (finitely) many energy eigenstates with energies between E and E + δE. The thermal equilibrium macro-state at energy E corresponds to a subspace Heq of H such that dim Heq/dim H is close to 1. We say that a system with state vector ψ ∈ H is in thermal equilibrium if ψ is “close” to Heq. We show that for “typical” Hamiltonians with given eigenvalues, all initial state vectors ψ0 evolve in such a way that ψt is in thermal equilibrium for most times t. This result is closely related to von Neumann’s quantum ergodic theorem of 1929.
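In symbols (a standard way of making “close to Heq” precise, offered as a sketch): writing P_eq for the projection onto Heq and fixing a small δ > 0, ψ is in thermal equilibrium if

\[ \langle \psi \,|\, P_{\mathrm{eq}} \,|\, \psi \rangle \;\geq\; 1 - \delta, \]

and the theorem asserts that for typical Hamiltonians, \(\langle \psi_t | P_{\mathrm{eq}} | \psi_t \rangle \geq 1 - \delta\) for most times t, for every initial ψ0 in the shell.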
It is well known that density matrices can be used in quantum mechanics to represent the information available to an observer about either a system with a random wave function (“statistical mixture”) or a system that is entangled with another system (“reduced density matrix”). We point out another role, previously unnoticed in the literature, that a density matrix can play: it can be the “conditional density matrix,” conditional on the configuration of the environment. A precise definition can be given in the context of Bohmian mechanics, whereas orthodox quantum mechanics is too vague to allow a sharp definition, except perhaps in special cases. In contrast to statistical and reduced density matrices, forming the conditional density matrix involves no averaging. In Bohmian mechanics with spin, the conditional density matrix replaces the notion of conditional wave function, as the object with the same dynamical significance as the wave function of a Bohmian system.
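A hedged sketch of the definition: split the configuration q = (x, y) into system and environment variables, let Y_t be the actual Bohmian configuration of the environment, and let Ψ carry spin indices s, r for system and environment respectively. The conditional density matrix is then, up to normalization,

\[ W_{s s'}(x, x') \;\propto\; \sum_{r} \Psi_{s r}(x, Y_t)\, \Psi^{*}_{s' r}(x', Y_t). \]

No integral over y appears: the actual environment configuration Y_t is inserted, which is why, unlike for statistical or reduced density matrices, no averaging is involved.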
Let ℋ be a finite-dimensional complex Hilbert space and D the set of density matrices on ℋ, i.e., the positive operators with trace 1. Our goal in this note is to identify a probability measure u on D that can be regarded as the uniform distribution over D. We propose a measure on D, argue that it can be so regarded, discuss its properties, and compute the joint distribution of the eigenvalues of a random density matrix distributed according to this measure.
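One concrete construction worth having in mind (a sketch; whether it coincides with the measure proposed in the note is an assumption on my part) is the distribution of the reduced density matrix of a Haar-random pure state on ℋ ⊗ ℋ, which is unitarily invariant and has full support on D:

```python
import numpy as np

def random_density_matrix(d, rng=None):
    """Sample a density matrix on C^d by tracing out a d-dimensional
    ancilla from a Haar-random pure state on C^d tensor C^d."""
    rng = rng or np.random.default_rng()
    # A normalized complex Gaussian vector is Haar-distributed on the
    # unit sphere; reshaped as a d x d matrix M, the partial trace of
    # |psi><psi| over the ancilla is M M^dagger.
    M = rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d))
    M /= np.linalg.norm(M)          # Frobenius norm 1 <=> unit vector
    return M @ M.conj().T

rho = random_density_matrix(4)
print(np.trace(rho).real)           # 1.0 up to rounding
print(np.linalg.eigvalsh(rho))      # nonnegative eigenvalues
```

The joint eigenvalue distribution of such a random density matrix can then be compared with the one computed in the note.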
Among several possibilities for what reality could be like in view of the empirical facts of quantum mechanics, one is provided by theories of spontaneous wave function collapse, the best known of which is the Ghirardi–Rimini–Weber theory. We show mathematically that in GRW theory there are limitations to knowledge, that is, inhabitants of a GRW universe cannot find out all the facts true of their universe. As a specific example, they cannot accurately measure the number of collapses that a given physical system undergoes during a given time interval; in fact, they cannot reliably measure whether one or zero collapses occur. Put differently, in a GRW universe certain meaningful, factual questions are empirically undecidable. We discuss several types of limitations to knowledge and compare them with those in other versions of quantum mechanics, such as Bohmian mechanics. Most of our results also apply to observer-induced collapses as in orthodox quantum mechanics.
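For reference, the GRW jump process has the following standard form (textbook GRW, not quoted from this paper): for an N-particle system, collapses occur at Poisson rate Nλ; at a collapse affecting particle i, the wave function changes as

\[ \psi \;\to\; \frac{L_i(x)\,\psi}{\lVert L_i(x)\,\psi \rVert}, \qquad L_i(x) = (2\pi\sigma^2)^{-3/4} \exp\!\left( -\frac{(\hat{Q}_i - x)^2}{4\sigma^2} \right), \]

with the collapse center x chosen randomly with probability density \(\lVert L_i(x)\psi \rVert^2\), and parameters of order λ ≈ 10⁻¹⁶ s⁻¹ and σ ≈ 10⁻⁷ m.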
In a recent paper Conway and Kochen, Found. Phys. 36, 2006, claim to have established that theories of the Ghirardi-Rimini-Weber (GRW) type, i.e., of spontaneous wave function collapse, cannot be made relativistic. On the other hand, relativistic GRW-type theories have already been presented, in my recent paper, J. Stat. Phys. 125, 2006, and by Dowker and Henson, J. Stat. Phys. 115, 2004. Here, I elucidate why these are not excluded by the arguments of Conway and Kochen.
The renewed interest in the foundations of quantum statistical mechanics in recent years has led us to study John von Neumann’s 1929 article on the quantum ergodic theorem. We have found this almost forgotten article, which until now has been available only in German, to be a treasure chest, and to be much misunderstood. In it, von Neumann studied the long-time behavior of macroscopic quantum systems. While one of the two theorems announced in his title, the one he calls the “quantum H-theorem,” is actually a much weaker statement than Boltzmann’s classical H-theorem, the other theorem, which he calls the “quantum ergodic theorem,” is a beautiful and very non-trivial result. It expresses a fact we call “normal typicality” and can be summarized as follows: For a “typical” finite family of commuting macroscopic observables, every initial wave function ψ0 from a micro-canonical energy shell so evolves that for most times t in the long run, the joint probability distribution of these observables obtained from ψt is close to their micro-canonical distribution.
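A compact way to state normal typicality (paraphrasing this and the following abstract in symbols): for a typical family of commuting macroscopic observables with joint eigenprojections P_ν,

\[ \langle \psi_t \,|\, P_\nu \,|\, \psi_t \rangle \;\approx\; \mathrm{tr}\bigl(\rho_{\mathrm{mc}}\, P_\nu\bigr) \quad \text{for all } \nu \text{ and most } t, \]

where ρ_mc is the micro-canonical density matrix of the energy shell, and this holds for every initial ψ0 in the shell.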
We discuss the content and significance of John von Neumann’s quantum ergodic theorem (QET) of 1929, a strong result arising from the mere mathematical structure of quantum mechanics. The QET is a precise formulation of what we call normal typicality, i.e., the statement that, for typical large systems, every initial wave function ψ0 from an energy shell is “normal”: it evolves in such a way that |ψt⟩⟨ψt| is, for most t, macroscopically equivalent to the micro-canonical density matrix. The QET has been mostly forgotten after it was criticized as a dynamically vacuous statement in several papers in the 1950s. However, we point out that this criticism does not apply to the actual QET, a correct statement of which does not appear in these papers, but to a different (indeed weaker) statement. Furthermore, we formulate a stronger statement of normal typicality, based on the observation that the bound on the deviations from the average specified by von Neumann is unnecessarily coarse and a much tighter (and more relevant) bound actually follows from his proof.
We criticize speculations to the effect that quantum mechanics is fundamentally about information, and argue that such speculations are in fact unfounded. Our analysis focuses on the dubious claims of this kind recently made by Anton Zeilinger.
Given an ontological model of a quantum system, a “genuine measurement,” as opposed to a quantum measurement, means an experiment that determines the value of a beable, i.e., of a variable that, according to the model, has an actual value in nature before the experiment. We prove a theorem showing that in every ontological model, it is impossible to measure all beables. Put differently, there is no experiment that would reliably determine the ontic state. This result shows that the positivistic idea that a physical theory should only involve observable quantities is too optimistic.
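The content of “reliably determine” can be made precise as follows (my gloss on the abstract, not a quotation): an experiment E yields an outcome z with a distribution P_E(z | λ) depending on the ontic state λ, and E determines λ reliably only if these distributions have disjoint supports for distinct ontic states. The theorem then says that in every ontological model of quantum mechanics,

\[ \forall E \;\; \exists\, \lambda \neq \lambda' : \; P_E(\,\cdot \mid \lambda)\ \text{and}\ P_E(\,\cdot \mid \lambda')\ \text{overlap}, \]

so some pairs of ontic states can never be told apart empirically.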
Multi-time wave functions are wave functions for multi-particle quantum systems that involve several time variables. In this paper we contrast them with solutions of wave equations on a space–time with multiple timelike dimensions, i.e., on a pseudo-Riemannian manifold whose metric has signature such as (+,+,−,−) or (+,+,−,−,−,−,−,−), instead of (+,−,−,−). Despite the superficial similarity, the two behave very differently: whereas wave equations in multiple timelike dimensions are typically mathematically ill-posed and presumably unphysical, relevant Schrödinger equations for multi-time wave functions possess for every initial datum a unique solution on the spacelike configurations and form a natural covariant representation of quantum states.
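In the standard multi-time formalism (sketched here for orientation), an N-particle wave function φ(t_1, x_1, ..., t_N, x_N) satisfies one Schrödinger-type equation per time variable,

\[ i\hbar \frac{\partial \phi}{\partial t_k} \;=\; H_k\, \phi \qquad (k = 1, \ldots, N), \]

on the set of spacelike configurations; existence of a solution for every initial datum requires the consistency condition \([\,i\hbar\partial_{t_j} - H_j,\; i\hbar\partial_{t_k} - H_k\,] = 0\).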
This is a comment on J. A. Barrett's article 'The Preferred-Basis Problem and the Quantum Mechanics of Everything', which concerns theories postulating that certain quantum observables have determinate values, corresponding to additional (often called 'hidden') variables. I point out that it is far from clear, for most observables, what such a postulate is supposed to mean, unless the postulated additional variable is related to a clear ontology in space-time, such as particle world lines, string world sheets, or fields.
This article concerns a phenomenon of elementary quantum mechanics that is quite counter-intuitive, very non-classical, and apparently not widely known: a quantum particle can get reflected at a potential step downwards. In contrast, classical particles get reflected only at upward steps. As a consequence, a quantum particle can be trapped for a long time (though not forever) in a region surrounded by downward potential steps, that is, on a plateau. Said succinctly, a quantum particle tends not to fall off a table. The conditions for this effect are that the wave length is much greater than the width of the potential step and the kinetic energy of the particle is much smaller than the depth of the potential step. We point out how the topic is accessible with elementary methods, but also with mathematical rigor and numerically.
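For a sharp one-dimensional step of depth V_0 > 0 (the textbook computation, included here since the abstract describes the effect only qualitatively), a particle of energy E > 0 incident on the downward step is reflected with probability

\[ R \;=\; \left( \frac{k_1 - k_2}{k_1 + k_2} \right)^{\!2}, \qquad k_1 = \frac{\sqrt{2mE}}{\hbar}, \quad k_2 = \frac{\sqrt{2m(E + V_0)}}{\hbar}, \]

so R → 1 as E/V_0 → 0: the slower the particle relative to the depth of the drop, the more likely it stays on the plateau.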