We present a brief history of decoherence, from its roots in the foundations of classical statistical mechanics to the current spin-bath models in condensed matter physics. We analyze the philosophical import of the subject in three different foundational problems, and find that, contrary to the received view, decoherence is less instrumental to their solutions than commonly believed. What makes decoherence more philosophically interesting, we argue, are the methodological issues it draws attention to, and the question of the universality of quantum mechanics.
The quantum theory of decoherence plays an important role in a pragmatist interpretation of quantum theory. It governs the descriptive content of claims about values of physical magnitudes and offers advice on when to use quantum probabilities as a guide to their truth. The content of a claim is to be understood in terms of its role in inferences. This promises a better treatment of meaning than that offered by Bohr. Quantum theory models physical systems with no mention of measurement: it is decoherence, not measurement, that licenses application of Born’s probability rule. So quantum theory also offers advice on its own application. I show how this works in a simple model of decoherence, and then in applications to both laboratory experiments and natural systems. Applications to quantum field theory and the measurement problem will be discussed elsewhere.
Schlosshauer has criticized the conclusion of Wiebe and Ballentine (Phys. Rev. A 72:022109, 2005) that decoherence is not essential for the emergence of classicality from quantum mechanics. I reply to the issues raised in his critique, which range from the interpretation of quantum mechanics to the criterion for classicality, and conclude that the role of decoherence in these issues is much more restricted than is often claimed.
Our account of the problem of the classical limit of quantum mechanics involves two elements. The first is self-induced decoherence, conceived as a process that depends on the internal dynamics of a closed quantum system governed by a Hamiltonian with continuous spectrum; the study of decoherence is addressed by means of a formalism used to give meaning to the van Hove states with diagonal singularities. The second element is macroscopicity, represented by the limit $\hbar \rightarrow 0$: when the macroscopic limit is applied to the Wigner transformation of the diagonal state resulting from decoherence, the description of the quantum system becomes equivalent to the description of an ensemble of classical trajectories on phase space weighted by their corresponding probabilities.
The Conditional Probability Interpretation of Quantum Mechanics replaces the abstract notion of time used in standard Quantum Mechanics by the time that can be read off from a physical clock. The use of physical clocks leads to apparent non-unitarity and decoherence. Here we show that a close approximation to standard Quantum Mechanics can be recovered from conditional Quantum Mechanics for semi-classical clocks, and we use these clocks to compute the minimum decoherence predicted by the Conditional Probability Interpretation.
In this paper we analyze the resilience to decoherence of the Macroscopic Quantum Superpositions (MQS) generated by optimal phase-covariant quantum cloning, according to two coherence criteria, both based on the concept of Bures distance in Hilbert spaces. We show that all MQS generated by this system are characterized by a high resilience to decoherence processes. This analysis is supported by the results of recent MQS experiments with N = 3.5×10^4 particles.
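The coherence criteria mentioned above rest on the Bures distance between quantum states. As a minimal sketch (not taken from the paper; the function name is mine, and it assumes pure states, for which the fidelity reduces to the overlap magnitude), the distance can be computed as follows:

```python
import math

def bures_distance_pure(a, b):
    """Bures distance between two pure states, given as lists of
    (possibly complex) amplitudes.  For pure states the fidelity is
    F = |<a|b>| and D_B = sqrt(2 * (1 - F))."""
    overlap = sum(x.conjugate() * y for x, y in zip(a, b))
    return math.sqrt(2.0 * (1.0 - abs(overlap)))

# Orthogonal states are maximally distant (D_B = sqrt(2) ~ 1.414);
# identical states are at distance zero.
up, down = [1.0, 0.0], [0.0, 1.0]
print(round(bures_distance_pure(up, down), 4))   # -> 1.4142
print(round(bures_distance_pure(up, up), 4))     # -> 0.0
```

A coherence criterion of the kind the abstract describes would then ask how far a decohered MQS has drifted, in this metric, from the initial superposition.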
This paper describes how the entire universe might be considered an eigenstate determined by classical limiting conditions within it. This description is in the context of an approach in which the path of each relativistic particle in spacetime represents a fine-grained history for that particle, and a path integral represents a coarse-grained history as a superposition of paths meeting some criteria. Since spacetime paths are parametrized by an invariant parameter, not time, histories based on such paths do not evolve in time but are rather histories of all spacetime. Measurements can then be represented by orthogonal states that correlate with specific points in such coarse-grained histories, causing them to decohere, allowing a consistent probability interpretation. This conception is applied here to the analysis of the two slit experiment, scattering and, ultimately, the universe as a whole. The decoherence of cosmological states of the universe then provides the eigenstates from which our “real” universe can be selected by the measurements carried out within it.
Suppose a quantum experiment includes one or more random processes. Then the results of repeated measurements may appear consistent with irreversible decoherence even if the system’s evolution prior to measurement is reversible and unitary. Two thought experiments are constructed as examples.
The possibility of consistency between the basic principles of quantum mechanics and wave function collapse is reexamined. A specific interpretation of environment is proposed for this aim and is applied to decoherence. When the organization of a measuring apparatus is taken into account, this approach also leads to an interpretation of wave function collapse, which would result in principle from the same interactions with the environment as decoherence. This proposal is shown to be consistent with the non-separable character of quantum mechanics.
We analyze seemingly contradictory claims in the literature about the role played by decoherence in ensuring classical behavior for the chaotically tumbling satellite Hyperion. We show that the controversy is resolved once the very different assumptions underlying these claims are recognized. In doing so, we emphasize the distinct notions of the problem of classicality in the ensemble interpretation of quantum mechanics and in decoherence-based approaches that are aimed at addressing the measurement problem.
This paper relates both to the metaphysics of probability and to the physics of time asymmetry. Using the formalism of decoherent histories, it investigates whether intuitions about intrinsic time directedness that are often associated with probability can be justified in the context of no-collapse approaches to quantum mechanics. The standard (two-vector) approach to time symmetry in the decoherent histories literature is criticised, and an alternative approach is proposed, based on two decoherence conditions ('forwards' and 'backwards') within the one-vector formalism. In turn, considerations of forwards and backwards decoherence and of decoherence and recoherence suggest that a time-directed interpretation of probabilities, if adopted, should be both contingent and perspectival.
Given the impressive success of environment-induced decoherence (EID), nowadays no interpretation of quantum mechanics can ignore its results. The modal-Hamiltonian interpretation (MHI) has proved to be effective for solving several interpretative problems but, since its actualization rule applies to closed systems, it seems to stand at odds with EID. The purpose of this paper is to show that this is not the case: the states einselected by the interaction with the environment according to EID (the elements of the “pointer basis”) are the eigenvectors of an actual-valued observable belonging to the preferred context selected by the MHI.
This work examines whether the environmentally-induced decoherence approach in quantum mechanics brings us any closer to solving the measurement problem, and whether it contributes to the elimination of subjectivism in quantum theory. A distinction is made between 'collapse' and 'decoherence', so that an explanation for decoherence does not imply an explanation for collapse. After an overview of the measurement problem and of the open-systems paradigm, we argue that taking a partial trace is equivalent to applying the projection postulate. A criticism of Zurek's decoherence approach to measurements is also made, based on the restriction that he must impose on the interaction between apparatus and environment. We then analyze the element of subjectivity involved in establishing the boundary between system and environment, and criticize the incorporation of Everett's branching of memory records into the decoherence research program. Sticking to this program, we end by sketching a proposal for 'environmentally-induced collapse'.
Decoherence results from the dissipative interaction between a quantum system and its environment. As the system and environment become entangled, the reduced density operator describing the system "decoheres" into a mixture (with the interference terms damped out). This formal result prompts some to exclaim that the measurement problem is solved. I will scrutinize this claim by examining how modal and relative-state interpretations can use decoherence. Although decoherence cannot rescue these interpretations from general metaphysical difficulties, decoherence may help these interpretations to pick out a preferred basis. I will explore whether decoherence solves nagging technical problems associated with selecting a preferred basis.
Interference phenomena are a well-known and crucial feature of quantum mechanics, the two-slit experiment providing a standard example. There are situations, however, in which interference effects are (artificially or spontaneously) suppressed. We shall need to make precise what this means, but the theory of decoherence is the study of (spontaneous) interactions between a system and its environment that lead to such suppression of interference. This study includes detailed modelling of system-environment interactions, derivation of equations (‘master equations’) for the (reduced) state of the system, discussion of time scales, etc. A discussion of the concept of suppression of interference and a simplified survey of the theory is given in Section 2, emphasising features that will be relevant to the following discussion (and restricted to standard non-relativistic particle quantum mechanics). A partially overlapping field is that of decoherent histories, which proceeds from an abstract definition of loss of interference, but which we shall not be considering in any detail.
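The suppression of interference described in the surrounding abstracts can be made concrete in the simplest possible model. In the sketch below (my own toy illustration, not a model from any of the listed papers), a qubit prepared in (|0⟩+|1⟩)/√2 becomes entangled with two environment states E0, E1; tracing out the environment multiplies the off-diagonal terms of the reduced density matrix by the overlap ⟨E0|E1⟩, so distinguishable environment records kill the interference:

```python
import math

def reduced_state(overlap):
    """Reduced density matrix (as nested lists) of a qubit prepared in
    (|0> + |1>)/sqrt(2) after entangling with environment states E0, E1
    whose inner product <E0|E1> equals `overlap`.  Tracing out the
    environment multiplies the interference (off-diagonal) terms by it."""
    return [[0.5, 0.5 * overlap],
            [0.5 * overlap, 0.5]]

def visibility(rho):
    """Interference visibility: twice the off-diagonal magnitude."""
    return 2.0 * abs(rho[0][1])

# As the environment records become distinguishable (overlap -> 0),
# interference is suppressed; exponential decay of the overlap is typical.
for t in (0.0, 1.0, 5.0):
    print(round(visibility(reduced_state(math.exp(-t))), 4))
# -> 1.0, 0.3679, 0.0067
```

A master equation of the kind the abstract mentions would supply the actual time dependence of the overlap; the exponential here is only a stand-in.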
In this paper we argue that the formalisms for decoherence originally devised to deal just with closed or open systems can be subsumed under a general conceptual framework, in such a way that they cooperate in the understanding of the same physical phenomenon. This new perspective dissolves certain conceptual difficulties of the einselection program but, at the same time, shows that the openness of the quantum system is not the essential ingredient for decoherence. †To contact the authors, please write to: Mario Castagnino, CONICET-IAFE, Universidad Nacional de Buenos Aires, Casilla de Correos 67, Sucursal 28, 1428 Buenos Aires, Argentina; Roberto Laura, IFIR-Universidad Nacional de Rosario, Av. Pellegrini 250, 2000 Rosario, Argentina; Olimpia Lombardi, CONICET-Universidad Nacional de Buenos Aires, C. Larralde 3440, 6°D, 1430, Buenos Aires.
Quantum decoherence is receiving a great deal of attention today not only in theoretical and experimental physics but also in branches of science as diverse as molecular biology, biochemistry, and even neuropsychology. It is no surprise that it is also beginning to appear in various philosophical debates concerning the fundamental structure of the world. The purpose of this article is primarily to acquaint non-specialists with quantum decoherence and clarify related concepts, and secondly to sketch its possible implications – independent of particular interpretations of quantum mechanics – for broader philosophical debates. For example, decoherence shows that any method of parsing nature into levels or parts cannot be given in principle but instead derives from our perception of the world as classical, a perception that is itself sustained by the process of decoherence.
Can we explain the laws of thermodynamics, in particular the irreversible increase of entropy, from the underlying quantum mechanical dynamics? Attempts based on classical dynamics have all failed. Albert (1994a,b; 2000) proposed a way to recover thermodynamics on a purely dynamical basis, using the quantum theory of the collapse of the wavefunction of Ghirardi, Rimini and Weber (1986). In this paper we propose an alternative way to explain thermodynamics within no-collapse interpretations of quantum mechanics. Our approach relies on the standard quantum mechanical models of environmental decoherence of open systems, e.g. Joos and Zeh (1985) and Zurek and Paz (1994).
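The link between decoherence and entropy increase invoked above can be seen in a single qubit. The toy computation below (my own illustration, not the paper's model) shows that as decoherence shrinks the off-diagonal coherence of the state ρ = [[1/2, c], [c, 1/2]], its von Neumann entropy grows monotonically from 0 (pure state) to ln 2 (maximal mixture):

```python
import math

def entropy_after_decoherence(coherence):
    """Von Neumann entropy (in nats) of the qubit state
    rho = [[1/2, c], [c, 1/2]], whose eigenvalues are 1/2 +- |c|."""
    lam = [0.5 + abs(coherence), 0.5 - abs(coherence)]
    return sum(-p * math.log(p) for p in lam if p > 0.0)

# Decoherence drives c from 1/2 (pure, S = 0) towards 0 (S = ln 2):
for c in (0.5, 0.25, 0.0):
    print(round(entropy_after_decoherence(c), 4))
# -> 0.0, 0.5623, 0.6931
```

In the decoherence-based picture the abstract sketches, this growth of the system's entropy is driven by entanglement with the environment, not by a collapse mechanism.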
According to the environment-induced approach to decoherence (EID), the split of the Universe into the degrees of freedom which are of direct interest to the observer (the system) and the remaining degrees of freedom (the environment) is absolutely essential for decoherence. However, the EID approach offers no general criterion for deciding where to place the “cut” between system and environment: the environment may be “external” (a bath of particles interacting with the system of interest) or “internal” (such as collections of phonons or other internal excitations). The main purpose of this paper is to argue that decoherence is a relative phenomenon, better understood from a closed-system perspective according to which the split of a closed quantum system into an open subsystem and its environment is just a way of selecting a particular space of relevant observables of the whole closed system. In order to support this claim, we shall consider the results obtained in a natural generalization of the simple spin-bath model usually studied in the literature. Our main thesis will lead us to two corollaries. First, the problem of identifying the system that decoheres is actually a pseudo-problem, which vanishes as soon as one acknowledges the relative nature of decoherence. Second, the usually supposed link between decoherence and energy dissipation is misguided. As previously pointed out, energy dissipation and decoherence are different phenomena, and we shall argue for this difference on the basis of the relative nature of decoherence.
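The "simple spin-bath model usually studied in the literature" has a closed form worth seeing. In the standard version (the sketch below is a generic illustration, not the paper's generalization; couplings and biases are randomly chosen), a central spin couples to N bath spins and its off-diagonal terms are multiplied by a decoherence factor that is a product of N oscillating contributions, whose magnitude is almost always tiny for large N:

```python
import math
import random

def decoherence_factor(couplings, biases, t):
    """Decoherence factor r(t) of a central spin coupled to a spin bath:
    bath spin k, with coupling g_k and population bias
    a_k = |alpha_k|^2 - |beta_k|^2, contributes the factor
    cos(g_k * t) + i * a_k * sin(g_k * t)."""
    r = complex(1.0, 0.0)
    for g, a in zip(couplings, biases):
        r *= complex(math.cos(g * t), a * math.sin(g * t))
    return r

random.seed(0)
N = 200
gs = [random.uniform(0.0, 1.0) for _ in range(N)]    # couplings g_k
bs = [random.uniform(-1.0, 1.0) for _ in range(N)]   # biases a_k

# At t = 0 there is no decoherence: |r| = 1.
print(abs(decoherence_factor(gs, bs, 0.0)))          # -> 1.0
# At a generic later time the interference terms are heavily suppressed.
print(abs(decoherence_factor(gs, bs, 10.0)) < 1e-6)  # -> True
```

Note that nothing here dissipates energy; the magnitude of r(t) decays purely through dephasing, which is one concrete face of the dissipation/decoherence distinction the abstract draws.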
In this paper we argue that the emergence of the classical world from the underlying quantum reality involves two elements: self-induced decoherence and macroscopicity. Self-induced decoherence does not require the openness of the system and its interaction with the environment: a single closed system can decohere when its Hamiltonian has continuous spectrum. We show that, if the system is macroscopic enough, after self-induced decoherence it can be described as an ensemble of classical distributions weighted by their corresponding probabilities. We also argue that classicality is an emergent property that arises when the behavior of the system is described from an observational perspective.
General results about restrictions on measurements from inside are applied to quantum mechanics. They imply subjective decoherence: For an apparatus it is not possible to determine whether the joint system consisting of itself and the observed system is in a statistical state with or without interference terms; it is possible that the apparatus systematically mistakes the real pure state of the joint system for the decohered state. We discuss the relevance of subjective decoherence for quantum measurements and for the problem of Wigner's friend.
We discuss a recent proposal by Albert (1994a; 1994b; 2000, ch. 7) to recover thermodynamics on a purely dynamical basis, using the quantum theory of the collapse of the wave function by Ghirardi, Rimini, and Weber (1986). We propose an alternative way to explain thermodynamics within no-collapse interpretations of quantum mechanics. Our approach relies on the standard quantum mechanical models of environmental decoherence of open systems (e.g., Joos and Zeh 1985; Zurek and Paz 1994). This paper presents the two approaches and discusses their advantages. The problems faced by both approaches will be discussed in a sequel (Hemmo and Shenker 2003).
According to Zurek, decoherence is a process resulting from the interaction between a quantum system and its environment; this process singles out a preferred set of states, usually called “pointer basis”, that determines which observables will receive definite values. This means that decoherence leads to a sort of selection which precludes all except a small subset of the states in the Hilbert space of the system from behaving in a classical manner: environment-induced superselection (einselection) is a consequence of the process of decoherence. The aim of this paper is to present a new approach to decoherence, different from the mainstream approach of Zurek and his collaborators. We will argue that this approach offers conceptual advantages over the traditional one when problems of foundations are considered; in particular, from the new perspective, decoherence in closed quantum systems becomes possible and the preferred basis acquires a well-founded definition.
The decoherent histories formalism, developed by Griffiths, Gell-Mann, and Hartle (in Phys. Rev. A 76:022104, 2007; arXiv:1106.0767v3 [quant-ph], 2011; Consistent Quantum Theory, Cambridge University Press, 2003; arXiv:gr-qc/9304006v2, 1992) is a general framework in which to formulate a timeless, ‘generalised’ quantum theory and extract predictions from it. Recent advances in spin foam models allow for loop gravity to be cast in this framework. In this paper, I propose a decoherence functional for loop gravity and interpret existing results (Bianchi et al. in Phys. Rev. D 83:104015, 2011; Phys. Rev. D 82:084035, 2010) as showing that coarse grained histories follow quasiclassical trajectories in the appropriate limit.
In realistic situations where a macroscopic system interacts with an external environment, decoherence of the quantum state, as derived in the decoherence approach, is only approximate. We argue that this can still give rise to facts, provided that during the decoherence process states that are, respectively, always close to eigenvectors of pointer position and record observable are correlated. We show in a model that this is always the case.
The role of the environment in producing the correct classical limit in the Bohm interpretation of quantum mechanics is investigated, in the context of a model of quantum Brownian motion. One of the effects of the interaction is to produce a rapid approximate diagonalisation of the reduced density matrix in the position representation. This effect is, by itself, insufficient to produce generically quasi-classical behaviour of the Bohmian trajectory. However, it is shown that, if the system particle is initially in an approximate energy eigenstate, then there is a tendency for the Bohmian trajectory to become approximately classical on a longer time-scale. The relationship between this phenomenon and the behaviour of the Wigner function post-decoherence is discussed.
I investigate the character of the definite properties defined by the Basic Rule in the Vermaas and Dieks' (1995) version of the modal interpretation of quantum mechanics, specifically for the case of the continuous model of decoherence by Joos and Zeh (1985). While this model suggests that the characteristic length that might be associated with the localisation of an individual system is the coherence length of the state (which converges rapidly to the thermal de Broglie wavelength), I show in an exactly soluble case that the definite properties that are possessed with overwhelming probability in this modal interpretation are delocalised over the entire spread of the state.
We review the decoherent histories approach to the interpretation of quantum mechanics. The Everett relative-state theory is reformulated in terms of decoherent histories. A model of evolutionary adaptation is shown to imply decoherence. A general interpretative framework is proposed: probability and value-definiteness are to have a similar status to the attribution of tense in classical spacetime theory.
We demonstrate, by use of a simple one-dimensional model of a square barrier embedded in an infinite potential well, that decoherence is enhanced by chaotic-like behavior. We show, moreover, that the transition ℏ→0 is singular. Finally, it is argued that the time scale on which decoherence occurs depends on the degree of complexity of the underlying quantum mechanical system, i.e., more complex systems decohere relatively faster than less complex ones.
I discuss a model inspired from the string/brane framework, in which our Universe is represented (after perhaps appropriate compactification) as a three-brane, propagating in a bulk spacetime punctured by D0-brane (D-particle) defects. As the D3-brane world moves in the bulk, the D-particles cross it, and to an effective observer on the D3-brane the situation looks like a “space-time foam” with the defects “flashing” on and off (“D-particle foam”). The open strings, with their ends attached on the brane, which represent matter in this scenario, can interact with the D-particles on the D3-brane universe in a topologically non-trivial manner, involving splitting and capture of the strings by the D0-brane defects. Such processes are consistently described by logarithmic conformal field theories on the world-sheet of the strings. Physically, they result in effective decoherence of the string matter on the D3-brane and, as a result, in CPT violation, but of a type that implies an ill-defined nature of the effective CPT operator. Due to electric charge conservation, only electrically neutral (string) matter can exhibit such interactions with the D-particle foam. This may have unique, experimentally detectable (in principle) consequences for electrically-neutral entangled quantum matter states on the brane world, in particular the modification of the pertinent Einstein-Podolsky-Rosen (EPR) correlations in neutral mesons in an appropriate meson factory. For the simplest scenarios, the order of magnitude of such effects might lie within the sensitivity of upgraded φ-meson factories.
Decoherence is the name for the complex of phenomena leading to the appearance of classical features of quantum systems. In the present paper decoherence in continuous measurements is analyzed with the help of restricted path integrals (RPI) and (equivalently in simple cases) complex Hamiltonians. A continuous measurement results in a readout giving information in classical form on the evolution of the measured quantum system. The quantum features of the system reveal themselves in the variation of possible measurement readouts. For example, monitoring the energy of a multi-level system may visualize a transition between levels as a process evolving in time, but with an unavoidable quantum noise. Decoherence of a continuously measured system is completely determined by the measurement readout, i.e., by the information recorded in its environment. It is shown that the ideology of RPI makes the Feynman version of quantum mechanics closed, in contrast to the conventional operator form of quantum mechanics, which needs a quantum theory of measurement as a necessary additional part.
Examining the notion of wavefunction collapse (WFC) in quantum measurements, which has again come into question in the recent debate on the quantum Zeno effect, we remark that WFC is realized only through decoherence among branch waves by detection, after a spectral decomposition process from an initial object wavefunction to a superposition of branch waves corresponding to relevant measurement propositions. We improve the definition of the decoherence parameter, so as to fit general cases, by which we can quantitatively estimate the degree of WFC given by detectors. Finally, we briefly discuss whether two special detector models, with very large and very small numbers of degrees of freedom, can provoke WFC.
State-reduction and the notion of actuality are compared to passage through time and the notion of the present; already in classical relativity the latter give rise to difficulties. The solution proposed here is to treat both tense and value-definiteness as relational properties, or facts as relations; likewise the notions of change and probability. In both cases essential characteristics are absent: temporal relations are tenselessly true; probabilistic relations are deterministically true. The basic ideas go back to Everett, although the technical development makes use of the decoherent histories theory of Griffiths, Omnès, and Gell-Mann and Hartle. Alternative interpretations of the decoherent histories framework are also considered.
I discuss the quantum mechanical theory of consciousness and free will offered by Stapp (1993, 1995, 2000, 2004). First I show that decoherence-based arguments do not work against this theory. Then I discuss a number of problems with the theory: Stapp's separate accounts of consciousness and free will are incompatible; the interpretations of QM they are tied to are questionable; the Zeno effect could not enable free will as he suggests, because weakness of will would then be ubiquitous; and the holism of measurement in QM is not a good explanation of the unity of consciousness, for essentially the same reason that local interactions may seem incapable of accounting for it.
Quantum mechanical entangled configurations of particles that do not satisfy Bell’s inequalities, or equivalently, do not have a joint probability distribution, are familiar in the foundational literature of quantum mechanics. Nonexistence of a joint probability measure for the correlations predicted by quantum mechanics is itself equivalent to the nonexistence of local hidden variables that account for the correlations (for a proof of this equivalence, see Suppes and Zanotti, 1981). From a philosophical standpoint it is natural to ask what sort of concept can be used to provide a “joint” analysis of such quantum correlations. In other areas of application of probability, similar but different problems arise. A typical example is the introduction of upper and lower probabilities in the theory of belief. A person may feel uncomfortable assigning a precise probability to the occurrence of rain tomorrow, but feel comfortable saying the probability should be greater than ½ and less than ⅞. Rather extensive statistical developments have occurred for this framework. A thorough treatment can be found in Walley (1991) and an earlier measurement-oriented development in Suppes (1974). It is important to note that this focus on beliefs, or related Bayesian ideas, is not concerned, as we are here, with the nonexistence of joint probability distributions. Yet earlier work with no relation to quantum mechanics, but focused on conditions for existence, has been published by many people. For some of our own work on this topic, see Suppes and Zanotti (1989). Still, this earlier work naturally suggested the question of whether or not upper and lower measures could be used in quantum mechanics, as a generalization of…
In Everett's many-worlds interpretation, quantum measurements are considered to be decoherence events. If so, then inexact decoherence may allow large worlds to mangle the memory of observers in small worlds, creating a cutoff in observable world size. Smaller worlds are mangled and so not observed. If this cutoff is much closer to the median measure size than to the median world size, the distribution of outcomes seen in unmangled worlds follows the Born rule. Thus deviations from exact decoherence can allow the Born rule to be derived via world counting, with a finite number of worlds and no new fundamental physics.
The conceptual structure of orthodox quantum mechanics has not provided a fully satisfactory and coherent description of natural phenomena. With particular attention to the measurement problem, we review and investigate two unorthodox formulations. First, there is the model advanced by GRWP, a stochastic modification of the standard Schrödinger dynamics admitting state-vector reduction as a real physical process. Second, there is the ontological interpretation of Bohm, a causal reformulation of the usual theory admitting no collapse of the state vector. Within these two seemingly quite different approaches, we discuss, in a comparative manner, several points: the meaning of the state vector, the status of quantum probability, the legitimacy of attributing macro-objective properties to physical systems, and the possibility of retrieving the classical limit. Finally, we consider aspects of non-locality and relevant difficulties with formulating a relativistic generalization of the two approaches.
NGC 1300 (shown in figure 1) is a spiral galaxy 65 million light years from Earth. We have never been there, and (although I would love to be wrong about this) we will never go there; all we will ever know about NGC 1300 is what we can see of it from sixty-five million light years away, and what we can infer from our best physics. Fortunately, “what we can infer from our best physics” is actually quite a lot. To take a particular example: our best theory of galaxies tells us that that hazy glow is actually made up of the light of hundreds of billions of stars; our best theories of planetary formation tell us that a sizable fraction of those stars…
The Ollivier–Poulin–Zurek definition of objectivity provides a philosophical basis for the environment as witness formulation of decoherence theory and hence for quantum Darwinism. It is shown that no account of the reference of the key terms in this definition can be given that does not render the definition inapplicable within quantum theory. It is argued that this is not the fault of the language used, but of the assumption that the laws of physics are independent of Hilbert-space decomposition. All evidence suggests that this latter assumption is true. If it is, decoherence cannot explain the emergence of classicality.
We propose a technical reformulation of the measurement problem of quantum mechanics, which is based on the postulate that the final state of a measurement is classical; this accords with experimental practice as well as with Bohr’s views. Unlike the usual formulation (in which the post-measurement state is a unit vector in Hilbert space), our version actually opens the possibility of admitting a purely technical solution within the confines of conventional quantum theory (as opposed to solutions that either modify this theory, or introduce unusual and controversial interpretative rules and/or ontologies). To that effect, we recall a remarkable phenomenon in the theory of Schrödinger operators (discovered in 1981 by Jona-Lasinio, Martinelli, and Scoppola), according to which the ground state of a symmetric double-well Hamiltonian (which is paradigmatically of Schrödinger’s Cat type) becomes exponentially sensitive to tiny perturbations of the potential as ħ→0. We show that this instability emerges also from the textbook WKB approximation, extend it to time-dependent perturbations, and study the dynamical transition from the ground state of the double well to the perturbed ground state (in which the cat is typically either dead or alive, depending on the details of the perturbation). Numerical simulations show that adiabatically arising perturbations may (quite literally) cause the collapse of the wave-function in the classical limit. Thus, at least in the context of a simple mathematical model, we combine the technical and conceptual virtues of decoherence (which fails to solve the measurement problem but launches the key idea that perturbations may come from the environment) with those of dynamical collapse models à la GRW (which do solve the measurement problem but are ad hoc), without sharing their drawbacks: single measurement outcomes are obtained (instead of merely diagonal reduced density matrices), and no modification of quantum mechanics is needed.
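The exponential sensitivity invoked above has a transparent two-level caricature (this is my own sketch, not the paper's Schrödinger-operator analysis): model the double well by H = [[ε, −Δ], [−Δ, −ε]] in the localized basis {|L⟩, |R⟩}, where the tunneling splitting Δ shrinks like exp(−S/ħ) and ε is a fixed, tiny asymmetry of the potential. Once Δ falls below ε, the "cat" ground state (|L⟩+|R⟩)/√2 abruptly localizes in one well:

```python
import math

def ground_state(delta, eps):
    """Normalized ground state (x, y) of the two-level Hamiltonian
    H = [[eps, -delta], [-delta, -eps]] in the localized basis {|L>, |R>}:
    delta is the tunneling splitting of the double well, eps a tiny
    asymmetry (the perturbation)."""
    e = math.sqrt(eps * eps + delta * delta)
    x, y = delta, eps + e           # unnormalized eigenvector for energy -e
    n = math.hypot(x, y)
    return x / n, y / n

def localization(delta, eps):
    """|<R|psi0>|^2 - |<L|psi0>|^2: ~0 for a balanced 'cat' ground state,
    ~1 once the perturbation dominates the splitting."""
    x, y = ground_state(delta, eps)
    return y * y - x * x

eps = 1e-9                           # fixed, tiny perturbation
for hbar in (1.0, 0.1, 0.02):
    delta = math.exp(-1.0 / hbar)    # splitting ~ exp(-S/hbar), with S = 1
    print(round(localization(delta, eps), 3))
# -> 0.0, 0.0, 1.0
```

The same 10⁻⁹ perturbation that is invisible at ħ = 1 completely collapses the cat at ħ = 0.02, which is the instability the abstract exploits; the full analysis of course works with genuine Schrödinger operators rather than a 2×2 matrix.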