Quantum theory is a probabilistic theory that embodies notoriously striking correlations, stronger than any that classical theories allow but not as strong as those of hypothetical ‘super-quantum’ theories. This raises the question ‘Why the quantum?’—whether there is a handful of principles that account for the character of quantum probability. We ask what quantum-logical notions correspond to this investigation. This project isn’t meant to compete with the many beautiful results that information-theoretic approaches have yielded but rather aims to complement that work.
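The hierarchy of correlation strengths invoked here can be made concrete with the CHSH expression: local (classical) models satisfy |S| ≤ 2, quantum mechanics reaches Tsirelson's bound 2√2, and hypothetical 'super-quantum' (PR-box) correlations would attain the algebraic maximum 4. A minimal sketch, assuming numpy, computing the quantum value for the singlet state at the standard optimal measurement angles:

```python
import numpy as np

# Pauli matrices for spin measurements in the x-z plane
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def spin(theta):
    """Observable for a spin measurement at angle theta in the x-z plane."""
    return np.cos(theta) * Z + np.sin(theta) * X

# Singlet state (|01> - |10>)/sqrt(2), a maximally entangled two-qubit state
psi = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)

def E(a, b):
    """Correlation <psi| A(a) x B(b) |psi>; for the singlet this equals -cos(a - b)."""
    return np.real(psi.conj() @ np.kron(spin(a), spin(b)) @ psi)

# Standard optimal settings for the CHSH expression
a0, a1 = 0.0, np.pi / 2
b0, b1 = np.pi / 4, 3 * np.pi / 4

chsh = E(a0, b0) - E(a0, b1) + E(a1, b0) + E(a1, b1)
# |chsh| = 2*sqrt(2) ≈ 2.83: above the local bound 2, below the PR-box value 4
```

The function and variable names here are illustrative, not drawn from the article; the sketch only exhibits the three bounds the abstract contrasts.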
A quantum algorithm succeeds not because the superposition principle allows ‘the computation of all values of a function at once’ via ‘quantum parallelism’, but rather because the structure of a quantum state space allows new sorts of correlations associated with entanglement, with new possibilities for information-processing transformations between correlations, that are not possible in a classical state space. I illustrate this with an elementary example of a problem for which a quantum algorithm is more efficient than any classical algorithm. I also introduce the notion of ‘pseudotelepathic’ games and show how the difference between classical and quantum correlations plays a similar role here for games that can be won by quantum players exploiting entanglement, but not by classical players whose only allowed common resource consists of shared strings of random numbers (common causes of the players’ correlated responses in a game).
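The abstract does not spell out its elementary example, but Deutsch's problem is the standard illustration of this kind: deciding whether a one-bit function f is constant or balanced takes two classical evaluations of f but only one quantum oracle query. A minimal simulation sketch, assuming numpy:

```python
import numpy as np

# Single-qubit Hadamard gate
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

def deutsch(f):
    """Decide whether f: {0,1} -> {0,1} is constant or balanced
    with a single query to the oracle U|x>|y> = |x>|y XOR f(x)>."""
    U = np.zeros((4, 4))
    for x in (0, 1):
        for y in (0, 1):
            U[2 * x + (y ^ f(x)), 2 * x + y] = 1
    # Start in |0>|1>, apply H on both qubits, query the oracle once,
    # then apply H on the first qubit.
    state = np.kron([1, 0], [0, 1]).astype(float)
    state = np.kron(H, H) @ state
    state = U @ state
    state = np.kron(H, np.eye(2)) @ state
    # The first qubit reads 0 with certainty iff f is constant.
    p0 = state[0] ** 2 + state[1] ** 2
    return "constant" if p0 > 0.5 else "balanced"
```

The speedup is visible in the query count: the oracle matrix U is applied exactly once, where any classical procedure must call f twice.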
We argue that the intractable part of the measurement problem -- the 'big' measurement problem -- is a pseudo-problem that depends for its legitimacy on the acceptance of two dogmas. The first dogma is John Bell's assertion that measurement should never be introduced as a primitive process in a fundamental mechanical theory like classical or quantum mechanics, but should always be open to a complete analysis, in principle, of how the individual outcomes come about dynamically. The second dogma is the view that the quantum state has an ontological significance analogous to the significance of the classical state as the 'truthmaker' for propositions about the occurrence and non-occurrence of events, i.e., that the quantum state is a representation of physical reality. We show how both dogmas can be rejected in a realist information-theoretic interpretation of quantum mechanics as an alternative to the Everett interpretation. The Everettian, too, regards the 'big' measurement problem as a pseudo-problem, because the Everettian rejects the assumption that measurements have definite outcomes, in the sense that one particular outcome, as opposed to other possible outcomes, actually occurs in a quantum measurement process. By contrast with the Everettians, we accept that measurements have definite outcomes. By contrast with the Bohmians and the GRW 'collapse' theorists who add structure to the theory and propose dynamical solutions to the 'big' measurement problem, we take the problem to arise from the failure to see the significance of Hilbert space as a new kinematic framework for the physics of an indeterministic universe, in the sense that Hilbert space imposes kinematic (i.e., pre-dynamic) objective probabilistic constraints on correlations between events.
We show that three fundamental information-theoretic constraints -- the impossibility of superluminal information transfer between two physical systems by performing measurements on one of them, the impossibility of broadcasting the information contained in an unknown physical state, and the impossibility of unconditionally secure bit commitment -- suffice to entail that the observables and state space of a physical theory are quantum-mechanical. We demonstrate the converse derivation in part, and consider the implications of alternative answers to a remaining open question about nonlocality and bit commitment.
It is generally accepted, following Landauer and Bennett, that the process of measurement involves no minimum entropy cost, but the erasure of information in resetting the memory register of a computer to zero requires dissipating heat into the environment. This thesis has been challenged recently in a two-part article by Earman and Norton. I review some relevant observations in the thermodynamics of computation and argue that Earman and Norton are mistaken: there is in principle no entropy cost to the acquisition of information, but the destruction of information does involve an irreducible entropy cost.
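Landauer's principle puts a number on the 'irreducible entropy cost' of destroying information: erasing one bit at temperature T dissipates at least k_B T ln 2 of heat into the environment. A minimal sketch (the function name is illustrative, not from the article):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

def landauer_heat(bits, temperature):
    """Minimum heat in joules dissipated when erasing `bits` bits
    at `temperature` kelvin, per Landauer's bound k_B * T * ln 2 per bit."""
    return bits * K_B * temperature * math.log(2)

# Erasing one bit at room temperature (300 K) costs at least ~2.87e-21 J
q_min = landauer_heat(1, 300.0)
```

The bound is tiny compared to the dissipation of real hardware, which is why it functions as a point of principle in the Earman-Norton debate rather than an engineering constraint.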
I explore the nature of the problem generated by the transition from classical to quantum mechanics, and I survey some of the different responses to this problem. I show briefly how work on quantum information over the past ten years has led to a shift of focus, in which the puzzling features of quantum mechanics are seen as a resource to be developed rather than a problem to be solved.
I show how quantum mechanics, like the theory of relativity, can be understood as a 'principle theory' in Einstein's sense, and I use this notion to explore the approach to the problem of interpretation developed in my book Interpreting the Quantum World.
We prove a uniqueness theorem showing that, subject to certain natural constraints, all 'no collapse' interpretations of quantum mechanics can be uniquely characterized and reduced to the choice of a particular preferred observable as determinate (definite, sharp). We show how certain versions of the modal interpretation, Bohm's 'causal' interpretation, Bohr's complementarity interpretation, and the orthodox (Dirac-von Neumann) interpretation without the projection postulate can be recovered from the theorem. Bohr's complementarity and Einstein's realism appear as two quite different proposals for selecting the preferred determinate observable--either settled pragmatically by what we choose to observe, or fixed once and for all, as the Einsteinian realist would require, in which case the preferred observable is a 'beable' in Bell's sense, as in Bohm's interpretation (where the preferred observable is position in configuration space).
I consider to what extent the phenomenon of interference precludes the possibility of attributing simultaneously determinate values to noncommuting observables, and I show that, while all observables can in principle be taken as simultaneously determinate, it suffices to take a suitable privileged observable as determinate to solve the measurement problem.
I formulate the interpretation problem of quantum mechanics as the problem of identifying all possible maximal sublattices of quantum propositions that can be taken as simultaneously determinate, subject to certain constraints that allow the representation of quantum probabilities as measures over truth possibilities in the standard sense, and the representation of measurements in terms of the linear dynamics of the theory. The solution to this problem yields a modal interpretation that I show to be a generalized version of Bohm's hidden variable theory. I argue that unless we alter the dynamics of quantum mechanics, or accept a 'for all practical purposes' solution, this generalized Bohmian mechanics is the unique solution to the problem of interpretation.
The aim of cognitive neuropsychology is to articulate the functional architecture underlying normal cognition, on the basis of cognitive performance data involving brain-damaged subjects. Glymour (forthcoming) formulates a discovery problem for cognitive neuropsychology, in the sense of formal learning theory, concerning the existence of a reliable methodology, and argues that the problem is insoluble: granted certain apparently plausible assumptions about the form of neuropsychological theories and the nature of the available evidence, a reliable methodology does not exist! I argue for a reformulation of the discovery problem in terms of an alternative characterization of relevant evidence in neuropsychology.
The aim of cognitive neuropsychology is to articulate the functional architecture underlying normal cognition, on the basis of cognitive performance data involving brain-damaged subjects. Throughout the history of the subject, questions have been raised as to whether the methods of neuropsychology are adequate to its goals. The question has been reopened by Glymour, who formulates a discovery problem for cognitive neuropsychology, in the sense of formal learning theory, concerning the existence of a reliable methodology. It appears that the discovery problem may be insoluble in principle! I propose a modified formulation of Glymour's discovery problem and argue that a sceptical conclusion about the possibility of cognitive neuropsychology as an empirical science is not warranted.
The properties of classical and quantum systems are characterized by different algebraic structures. We know that the properties of a quantum mechanical system form a partial Boolean algebra not embeddable into a Boolean algebra, and so cannot all be co-determinate. We also know that maximal Boolean subalgebras of properties can be (separately) co-determinate. Are there larger subsets of properties that can be co-determinate without contradiction? Following an analysis of Bohr's response to the Einstein-Podolsky-Rosen objection to the complementarity interpretation of quantum mechanics, a principled argument is developed justifying the selection of particular subsets of properties as co-determinate for a quantum system in particular physical contexts. These subsets are generated by sets of maximal Boolean subalgebras, defined in each case by the relation between the quantum state and a measurement (possibly, but not necessarily, the measurement in terms of which we seek to establish whether or not a particular property of the system in question obtains). If we are required to interpret quantum mechanics in this way, then predication for quantum systems is quite unlike the corresponding notion for classical systems.
Philosophical debate on the measurement problem of quantum mechanics has, for the most part, been confined to the non-relativistic version of the theory. Quantum field theory, the result of making quantum mechanics relativistic, yields a conceptual framework capable of dealing with the creation and annihilation of an indefinite number of particles in interaction with fields, i.e. quantum systems with an infinite number of degrees of freedom. I show that a solution to the standard measurement problem is available if we exploit the properties of the infinite quantum models available in this broader conceptual framework.
Friedman and Putnam have argued (Friedman and Putnam 1978) that the quantum logical interpretation of quantum mechanics gives us an explanation of interference that the Copenhagen interpretation cannot supply without invoking an additional ad hoc principle, the projection postulate. I show that it is possible to define a notion of equivalence of experimental arrangements relative to a pure state φ, or (correspondingly) equivalence of Boolean subalgebras in the partial Boolean algebra of projection operators of a system, which plays a role in the Copenhagen explanation of interference analogous to the role played by the material equivalence, given φ, of certain propositions in the Friedman-Putnam quantum logical analysis. I also show that the quantum logical interpretation and the Copenhagen interpretation are equally capable of avoiding the paradoxical conclusion of the Einstein-Podolsky-Rosen argument (Einstein, Podolsky, and Rosen 1935). Thus, neither interference phenomena nor the correlations between separated systems provide a test case for distinguishing between the relative acceptability of the Copenhagen interpretation and the quantum logical interpretation as explanations of quantum effects.
Bell's proof purports to show that any hidden variable theory satisfying a physically reasonable locality condition is characterized by an inequality which is inconsistent with the quantum statistics. It is shown that Bell's inequality actually characterizes a feature of hidden variable theories which is much weaker than locality in the physically motivated sense. We consider an example of a non-local hidden variable theory which reproduces the quantum statistics (and hence violates Bell's inequality). A simple extension of the theory, which preserves the non-local character, alters the statistics in such a way that Bell's inequality is satisfied.
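That Bell's inequality bounds local hidden-variable theories can be checked directly: any local stochastic model is a mixture of deterministic ones, so it suffices to enumerate the sixteen deterministic strategies for the CHSH game and confirm that none exceeds |S| = 2. A brief sketch:

```python
import itertools

# CHSH expression: S = E(a0,b0) + E(a0,b1) + E(a1,b0) - E(a1,b1).
# In a local deterministic hidden-variable model each outcome (+1 or -1)
# depends only on the local measurement setting, so each side has
# 2^2 = 4 possible strategies, giving 16 joint strategies in all.
best = 0.0
for A in itertools.product([-1, 1], repeat=2):      # Alice's outputs for her two settings
    for B in itertools.product([-1, 1], repeat=2):  # Bob's outputs for his two settings
        S = A[0] * B[0] + A[0] * B[1] + A[1] * B[0] - A[1] * B[1]
        best = max(best, abs(S))
# best == 2: the Bell/CHSH bound, strictly below the quantum value 2*sqrt(2)
```

The enumeration illustrates only the standard locality bound; the abstract's point is that violating this bound isolates a condition weaker than locality itself, which the code does not (and cannot) exhibit.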