In this new edition, Arthur Fine looks at Einstein's philosophy of science and develops his own views on realism. A new Afterword discusses the reaction to Fine's own theory. "What really led Einstein . . . to renounce the new quantum order? For those interested in this question, this book is compulsory reading."--Harvey R. Brown, American Journal of Physics "Fine has successfully combined a historical account of Einstein's philosophical views on quantum mechanics and a discussion of some of the philosophical problems associated with the interpretation of quantum theory with a discussion of some of the contemporary questions concerning realism and antirealism. . . . Clear, thoughtful, [and] well-written."--Allan Franklin, Annals of Science "Attempts, from Einstein's published works and unpublished correspondence, to piece together a coherent picture of 'Einstein realism.' Especially illuminating are the letters between Einstein and fellow realist Schrödinger, as the latter was composing his famous 'Schrödinger-Cat' paper."--Nick Herbert, New Scientist "Beautifully clear. . . . Fine's analysis is penetrating, his own results original and important. . . . The book is a splendid combination of new ways to think about quantum mechanics, about realism, and about Einstein's views of both."--Nancy Cartwright, Isis.
The realist programme has degenerated by now to the point where it is quite beyond salvage. A token of this degeneration is that there are altogether too many realisms. It is as though, by splitting into a confusing array of types and kinds, realism has hoped that some one variety might yet escape extinction. I shall survey the debate, and some of these realisms, below. Here I would just point out the obvious: that insofar as the successes of science mount while realism continues to decline, we must conclude that scientific success lends no support to realism. Since it is unlikely to support anti-realism, we have some reason to suspect that the philosophical debate over realism does not concern issues that can be settled by developments in the sciences, no matter how successful science may be. Further, since that success grounds a culture of acceptance for science and its entities, we have reason to believe that the existence of those entities is also not actually the issue that concerns realism. A fortiori, it is not the issue that concerns anti-realism either; nor, I might add, is anti-realism the winner in the philosophical debate that realism has lost.
Faced with realist-resistant sciences and the no-nonsense attitude of the times, realism has moved away from the rather grandiose program that had traditionally been characteristic of its school. The objective of the shift seems to be to protect some doctrine still worthy of the "realist" name. The strategy is to relocate the school to where conditions seem optimal for its defense, and then to insinuate that the case for such a "piecemeal realism" could be made elsewhere too, were there but world enough and time. The burden of this paper is to examine this piecemeal approach and to show why, despite the relocation, it cannot escape the difficulties of its grander cousins. For that purpose I begin with some brief historical reminders, and with a quick review of the state of the argument before realism went to pieces. This will help us see what has been abandoned in realism's flight, and what baggage still remains.
This paper examines the efficiency problem involved in experimental tests of so-called “local” hidden variables. It separates the phenomenological locality at issue in the Bell case from Einstein's different conception of locality, and shows how phenomenological locality also differs from the factorizability needed to derive the Bell inequalities in the stochastic case. It then pursues the question of whether factorizable, local models (or, equivalently, deterministic ones) exist for the experiments designed to test the Bell inequalities, thus rendering the experimental argument against them incomplete. This leads to an investigation of the so-called “prism models” and to new inequalities for a significant class of such models, inequalities that are testable even at the low efficiencies of the photon correlation experiments.
In the concluding chapter of Exceeding our Grasp Kyle Stanford outlines a positive response to the central issue raised brilliantly by his book, the problem of unconceived alternatives. This response, called "epistemic instrumentalism", relies on a distinction between instrumental and literal belief. We examine this distinction and with it the viability of Stanford's instrumentalism, which may well be another case of exceeding our grasp.
In the May 15, 1935 issue of Physical Review Albert Einstein co-authored a paper with his two postdoctoral research associates at the Institute for Advanced Study, Boris Podolsky and Nathan Rosen. The article was entitled “Can Quantum Mechanical Description of Physical Reality Be Considered Complete?” (Einstein et al. 1935). Generally referred to as “EPR”, this paper quickly became a centerpiece in the debate over the interpretation of the quantum theory, a debate that continues today. The paper features a striking case where two quantum systems interact in such a way as to link both their spatial coordinates in a certain direction and also their linear momenta (in the same direction). As a result of this “entanglement”, determining either position or momentum for one system would fix (respectively) the position or the momentum of the other. EPR use this case to argue that one cannot maintain both an intuitive condition of local action and the completeness of the quantum description by means of the wave function. This entry describes the argument of that 1935 paper, considers several different versions and reactions, and explores the ongoing significance of the issues they raise.
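For readers who want the technical core, the entangled state behind the EPR example is commonly presented as the (improper) wave function

```latex
\Psi(x_1, x_2) \;=\; \int_{-\infty}^{\infty} e^{(2\pi i/h)\,(x_1 - x_2 + x_0)\,p}\, dp ,
```

a simultaneous eigenstate of the relative position $x_1 - x_2$ (with value $x_0$) and of the total momentum $p_1 + p_2$ (with value $0$). Measuring the position of system 1 therefore fixes the position of system 2, and measuring the momentum of system 1 fixes the momentum of system 2, which is the linkage the entry describes. This is the standard textbook reconstruction of the 1935 example; notational conventions vary across presentations.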
This paper addresses the “inefficiency loophole” in the Bell theorem. We examine factorizable stochastic models for the Bell inequalities, where we allow the detection efficiency to depend both on the “hidden” state of the measured system and also its passage through an analyzer. We show that, nevertheless, if the efficiency functions are symmetric between the two wings of the experiment, one can dispense with supplementary assumptions and derive new inequalities that enable the models to be tested even for highly inefficient experiments.
This paper constructs two classes of models for the quantum correlation experiments used to test the Bell-type inequalities, synchronization models and prism models. Both classes employ deterministic hidden variables, satisfy the causal requirements of physical locality, and yield precisely the quantum mechanical statistics. In the synchronization models, the joint probabilities, for each emission, do not factor in the manner of stochastic independence, showing that such factorizability is not required for locality. In the prism models the observables are not random variables over a common space; hence these models throw into question the entire random variables idiom of the literature. Both classes of models appear to be testable.
"But science in the making, science as an end to be pursued, is as subjective and psychologically conditioned as any other branch of human endeavor-- so much so that the question, What is the purpose and meaning of science? receives quite different answers at different times and from different sorts of people" (Einstein 1934, p. 112).
This paper develops lines of criticism directed at two currently popular versions of anti-realism: the Putnam-Rorty-Kuhn version that is centered on an acceptance theory of truth, and the van Fraassen version that is centered on empiricist strictures over warranted beliefs. The paper continues by elaborating and extending a stance, called "the natural ontological attitude", that is neither realist nor anti-realist.
In the contemporary discussion of hidden variable interpretations of quantum mechanics, much attention has been paid to the “no hidden variable” proof contained in an important paper of Kochen and Specker. It is a little noticed fact that Bell published a proof of the same result the preceding year, in his well-known 1966 article, where it is modestly described as a corollary to Gleason's theorem. We want to bring out the great simplicity of Bell's formulation of this result and to show how it can be extended in certain respects.
Two principles of locality used in discussions about quantum mechanics are distinguished. The intuitive no-action-at-a-distance requirement is called physical locality. There is also a mathematical requirement of a kind of factorizability which is referred to as "locality". It is argued in this paper that factorizability is not necessary for physical locality. Ways are suggested of producing models, concerned with correlations between the behavior of pairs of particles, that are physically local although not factorizable. These models can account for all the quantum mechanical single and joint probabilities.
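To fix ideas, the factorizability condition at issue can be stated in the notation standard in the Bell literature (the symbols here are generic, not necessarily those of the paper). For hidden state $\lambda$, analyzer settings $i$ and $j$ in the two wings, and outcomes $a$ and $b$, factorizability requires that the joint probability be the product of the marginals:

```latex
P_{\lambda}(a, b \mid i, j) \;=\; P_{\lambda}(a \mid i)\; P_{\lambda}(b \mid j) .
```

The point of the distinction is that a model may violate this product condition for each $\lambda$ while still involving no physical action at a distance between the wings.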
What we represent to ourselves behind the appearances exists only in our understanding . . . [having] only the value of memoria technica or formula whose form, because it is arbitrary and irrelevant, varies . . . with the standpoint of our culture.
Two things about Hilary Putnam have not changed throughout his career: some (including Putnam himself) have regarded him as a “realist” and some have seen him as a philosopher who changed his positions (certainly with respect to realism) almost continually. Apparently, what realism meant to him in the 1960s, in the late seventies and eighties, and in the nineties, respectively, are quite different things. Putnam indicates this by changing prefixes: scientific, metaphysical, internal, pragmatic, commonsense, but always realism. Encouraged by Putnam’s own attempts to distinguish his views from one time to another, his work is often regarded as split between an early period of “metaphysical realism” (his characterization) and a later and still continuing period of “internal realism”. Late Putnam is understood to be a view that insists on the primacy of our practices, while the early period is taken to be a view from outside these, a “God’s Eye view”. As Putnam himself stresses (1992b), this way of dividing his work obscures continuities, the most important of which is a continuing attempt to understand what is involved in judging practices of inquiry, like science, as being objectively correct. Thus Putnam’s early and his current work appear to have more in common than the division between “early” and “late” suggests. In fact, Putnam’s earlier writings owe much of their critical force to his adopting the pragmatic perspective of an open-minded participant in practices of empirical inquiry, a stance not explicitly articulated in these writings but rather taken simply as a matter of course. Thus insofar as Putnam’s early writings defend a form of representational realism, they can be regarded as attempts to articulate a realist position at work inside our ordinary practices of making empirical judgments.
For this reason, we begin our review of Putnam’s realisms by extracting from the early writings a core of principles that carries over into his current work but underwent significantly different interpretations over time.
The aim of this paper is to present and discuss a probabilistic framework that is adequate for the formulation of quantum theory and faithful to its applications. Contrary to claims, which are examined and rebutted, that quantum theory employs a nonclassical probability theory based on a nonclassical "logic," the probabilistic framework set out here is entirely classical and the "logic" used is Boolean. The framework consists of a set of states and a set of quantities that are interrelated in a specified manner. Each state induces a classical probability space on the values of each quantity. The quantities, so considered, become statistical variables (not random variables). Such variables need not have a "joint distribution." For the quantum theoretic application, there is a uniform procedure that defines and determines the existence of such "joint distributions" for statistical variables. A general rule is provided and it is shown to lead to the usual compatibility-commutativity requirements of quantum theory. The paper concludes with a brief discussion of interference and the misunderstandings that are involved in the false move from interference to nonclassical probability.
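The compatibility-commutativity point can be illustrated in textbook notation (generic symbols, not necessarily the paper's own). For self-adjoint observables $A$ and $B$ with spectral projections $E^{A}$ and $E^{B}$, a state $\psi$ determines candidate joint probabilities by

```latex
\mu_{\psi}(\Delta \times \Gamma) \;=\; \bigl\langle \psi,\; E^{A}(\Delta)\, E^{B}(\Gamma)\, \psi \bigr\rangle ,
```

where $\Delta$ and $\Gamma$ range over value sets of $A$ and $B$. When $[A, B] = 0$ this defines a genuine classical (Kolmogorovian) joint distribution; when the observables fail to commute, the prescription fails to yield a well-defined probability measure, so $A$ and $B$ behave as statistical variables without a common probability space, in line with the framework sketched above.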
(Draft copy published as “Science Made Up: Constructivist Sociology of Scientific Knowledge.” In P. Galison and D. Stump (eds.) The Disunity of Science: Boundaries, Contexts, and Power. Stanford: Stanford University Press, 1996, pp. 231-54.).
A recent analysis by de Barros and Suppes of experimentally realizable GHZ correlations supports the conclusion that these correlations cannot be explained by introducing local hidden variables. We show, nevertheless, that their analysis does not exclude local hidden variable models in which the inefficiency in the experiment is an effect not only of random errors in the detector equipment, but is also the manifestation of a pre-set, hidden property of the particles ("prism models"). Indeed, we present an explicit prism model for the GHZ scenario; that is, a local hidden variable model entirely compatible with recent GHZ experiments.