Many years ago, when Michael was lecturing in Oxford on the Philosophy of Physics and was trying to explain the logic of Aspect's experiments in Paris, he turned to me to expound the correct doctrine of counter-factual truth. I was flummoxed. It had been much discussed in late- and post-mediaeval times, especially in the Iberian peninsula, and had recently enjoyed a revival in the Eastern United States. But Middle Knowledge, as the Schoolmen called it, was beyond my comprehension, and I could only exemplify once again the truism that philosophers were better at raising problems than resolving them.
The views of Redhead () are defended against the argument by Panu Raatikainen (). The importance of informal rigour is canvassed, and the argument for the a priori nature of induction is explained. The significance of Gödel's theorem is again rehearsed.
This article focuses on Charles Taylor's and William Connolly's attempts to fashion alternative forms of secular public reasoning to those of liberals like Rawls and Galston. I provide a weak defense of Taylor against both Connolly and many of Taylor's liberal secular foes. Despite its noted shortcomings, which Connolly can help to address, Taylor's model does provide a more adequate basis for thinking through a public morality appropriate to the times, because it takes seriously the hold certain values have on non-secular individuals, and thus is very much attuned to the realities of ethical deliberation in early 21st-century western democracies. Key Words: democracy; liberalism; pluralism; secularism; theism.
The bootstrap approach to understanding the elementary particles in hadronic physics was very popular in the 1960s as an alternative to quantum field theory. This episode is subjected to historical, methodological and philosophical analysis designed to complement the extensive work of Jim Cushing in this field.
Granted that truth is valuable, we must recognize that certifiable truth is hard to come by, for example in the natural and social sciences. This paper examines the case of mathematics. As a result of the work of Gödel and Tarski we know that truth does not equate with proof. This has been used by Lucas and Penrose to argue that human minds can do things which digital computers cannot, viz. know the truth of unprovable arithmetical statements. The argument is given a simple formulation in the context of sorites (Robinson) arithmetic, avoiding the complexities of formulating the Gödel sentence. The pros and cons of the argument are considered in relation to the conception of mathematical truth.
In this article, I discuss Charles Taylor's reading of Nietzsche. Taylor argues that Nietzsche presents a challenge on the 'deepest level' because, on Taylor's reading, Nietzsche forces us to consider whether or not our 'continuing allegiance to standards of justice and benevolence' goes against our inner nature. I argue that this purported Nietzschean challenge is more self-revealing of Taylor than it is foreboding, as it brings to light the tension between the open and pluralistic content of Taylor's faith and the epistemological grounding of it, a tension which a more well-rounded appreciation of Nietzsche could help to alleviate. Key Words: Charles Taylor; genealogy; ontology; moral reasoning; Nietzsche; theism.
Darrin Belousek has argued that the indistinguishability of quantum particles is conventional “in the Duhemian–Einsteinian sense,” in part by critically examining prior arguments given by Redhead and Teller. Belousek's discussion provides a useful occasion to clarify some of those arguments, acknowledge respects in which they were misleading, and comment on how they can be strengthened. We also comment briefly on the relevant sense of “conventional.”
We introduce a simple model for so-called spin-echo experiments. We show that the model is a mixing system. On the basis of this model we study fine-grained entropy and coarse-grained entropy descriptions of these experiments. The coarse-grained description is shown to be unable to provide an explanation of the echo signals, as a result of the way in which it ignores dynamically generated correlations. This conclusion is extended to the general debate on the foundations of statistical mechanics. We emphasize the need for an appropriate mechanism to explain the gradual suppression over time of the correlations in a thermodynamic system. We argue that such a mechanism can only be provided by the interventionist approach, in which the interaction of the system with its environment is taken into account. Irreversible behavior is then seen to arise not as a result of limited measurement accuracy, but as a result of the fact that thermodynamic systems are limited systems which interact with their environment. A detailed discussion is given of recent objections to the interventionist approach in the literature.
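The fine-grained/coarse-grained contrast at issue here can be illustrated with a toy spin-echo simulation (a sketch in Python; the Gaussian spread of precession rates and all numerical values are illustrative assumptions, not the model of the paper). A signal that looks fully dephased is revived by a phase-reversal pulse, because the apparently equilibrated state still carries the dynamically generated correlations:

```python
import cmath
import random

random.seed(0)
N = 2000
# Each spin precesses at its own random rate (illustrative Gaussian spread)
rates = [random.gauss(0.0, 1.0) for _ in range(N)]

T = 5.0  # time at which the phase-reversal ("echo") pulse is applied

def phase(rate, t):
    """Accumulated phase of one spin; the pulse at T reverses its growth."""
    return rate * t if t <= T else rate * (2 * T - t)

def signal(t):
    """Magnitude of the net transverse magnetization, normalized to 1."""
    return abs(sum(cmath.exp(1j * phase(w, t)) for w in rates)) / N

print(signal(T))      # ~0: the spins look fully dephased ("equilibrated")
print(signal(2 * T))  # 1.0: the hidden correlations reassemble into an echo
```

A coarse-grained description that discards the individual phases at time T cannot predict the revival at 2T; only the fine-grained correlations carry that information.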
The book is drawn from the Tarner lectures, delivered in Cambridge in 1993. It is concerned with the ultimate nature of reality, and how this is revealed by modern physical theories such as relativity and quantum theory. The objectivity and rationality of science are defended against the views of relativists and social constructionists. It is claimed that modern physics gives us a tentative and fallible, but nevertheless rational, approach to the nature of physical reality. The role of subjectivity in science is examined in the fields of relativity theory, statistical mechanics and quantum theory, and recent claims of an essential role for human consciousness in physics are rejected. Prospects for a 'Theory of Everything' are considered, and the related question of how to assess scientific progress is carefully examined.
In this paper questions about vacuum fluctuations in local measurements, and the correlations between such fluctuations, are discussed. It is shown that maximal correlations always exist between suitably chosen local projection operators associated with spacelike separated regions of space-time, however far apart these regions may be. The connection of this result with the well-known Fredenhagen bound showing exponential decay of correlations with distance is explained, and the relevance of the discussion to the question “What do particle detectors detect?” is addressed.
The status of the vacuum in relativistic quantum field theory is examined. A sharp distinction arises between the global vacuum and the local vacuum. The concept of local number density is critically assessed. The global vacuum state implies fluctuations for all local observables. Correlations between such fluctuations in space-like separated regions of space-time are discussed and the existence of correlations which are maximal in a certain sense is remarked on, independently of how far apart those regions may be. The analogy with the mirror-image correlations in the singlet state of two spin-1/2 particles is explained. The connection between these maximal correlations and the well-known violation of the Bell inequality in the vacuum state is discussed, together with the way in which the existence of these correlations might be exploited in developing a vacuum version of the Einstein-Podolsky-Rosen argument. The recent relativistic formulation of the Einstein-Podolsky-Rosen argument by Ghirardi and Grassi is critically assessed with particular reference to the vacuum case.
An attempt is made to defend realism and the absence of space-like causation in quantum mechanics by invoking indeterminism and a new necessary condition for stochastic causality, which we term robustness. This condition is defended against recent critical attacks by Cartwright and Jones, and by Healey, and the violation of the robustness condition in Bell-type correlation experiments is shown to follow if an appropriate interpretation of the state vector is employed.
We extend the work of French and Redhead, further examining the relation of quantum statistics to the assumption that quantum entities have the sort of identity generally assumed for physical objects, more specifically an identity which makes them susceptible to being thought of as conceptually individuatable and labelable even though they cannot be experimentally distinguished. We also further examine the relation of such hypothesized identity of quantum entities to the Principle of the Identity of Indiscernibles. We conclude that although such an assumption of identity is consistent with the facts of quantum statistics, methodological considerations show that we should take quantum entities to be entirely unindividuatable, in the way suggested by a Fock space description.
We further develop a recent new proof (by Greenberger, Horne, and Zeilinger—GHZ) that local deterministic hidden-variable theories are inconsistent with certain strict correlations predicted by quantum mechanics. First, we generalize GHZ's proof so that it applies to factorable stochastic theories, theories in which apparatus hidden variables are causally relevant to measurement results, and theories in which the hidden variables evolve indeterministically prior to the particle-apparatus interactions. Then we adopt a more general measure-theoretic approach which requires that GHZ's argument be modified in order to produce a valid proof. Finally, we motivate our more general proof's assumptions in a somewhat different way from previous authors in order to strengthen the implications of our proof as much as possible. After developing GHZ's proof along these lines, we then consider the analogue, for our proof, of Bohr's reply to the EPR argument, and conclude (pace GHZ) that in at least one respect (viz. that of most concern to Bohr) the proof is no more powerful than Bell's. Nevertheless, we point out some new advantages of our proof over Bell's, and over other algebraic proofs of nonlocality. And we conclude by giving a modified version of our proof that, like Bell's, does not rely on experimentally unrealizable strict correlations, but still leads to a testable “quasi-algebraic” locality inequality. “... to admit things not visible to the gross creatures that we are is, in my opinion, to show a decent humility, and not just a lamentable addiction to metaphysics.” J. S. Bell.
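The core of the GHZ-type argument against local deterministic hidden variables can be checked by brute force (a sketch in Python; the sign convention below is the one appropriate to the GHZ state (|000⟩ + |111⟩)/√2, and other conventions yield the same contradiction). A local deterministic theory must assign each local spin observable a definite value ±1, but no assignment reproduces the four strict quantum predictions at once:

```python
from itertools import product

# Quantum predictions for the GHZ state (|000> + |111>)/sqrt(2):
# the product of the three local outcomes is -1 for X(x)Y(x)Y, Y(x)X(x)Y,
# Y(x)Y(x)X, and +1 for X(x)X(x)X. A local deterministic hidden-variable
# theory must assign fixed values +/-1 satisfying all four constraints.
consistent = [
    (x1, x2, x3, y1, y2, y3)
    for x1, x2, x3, y1, y2, y3 in product((1, -1), repeat=6)
    if x1 * y2 * y3 == -1
    and y1 * x2 * y3 == -1
    and y1 * y2 * x3 == -1
    and x1 * x2 * x3 == +1
]
print(len(consistent))  # 0: no assignment exists, with no inequality needed
```

The contradiction is visible by hand: multiplying the first three constraints gives x1·x2·x3 = (-1)³ = -1 (each y appears squared), flatly contradicting the fourth.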
The practice of describing multiparticle quantum systems in terms of labeled particles indicates that we think of quantum entities as individuatable. The labels, together with particle indistinguishability, create the need for symmetrization or antisymmetrization (or, in principle, higher-order symmetries), which in turn results in “surplus formal structure” in the formalism, formal structure which corresponds to nothing in the real world. We argue that these facts show quanta to be unindividuatable entities, things in principle incapable of supporting labels, and so things which support no factual difference if two of them are thought of as being switched. When thinking of the metaphysics of quanta, we should eschew the misleading labels of the tensor product Hilbert space formalism and prefer the ontologically more faithful description of the Fock space formalism. This conception eliminates puzzles about the quantum statistics of bosons.
Stapp claims to give a proof of the existence of nonlocal influences acting on correlated spin-1/2 particles in the singlet state which does not require any particular interpretation of quantum mechanics (QM). (Except Stapp holds that the proof fails under a many-worlds interpretation of QM—a claim we analyse in 1.2.) Recently, in responding to Redhead's (, pp. 90-6) criticism that the Stapp 1 proof fails under an indeterministic interpretation of QM, Stapp (henceforth Stapp 2) has revised the logical structure of his proof, including its crucial locality assumption. Our main aim is to show that this revision is a step in the wrong direction, because it faces two difficulties which undermine the resulting proof's significance (3.1) and validity (3.2). We also clarify and extend the Stapp 1 proof (1.1) with the aid of Lewis' analysis of counterfactuals (1.2) and causal dependence (2.2 and 2.3). In so doing, we are able to identify two new defects in the Stapp 1 proof (1.3 and 2.1), in addition to corroborating Redhead's criticism (2.2). The additional assumptions which save the Stapp 1 proof's validity are also detailed (2.3), and some new difficulties for the determinist are pointed out by exploiting a slightly extended version of the proof (2.4). In providing this full analysis of the Stapp 1 proof, we also construct the necessary framework within which to provide a critique of Stapp 2's proof (3). *Portions of this paper were presented by R. K. Clifton to the 1988 British Society for the Philosophy of Science Conference at the University of Southampton. R. K. Clifton wishes to thank the Natural Sciences and Engineering Research Council of Canada, the Royal Commission for the Exhibition of 1851, and the Governing Body of Peterhouse at Cambridge University for support during this work.
It is only when mixing two or more pure substances along a reversible path that the entropy of the mixing can be made physically manifest. It is not, in this case, a mere mathematical artifact. This mixing requires a process of successive stages. In any finite number of stages, the external manifestation of the entropy change, as a definite and measurable quantity of heat, is a fully continuous function of the relevant variables. It is only at an infinite and unattainable limit that a non-uniform convergence occurs. And this occurs when considered in terms of the number of stages together with a distinguishability parameter appropriate to the particular device which is used to achieve reversibility. These considerations, which are of technological interest to chemical engineers, resolve a paradox derived in chemical theory called Gibbs' Paradox.
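For reference, the ideal entropy of mixing that the paradox concerns can be computed directly (a sketch in Python; the formula ΔS = -nR Σ xᵢ ln xᵢ is the standard ideal-gas result, and the equimolar two-gas case is an illustrative input):

```python
import math

R = 8.314  # gas constant, J/(mol K)

def mixing_entropy(mole_fractions, n_total=1.0):
    """Ideal entropy of mixing, Delta_S = -n R sum(x_i ln x_i), for distinct gases."""
    return -n_total * R * sum(x * math.log(x) for x in mole_fractions if x > 0)

# Mixing equal amounts of two *distinguishable* gases: Delta_S = R ln 2 per mole
print(mixing_entropy([0.5, 0.5]))  # approx. 5.76 J/(mol K)

# For *identical* gases Delta_S = 0, however alike the two samples are;
# the apparent jump as distinguishability vanishes is Gibbs' Paradox.
```

The paradox is that this formula is independent of how similar the two species are, yet drops abruptly to zero when they become identical; the paper's resolution locates the discontinuity in a non-uniform limit over the number of stages and the distinguishability parameter.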
Department of History and Philosophy of Science, University of Cambridge, Free School Lane, Cambridge CB2 3RH. This paper is concerned with the question of whether atomic particles of the same species, i.e. with the same intrinsic state-independent properties of mass, spin, electric charge, etc., violate the Leibnizian Principle of the Identity of Indiscernibles, in the sense that, while there is more than one of them, their state-dependent properties may also all be the same. The answer depends on what exactly the state-dependent properties of atomic particles are taken to be. On the plausible interpretation that these should comprise all monadic and relational properties that can be expressed in terms of physical magnitudes associated with self-adjoint operators that can be defined for the individual particles, the weakest form of the Principle is shown to be violated for bosons, fermions and higher-order paraparticles, treated in first quantization. *Some of the arguments in this paper appeared in a thesis submitted by one of us (S.F.) in partial fulfilment of the requirements for the PhD degree of the University of London, in 1984, entitled 'Identity and Individuality in Classical and Quantum Physics'.
Fine has recently proved the surprising result that satisfaction of the Bell inequality in a Clauser-Horne experiment implies the existence of joint probabilities for pairs of noncommuting observables in the experiment. In this paper we show that if probabilities are interpreted in the von Mises-Church sense of relative frequencies on random sequences, a proof of the Bell inequality is nonetheless possible in which such joint probabilities are assumed not to exist. We also argue that Fine's theorem and related results do not impugn the common view that local realists are committed to the Bell inequality.
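For concreteness, the quantum violation of the Bell (CHSH) inequality at issue can be computed from the singlet-state correlation function (a sketch in Python; E(a, b) = -cos(a - b) is the standard quantum prediction for spin-1/2 singlet measurements, and the four angles are the usual settings that maximize the CHSH combination):

```python
import math

def E(a, b):
    """Quantum correlation of singlet-state spin outcomes at analyzer angles a, b."""
    return -math.cos(a - b)

# Standard CHSH settings (in radians)
a1, a2 = 0.0, math.pi / 2              # Alice's two settings
b1, b2 = math.pi / 4, 3 * math.pi / 4  # Bob's two settings

# CHSH combination; local realism bounds |S| by 2
S = E(a1, b1) + E(a2, b1) + E(a2, b2) - E(a1, b2)
print(abs(S))  # 2*sqrt(2), about 2.828, exceeding the local-realist bound of 2
```

Each of the four correlators lies in [-1, 1], yet their quantum combination reaches 2√2, which is what any local realist account committed to the inequality must fail to reproduce.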
Aiming to unravel the mystery of quantum mechanics, this book is concerned with questions about action-at-a-distance, holism, and whether quantum mechanics gives a complete account of microphysical reality. With rigorous arguments and clear thinking, the author provides an introduction to the philosophy of physics.
A new proof of the impossibility of reconciling realism and locality in quantum mechanics is given. Unlike proofs based on Bell's inequality, the present work makes minimal and transparent use of probability theory and proceeds by demonstrating a Kochen-Specker type of paradox based on the value assignments to the spin components of two spatially separated spin-1 systems in the singlet state of their total spin. An essential part of the argument is to distinguish carefully two commonly confused types of contextuality; we call them ontological and environmental contextuality. These in turn are associated with two quite distinct senses of nonlocality. We indicate the relevance of our treatment to other related discussions in recent literature on the philosophy of quantum mechanics.