The symmetrization postulates of quantum mechanics (symmetry for bosons, antisymmetry for fermions) are usually taken to entail that quantum particles of the same kind (e.g., electrons) are all in exactly the same state and therefore indistinguishable in the strongest possible sense. These symmetrization postulates possess a general validity that survives the classical limit, and the conclusion seems therefore unavoidable that even classical particles of the same kind must all be in the same state—in clear conflict with what we know about classical particles. In this article we analyze the origin of this paradox. We shall argue that in the classical limit classical particles emerge as new entities that do not correspond to the “particle indices” defined in quantum mechanics. Put differently, we show that the quantum mechanical symmetrization postulates do not pertain to particles as we know them from classical physics, but rather to indices that have a merely formal significance. This conclusion raises the question of whether many discussions in the literature about the status of identical quantum particles have not been misguided.
Pekka Lahti is a prominent exponent of the renaissance of foundational studies in quantum mechanics that has taken place during the last few decades. Among other things, he and coworkers have drawn renewed attention to, and have analyzed with fresh mathematical rigor, the threat of inconsistency at the basis of quantum theory: ordinary measurement interactions, described within the mathematical formalism by Schrödinger-type equations of motion, seem to be unable to lead to the occurrence of definite measurement outcomes, whereas the same formalism is interpreted in terms of probabilities of precisely such definite outcomes. Of course, it is essential here to be explicit about how definite measurement results (or definite properties in general) should be represented in the formalism. To this end Lahti et al. have introduced their objectification requirement, which says that a system can be taken to possess a definite property if it is certain (in the sense of probability 1) that this property will be found upon measurement. As they have gone on to demonstrate, this requirement entails that in general definite outcomes cannot arise in unitary measuring processes. In this paper we investigate whether it is possible to escape from this deadlock. As we shall argue, there is a way out in which the objectification requirement is fully maintained. The key idea is to adapt the notion of objectivity itself, by introducing relational or perspectival properties. It seems that such a “relational perspective” offers prospects of overcoming some of the long-standing problems in the interpretation of quantum mechanics.
We take another look at Reichenbach’s 1920 conversion to conventionalism, with a special eye to the background of his ‘conventionality of distant simultaneity’ thesis. We argue that elements of Reichenbach’s earlier neo-Kantianism can still be discerned in his later work and, related to this, that his conventionalism should be seen as situated at the level of global theory choice. This is contrary to many of Reichenbach’s own statements, in which he declares that his conventionalism is a consequence of the arbitrariness of coordinative definitions.
An often repeated account of the genesis of special relativity tells us that relativity theory was to a considerable extent the fruit of an operationalist philosophy of science. Indeed, Einstein’s 1905 paper stresses the importance of rods and clocks for giving concrete physical content to spatial and temporal notions. I argue, however, that it would be a mistake to read too much into this. Einstein’s operationalist remarks should be seen as serving rhetorical purposes rather than as attempts to promulgate a particular philosophical position --- in fact, Einstein never came close to operationalism in any of his philosophical writings. By focussing on what could actually be measured with rods and clocks Einstein cast doubt on the empirical status of a number of pre-relativistic concepts, with the intention of persuading his readers that the applicability of these concepts was not obvious. This rhetorical manoeuvre has not always been rightly appreciated in the philosophy of physics. Thus, the influence of operationalist misinterpretations, according to which associated operations strictly define what a concept means, can still be felt in present-day discussions about the conventionality of simultaneity. The standard story continues by pointing out that Minkowski in 1908 supplanted Einstein’s approach with a realist spacetime account that has no room for a foundational role of rods and clocks: relativity theory became a description of a four-dimensional ‘absolute world’. As it turns out, however, it is not at all clear that Minkowski was proposing a substantivalist position with respect to spacetime. On the contrary, it seems that from a philosophical point of view Minkowski’s general position was not very unlike the one in the back of Einstein’s mind. However, in Minkowski’s formulation of special relativity it becomes more explicit that the content of spatiotemporal concepts relates to considerations about the form of physical laws.
If accepted, this position has important consequences for the discussion about the conventionality of simultaneity.
Saunders has recently claimed that “identical quantum particles” with an anti-symmetric state (fermions) are weakly discernible objects, just like irreflexively related ordinary objects in situations with perfect symmetry (Black’s spheres, for example). Weakly discernible objects have all their qualitative properties in common but nevertheless differ from each other by virtue of (a generalized version of) Leibniz’s principle, since they stand in relations an entity cannot have to itself. This notion of weak discernibility has been criticized as question begging, but we defend and accept it for classical cases like Black’s spheres. We argue, however, that the quantum mechanical case is different. Here the application of the notion of weak discernibility indeed is question begging and in conflict with standard interpretational ideas. We conclude that the introduction of the conceptual resource of weak discernibility does not change the interpretational status quo in quantum mechanics.
Modal interpretations have the ambition to construe quantum mechanics as an objective, man-independent description of physical reality. Their second leading idea is probabilism: quantum mechanics does not completely fix physical reality but yields probabilities. In working out these ideas an important motif is to stay close to the standard formalism of quantum mechanics and to refrain from introducing new structure by hand. In this paper we explain how this programme can be made concrete. In particular, we show that the Born probability rule, and sets of definite-valued observables to which the Born probabilities pertain, can be uniquely defined from the quantum state and Hilbert space structure. We discuss the status of probability in modal interpretations, and to this end we make a comparison with many-worlds alternatives. An overall point that we stress is that the modal ideas define a general framework and research programme rather than one definite and finished interpretation.
According to the Doomsday Argument we have to rethink the probabilities we assign to a soon or not so soon extinction of mankind when we realize that we are living now, rather early in the history of mankind. Sleeping Beauty finds herself in a similar predicament: on learning the date of her first awakening, she is asked to re-evaluate the probabilities of her two possible future scenarios. In connection with Doom, I argue that it is wrong to assume that our ordinary probability judgements do not already reflect our place in history: we justify the predictive use we make of the probabilities yielded by science (or other sources of information) by our knowledge of the fact that we live now, a certain time before the possible occurrence of the events the probabilities refer to. Our degrees of belief should change drastically when we forget the date—importantly, this follows without invoking the “Self Indication Assumption”. Subsequent conditionalization on information about which year it is cancels this probability shift again. The Doomsday Argument is about such probability shifts, but tells us nothing about the concrete values of the probabilities—for these, experience provides the only basis. Essentially the same analysis applies to the Sleeping Beauty problem. I argue that Sleeping Beauty “thirders” should be committed to thinking that the Doomsday Argument is ineffective; whereas “halfers” should agree that doom is imminent—but they are wrong.
The photon box thought experiment can be considered a forerunner of the EPR-experiment: by performing suitable measurements on the box it is possible to ``prepare'' the photon, long after it has escaped, in either of two complementary states. Consistency requires that the corresponding box measurements be complementary as well. At first sight it seems, however, that these measurements can be jointly performed with arbitrary precision: they pertain to different systems (the center of mass of the box and an internal clock, respectively). But this is deceptive. As we show by explicit calculation, although the relevant quantities are simultaneously measurable, they develop non-vanishing commutators when calculated back to the time of escape of the photon. This justifies Bohr's qualitative arguments in a precise way; and it illustrates how the details of the dynamics conspire to guarantee the requirements of complementarity. In addition, our calculations exhibit a ``fine structure'' in the distribution of the uncertainties over the complementary quantities: depending on when the box measurement is performed, the resulting quantum description of the photon differs. This brings us close to the argumentation of the later EPR thought experiment.
In his general theory of relativity (GR) Einstein sought to generalize the special-relativistic equivalence of inertial frames to a principle according to which all frames of reference are equivalent. He claimed to have achieved this aim through the general covariance of the equations of GR. There is broad consensus among philosophers of relativity that Einstein was mistaken in this. That equations can be made to look the same in different frames certainly does not imply in general that such frames are physically equivalent. We shall argue, however, that Einstein's position is tenable. The equivalence of arbitrary frames in GR should not be equated with relativity of arbitrary motion, though. There certainly are observable differences between reference frames in GR (differences in the way particles move and fields evolve). The core of our defense of Einstein's position will be to argue that such differences should be seen as fact-like rather than law-like in GR. By contrast, in classical mechanics and in special relativity (SR) the differences between inertial systems and accelerated systems have a law-like status. The fact-like character of the differences between frames in GR justifies regarding them as equivalent in the same sense as inertial frames in SR.
This book contains selected papers from the First International Conference on the Ontology of Spacetime. Its fourteen chapters address two main questions: first, what is the current status of the substantivalism/relationalism debate, and second, what about the prospects of presentism and becoming within present-day physics and its philosophy? The overall tenor of the four chapters of the book’s first part is that the prospects of spacetime substantivalism are bleak, although different possible positions remain with respect to the ontological status of spacetime. Part II and Part III of the book are devoted to presentism, eternalism, and becoming, from two different perspectives. In the six chapters of Part II it is argued, in different ways, that relativity theory does not have essential consequences for these issues. It certainly is true that the structure of time is different, according to relativity theory, from the one in classical theory. But that does not mean that a decision is forced between presentism and eternalism, or that becoming has proved to be an impossible concept. It may even be asked whether presentism and eternalism really offer different ontological perspectives at all. The writers of the last four chapters, in Part III, disagree. They argue that relativity theory is incompatible with becoming and presentism. Several of them come up with proposals to go beyond relativity, in order to restore the prospects of presentism.
It is a central aspect of our ordinary concept of time that history unfolds and events come into being. It is only natural to take this seriously. However, it is notoriously difficult to explain further what this `becoming' consists in, or even to show that the notion is consistent at all. In this article I first argue that the idea of a global temporal ordering, involving a succession of cosmic nows, is not indispensable for our concept of time. Our experience does not support the existence of global simultaneity, and arguments from modern physics further support the conclusion that time should not be seen as a succession of cosmic nows. Accordingly, I propose that if we want to make sense of becoming we should attempt to interpret it as something purely local. Second, I address the question of what this local becoming consists in. I maintain that processes of becoming are nothing but the successive happening of events, and that this happening of events consists entirely in the occurring of these events at their own spacetime locations. This leads to a consistent view of becoming, which is applicable even to rather pathological spacetimes.
Jim Cushing emphasized that physical theory should tell us an intelligible and objective story about the world, and concluded that the Bohm theory is to be preferred over the Copenhagen interpretation. We argue here, however, that the Bohm theory is only one member of a wider class of interpretations that can be said to fulfill Cushing’s desiderata. We discuss how the pictures provided by these interpretations differ from the classical one. In particular, it seems that a rather drastic form of perspectivalism is needed if accordance with special relativity is to be achieved.
Achieving understanding of nature is one of the aims of science. In this paper we offer an analysis of the nature of scientific understanding that accords with actual scientific practice and accommodates the historical diversity of conceptions of understanding. Its core idea is a general criterion for the intelligibility of scientific theories that is essentially contextual: which theories conform to this criterion depends on contextual factors, and can change in the course of time. Our analysis provides a general account of how understanding is provided by scientific explanations of diverse types. In this way, it reconciles conflicting views of explanatory understanding, such as the causal-mechanical and the unificationist conceptions.
We study the process of observation (measurement), within the framework of a “perspectival” (“relational,” “relative state”) version of the modal interpretation of quantum mechanics. We show that if we assume certain features of discreteness and determinism in the operation of the measuring device (which could be a part of the observer's nervous system), this gives rise to classical characteristics of the observed properties, in the first place to spatial localization. We investigate to what extent semi-classical behavior of the object system itself (as opposed to the observational system) is needed for the emergence of classicality. Decoherence is an essential element in the mechanism of observation that we assume, but it turns out that in our approach no environment-induced decoherence on the level of the object system is required for the emergence of classical properties.
In relativistic quantum field theory the notion of a local operation is regarded as basic: each open space-time region is associated with an algebra of observables representing possible measurements performed within this region. It is much more difficult to accommodate the notions of events taking place in such regions or of localized objects. But how can the notion of a local operation be basic in the theory if this same theory were unable to represent localized measuring devices and localized events? After briefly reviewing these difficulties we discuss a strategy for eliminating the tension, namely by interpreting quantum theory in a realist way. To implement this strategy we use the ideas of the modal interpretation of quantum mechanics. We then consider the question of whether the resulting scheme can be made Lorentz invariant.
I argue that there is a natural relationist interpretation of Newtonian and relativistic non-quantum physics. Although relationist, this interpretation does not fall prey to the traditional objections based on the existence of inertial effects.
According to the Doomsday Argument the probability of an impending extinction of mankind is much higher than we think. The adduced reason is that in our assignment of probabilities to soon or not so soon doom we have not fully taken into account that we live in the specific year 2001. This is relevant information, because if I consider myself as an arbitrary member of the human race I have a much higher probability of finding myself living in 2001 on the hypothesis of a soon extinction, Doom Soon, than on the hypothesis of Doom Late---according to the latter hypothesis there are so many more years I could have found myself living in. Accordingly, Bayesian reasoning leads to a posterior probability of the Doom Soon hypothesis, after I have taken the evidence of my birth date fully into account, that is much higher than the prior probability. I show that the Argument is nothing but a rather trivial mathematical exercise in the calculation of posterior from prior probabilities; it is only about the relation between these probabilities and is silent about the concrete values these probabilities should have. Nothing in the Argument supports the conclusion its proponents think it supports, namely that Doom Soon is much more probable than we ordinarily think. The Argument is formally valid, but ineffective.
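The Bayesian bookkeeping that the abstract calls a "rather trivial mathematical exercise" can be sketched in a few lines. This is an illustrative reconstruction, not the paper's own calculation: the function name, the 1% prior, and the population figures (roughly 10^11 humans if Doom Soon, 10^13 if Doom Late) are assumptions chosen only to show the shape of the update, which is exactly the abstract's point that the Argument relates posterior to prior but fixes no concrete values.

```python
# Illustrative Bayesian update behind the Doomsday Argument.
# All numbers are hypothetical; the Argument itself supplies none.

def doomsday_posterior(prior_soon: float, n_soon: float, n_late: float) -> float:
    """Posterior probability of 'Doom Soon' after conditioning on one's
    birth rank, assuming a uniform self-sampling likelihood of 1/N for
    being any particular one of the N humans who will ever live."""
    like_soon = 1.0 / n_soon   # P(my birth rank | Doom Soon)
    like_late = 1.0 / n_late   # P(my birth rank | Doom Late)
    numerator = prior_soon * like_soon
    denominator = numerator + (1.0 - prior_soon) * like_late
    return numerator / denominator

# A 1% prior for Doom Soon, with a hundredfold difference in total
# population between the hypotheses, is pushed to roughly 50%:
post = doomsday_posterior(prior_soon=0.01, n_soon=1e11, n_late=1e13)
```

The shift is dramatic, but its size depends entirely on the priors and population estimates fed in, which is why the calculation by itself supports no conclusion about how probable Doom Soon actually is.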
Reductionism, in the sense of the doctrine that theories on different levels of reality should exhibit strict and general relations of deducibility, faces well-known difficulties. Nevertheless, the idea that deeper layers of reality are responsible for what happens at higher levels is well-entrenched in scientific practice. We argue that the intuition behind this idea is adequately captured by the notion of supervenience: the physical state of the fundamental physical layers fixes the states of the higher levels. Supervenience is weaker than traditional reductionism, but it is not a metaphysical doctrine: one can empirically support the existence of a supervenience relation by exhibiting concrete relations between the levels. Much actual scientific research is directed towards finding such inter-level relations. It seems to be quite generally held that the importance of such relations between different levels is that they are explanatory and give understanding: deeper levels provide deeper understanding, and this justifies the search for ever deeper levels. We shall argue, however, that although achieving understanding is an important aim of science, its correct analysis is not in terms of relations between higher and lower levels. Connections with deeper layers of reality do not generally provide for deeper understanding. Accordingly, the motivation for seeking deeper levels of reality does not come from the desire to find deeper understanding of phenomena, but should be seen as a consequence of the goal to formulate ever better, in the sense of more accurate and more encompassing, empirical theories.
We generalize the modal interpretation of quantum mechanics so that it may be applied to composite systems represented by arbitrary density operators. We discuss the interpretation these density operators receive and relate this to the discussion about the interpretation of proper and improper mixtures in the standard interpretation.
It is argued that the symmetry and anti-symmetry of the wave functions of systems consisting of identical particles have nothing to do with the observational indistinguishability of these particles. Rather, a much stronger conceptual indistinguishability is at the bottom of the symmetry requirements. This can be used to argue further, in analogy to old arguments of De Broglie and Schrödinger, that the reality described by quantum mechanics has a wave-like rather than particle-like structure. The question of whether quantum statistics alone can give rise to empirically observable correlations between results of distant measurements is also discussed.
In his book Philosophie der Raum-Zeit-Lehre (1928) Reichenbach introduced the concept of universal force. Reichenbach's use of this concept was later severely criticized by Grünbaum. In this article it is argued that although Grünbaum's criticism is correct in an important respect, it misses part of Reichenbach's intentions. An attempt is made to clarify and defend Reichenbach's position, and to show that universal force is a useful notion in the physically important case of gravitation.
A recurrent theme in the philosophical literature on the special theory of relativity is the question as to the reality of the Lorentz contraction. It is often suggested that there is a difference between the Lorentz-FitzGerald contraction in the pre-relativistic ether theory and the Lorentz contraction from special relativity in the sense that the former is a real contraction of matter conditioned by dynamical laws, whereas the latter should be compared with the apparent changes in the size of objects when the perspective of the observer changes. It is here shown, however, that the same laws of nature which are operative in the Lorentz-FitzGerald contraction also condition the relativistic Lorentz contraction. The relevant distinction therefore is not between the reality of the two contractions. What is at issue is the difference in explanatory structure of the pre-relativistic theory on the one hand and the special theory of relativity on the other. In the course of the argument the question of the conventionality of simultaneity is also discussed.