Rovelli's relational interpretation of quantum mechanics (RQM) is based on the assumption that the notion of an observer-independent state of a physical system is to be rejected. In RQM the primary target of the theory is the analysis of the whole network of relations that may be established among quantum subsystems, and the shift to a relational perspective is supposed to address in a satisfactory way the general problem of the interpretation of quantum mechanics. Here I discuss two basic issues that I take to be serious open problems of the interpretation. First, I wish to show, mainly through an analysis of the so-called third person problem, that it is far from clear what a relativization of states to observers exactly achieves and in what sense such an approach really advances our understanding of the peculiar features of quantum phenomena. Second, I argue that the claim according to which RQM is able to preserve locality is at best dubious. I conclude that further work needs to be done before RQM may aspire to become a satisfactory interpretational framework for the main foundational issues in quantum mechanics.
Relational quantum mechanics is an interpretation of quantum theory which discards the notions of absolute state of a system, absolute value of its physical quantities, or absolute event. The theory describes only the way systems affect each other in the course of physical interactions. State and physical quantities always refer to the interaction, or the relation, between two systems. Nevertheless, the theory is assumed to be complete. The physical content of quantum theory is understood as expressing the net of relations connecting all different physical systems.
According to a widespread view, the Bell theorem establishes the untenability of so-called 'local realism'. On the basis of this view, recent proposals by Leggett, Zeilinger and others have been developed according to which it can be proved that even some non-local realistic theories have to be ruled out. As a consequence, within this view the Bell theorem allows one to establish that no reasonable form of realism, be it local or non-local, can be made compatible with the (experimentally tested) predictions of quantum mechanics. In the present paper it is argued that the Bell theorem has demonstrably nothing to do with 'realism' as defined by these authors and that, as a consequence, their conclusions about the foundational significance of the Bell theorem are unjustified.
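The abstracts above and below repeatedly invoke the quantitative content of the Bell theorem without stating it. As an illustration not drawn from any of the papers, the following minimal Python sketch computes the standard CHSH quantity from the textbook quantum-mechanical singlet-state correlations, E(a, b) = -cos(a - b), at the usual optimal analyzer settings, and shows that it exceeds the bound of 2 that any local hidden-variable model must satisfy:

```python
import math

def E(a, b):
    # Quantum-mechanical correlation for the spin-singlet state:
    # E(a, b) = -cos(a - b), with a, b the two analyzer angles.
    return -math.cos(a - b)

# A standard choice of settings that maximizes the CHSH quantity.
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4

# CHSH combination: local hidden-variable theories require |S| <= 2.
S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)

print(abs(S))  # 2*sqrt(2), roughly 2.828: the quantum prediction violates the local bound
```

The value 2*sqrt(2) is the well-known Tsirelson bound, the maximal quantum violation; nothing in this sketch bears on the realism question the papers debate, it only records the locality constraint itself.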
The main claim of the paper is that one can be 'realist' (in some sense) about quantum mechanics without requiring any form of realism about the wave function. We begin by discussing various forms of realism about the wave function, namely Albert's configuration-space realism; Dürr, Zanghì and Goldstein's nomological realism about Ψ; Esfeld's dispositional reading of Ψ; and Pusey, Barrett and Rudolph's realism about the quantum state. By discussing the articulation of these four positions, and their interrelation, we conclude that instrumentalism about Ψ is by itself not sufficient to choose one interpretation of quantum mechanics over the others, thereby confirming in a different way the underdetermination of the metaphysical interpretations of quantum mechanics.
I show why old and new claims on the role of counterfactual reasoning for the EPR argument and Bell's theorem are unjustified: once the logical relation between locality and counterfactual reasoning is clarified, the use of the latter does no harm and the nonlocality result can well follow from the EPR premises. To show why, after emphasizing the role of incompleteness arguments that Einstein developed before the EPR paper, I critically review more recent claims that equate the use of counterfactual reasoning with the assumption of a strong form of realism and argue that such claims are untenable.
In the area of the foundations of quantum mechanics a true industry appears to have developed in the last decades, with the aim of proving as many results as possible concerning what there cannot be in the quantum realm. In principle, the significance of proving 'no-go' results should consist in clarifying the fundamental structure of the theory, by pointing out a class of basic constraints that the theory itself is supposed to satisfy. In the present paper I discuss some more recent no-go claims and argue against the deep significance of these results, with a two-fold strategy. First, I consider three results concerning, respectively, local realism, quantum covariance and predictive power in quantum mechanics, and I try to show how controversial the main conditions of each negative theorem turn out to be, something that strongly undermines the general relevance of these theorems. Second, I discuss what I take to be a common feature of these theoretical enterprises, namely that of aiming to establish negative results for quantum mechanics in the absence of a deeper understanding of the overall ontological content and structure of the theory. I argue that the only way toward such an understanding may be to cast the problems in advance in a clear and well-defined interpretational framework, which in my view means primarily specifying the ontology that quantum theory is supposed to be about, and only then to ask whether problems that seemed worth pursuing still are so in that framework.
In their theoretical and experimental reflections on the capacities and behaviours of living systems, neuroscientists often formulate generalizations about the behaviour of neural circuits. These generalizations are highly idealized, as they omit reference to the myriad conditions that could perturb the behaviour of the modelled system in real-world settings. This article analyses an experimental investigation of the behaviour of place cells in the rat hippocampus, in which highly idealized generalizations were tested by comparing predictions flowing from them with real-world experimental results. The aim of the article is to identify under what conditions even single prediction failures regarding the behaviour of single cells sufficed to reject highly idealized generalizations, and under what conditions prima facie counter-examples were deemed to be irrelevant to the testing of highly idealized generalizations. The results of this analysis may contribute to understanding how idealized models are tested experimentally in neuroscience and used to make reliable predictions concerning living systems in real-world settings.
The year 2005 was named the World Year of Physics in recognition of the 100th anniversary of Albert Einstein's "Miracle Year," in which he published four landmark papers that deeply influenced the last and the current century, laying foundations for quantum theory, special relativity, and statistical mechanics. Despite the enormous importance of Einstein's discoveries for these theories, most physicists adopt a version of quantum theory which is incompatible with the idea that motivated Einstein in the first place. This seems to suggest that Einstein was fundamentally incapable of appreciating the 'quantum revolution,' and that his vision of physics as an attempt to reach a complete and comprehensive description of reality was ultimately impossible to attain. Relativity theory has provided us with a picture of reality in which the world can be thought of as independent of who observes it, and the same can be said for statistical mechanics. Quantum mechanics, instead, seems to suggest that physical objects do not exist 'out there' when no one is observing them. In this framework, it is often suggested that any kind of causal explanation is impossible in the atomic and subatomic world, and therefore should be abandoned. This is why many think that it is in principle impossible for quantum theory to provide us with a coherent and comprehensive view of the world, in contrast with what happens with relativity and statistical mechanics. Is it really impossible to pursue Einstein's ideal of physics also in the quantum framework? This book argues that this is not the case: the central idea is that Einstein's vision of physics is still a live option, and indeed the one that best allows us to obtain a unitary understanding of our physical theories. All three theories mentioned above, suitably modified, can be regarded as theories able to account for and explain the world around us without departing too much from the classical framework.
Relativity theory, statistical mechanics and quantum mechanics have profoundly revolutionized our way of conceiving space, time, matter, probability and causality, as well as the relation between the physical universe and the observer, notions that have been at the centre of philosophical discussion from the Greek world to the present day. This volume, by Valia Allori, Mauro Dorato, Federico Laudisa and Nino Zanghì, not only aims to suggest new ways of bringing physics and philosophy into dialogue, but also tries to make explicit the philosophical presuppositions present in the interpretation that physicists themselves give of the mathematical formalism.
On the basis of Mackey's axiomatic approach to quantum physics or, equivalently, of a "state-event-probability" (SEVP) structure, a set of unsharp events (or "effects") is constructed by a quite standard "fuzzification" procedure, and the corresponding "state-effect-probability" (SEFP) structure is introduced. The introduction of suitable axioms gives rise to a partially ordered structure of quantum Brouwer-Zadeh (BZ) poset, i.e., a poset endowed with two nonstandard orthocomplementation mappings, a fuzzy-like one and an intuitionistic-like one, whose set of sharp elements is an orthomodular complete lattice. As customary, these orthocomplementations induce the two modal-like necessity and possibility operators, and it is shown that Ludwig's and Jauch-Piron's approaches to quantum physics can be "interpreted" in complete SEFP. As a marginal result, a standard procedure is given for constructing many unsharp realizations starting from any sharp realization of a fixed observable, and the relationship between sharp realizations and the corresponding unsharp ones is studied.
On the basis of a mistaken interpretation of the Bell theorem, it has repeatedly been claimed in recent times that experiments force us to drop any possible form of realism in the foundations of quantum mechanics. In this paper I defend the simple thesis that this claim cannot be consistently supported: the Bell theorem does not concern realism, and realism per se cannot be refuted by any quantum experiment. As a consequence, realism in quantum mechanics is not something that can simply be explained away once and for all on the basis of experiments, but rather something that must be conceptually characterized and discussed in terms of its foundational virtues and vices. To assess it, we cannot rely on experimentation but rather on philosophical discussion: realism is not a phlogiston-like notion, despite the efforts of the contemporary quantum orthodoxy to conceive it, in Russellian terms, as a relic of a bygone age.
In the context of stochastic hidden variable theories, Howard has argued that the role of separability, the requirement that spatially separated systems possess distinct real states, has been underestimated. Howard claims that separability is equivalent to Jarrett's completeness: this equivalence should imply that the Bell theorem forces us to give up either separability or locality. Howard's claim, however, is shown to be ill-founded, since it is based on an implausible assumption. The necessity of sharply distinguishing separability and locality is emphasized: a quantitative formulation of separability, due to D'Espagnat, is reviewed and found unsatisfactory, in that it basically conflates separability and locality into a single notion. Finally, the possibility of an 'Einsteinian' nonseparable realism, envisaged by Shimony, is reviewed and also found implausible.
In a recent paper in Foundations of Physics, Stephen Boughn reinforces a view that is more widely shared in the area of the foundations of quantum mechanics than it deserves, the view according to which quantum mechanics does not require nonlocality of any kind and the common interpretation of the Bell theorem as a nonlocality result is based on a misunderstanding. In the present paper I argue that this view rests on an incorrect reading of the presuppositions of the EPR argument and the Bell theorem and, as a consequence, is unfounded.
In a recent paper in Foundations of Physics, Stephen Boughn argued that quantum mechanics does not require nonlocality of any kind and that the common interpretation of the Bell theorem as a nonlocality result is based on a misunderstanding. In this note I argue that Boughn's arguments, which summarize views widespread in certain areas of the foundations of quantum mechanics, rest on an incorrect reading of the presuppositions of the EPR argument and the Bell theorem and, as a consequence, are totally unfounded.
The Bell 1964 theorem states that nonlocality is a necessary feature of any hidden variable theory that reproduces the statistical predictions of quantum mechanics. In view of the no-go theorems for non-contextual hidden variable theories already available by 1964, due to Gleason and Bell, one is forced to acknowledge the contextual character of the hidden variable theories to which the Bell 1964 theorem refers. Both the mathematical and the physical justifications of this contextualism are reconsidered. Consequently, the role of contextualism in recent no-hidden-variables proofs and the import of these proofs are investigated. With reference to the physical intuition underlying contextualism, the question is considered whether a context-dependence of individual measurement results is compatible with context-independence of the statistics of measurement results.
The view that takes laws of nature to be essentially nothing more than descriptions of facts is still rather popular. The present article, on the contrary, defends the claim that the only real motivation for defending a descriptive view of laws—the quest for ontological parsimony—entails too high a price to pay in philosophical terms. It is argued that nomic primitivism, namely the alternative option that takes laws to be primitive fundamental entities in our ontology, is decisively more appealing, since it is the crucial role assigned to laws that makes a scientific theory of natural phenomena a system rather than a list. Finally, the implications that nomic primitivism might have for the issue of the status of the wave function in that particular formulation of quantum mechanics known as Bohmian mechanics are considered.
In his 2013 Foundations of Physics paper, Mathias Egg claims to show that my critical arguments against the foundational significance of Leggett's non-local theories are misguided. His main contention is that my argument ties Leggett's original motivation for introducing this new class of theories too strongly to the foundational significance of the theories per se. Egg basically aims to show that, although it can be conceded that Leggett's original motivation relies on a mistaken view of the original Bell theorem, the investigation of the Leggett theories does have a foundational meaning that can be dissociated from the view that Leggett himself has of them. In reply to Egg, I argue here that, even if we disentangle Leggett's view from the fate of the Leggett theories, there is still room to dispute the foundational significance of the Leggett 'non-local realistic' research program.
The physics and metaphysics of quantum field theory. Book review, Metascience, pp. 1-3, DOI 10.1007/s11016-011-9609-2. Federico Laudisa, Department of Human Sciences "R. Massa", University of Milan-Bicocca, Piazza Ateneo Nuovo 1, 20126 Milan, Italy. Online ISSN 1467-9981, Print ISSN 0815-0796.
The status of a causal approach to EPR-Bell nonlocal correlations in terms of a counterfactual framework for causation is considered. It is argued that when the relativistic spacetime structure of the events is taken into due account, the adoption of this approach is best motivated by the assumption of a preferred frame of reference, an assumption that seems even more in need of justification than the causal theory itself.
The debate over the question whether quantum mechanics should be considered a complete account of microphenomena has a long and deeply involved history, a turning point of which was certainly the Einstein-Bohr debate, with the ensuing charge of incompleteness raised by the Einstein-Podolsky-Rosen (EPR) argument. In quantum mechanics, physical systems can be prepared in pure states that nevertheless have in general positive dispersion for most physical quantities; hence in the EPR argument the attention is focused on the question whether the account of microphysical phenomena provided by quantum mechanics is to be regarded as an exhaustive description of the physical reality to which those phenomena are supposed to refer, a question to which Einstein himself answered in the negative. There is, however, also a mathematical side of the completeness issue in quantum mechanics, namely the question whether states with positive dispersion can be represented in terms of a different, dispersion-free kind of states in a way consistent with the mathematical constraints of the quantum mechanical formalism. From this point of view, the other source of the completeness issue in quantum mechanics is the no-hidden-variables theorem formulated by John von Neumann in his celebrated book on the mathematical foundations of quantum mechanics, whose preface already anticipates the program and the conclusion concerning the possibility of 'neutralizing' the statistical character of quantum mechanics.
The paper investigates the question whether the nature of non-locality in quantum mechanics can be better understood by viewing it as grounded in some sort of causation. A general conclusion that may be drawn from the discussion is that, as far as ordinary quantum mechanics is concerned, we are facing a dilemma: either the notion of causation is interpreted in such general terms as to lose sight of the original underlying intuition, so that we seem to do nothing but give a different name to the puzzle under scrutiny, or we are led to ascribe to the special-relativistic spacetime structure a purely phenomenological status in order to make room for a preferred spacetime foliation, with respect to which causal relations can be univocally defined.
The papers collected in this volume are based on the best contributions to the conference of the Italian Society for Logic and Philosophy of Science (SILFS) that took place in Milan on 8-10 October 2007. The aim of the Society, since its foundation in 1952, has always been that of bringing together scholars - working in the broad areas of Logic, Philosophy of Science and History of Science - who share an open-minded approach to their disciplines and regard them as essentially requiring continuous confrontation and bridge-building to avoid the danger of over-specialism. In this perspective, logicians and philosophers of science should not indulge in inventing and cherishing their own "internal problems" - although these may occasionally be an opportunity for conceptual clarification - but should primarily look at the challenging conceptual and methodological questions that arise in any genuine attempt to extend our objective knowledge. As Ludovico Geymonat used to put it: "[good] philosophy should be sought in the folds of science itself." Contributions are distributed into six sections, five of which - "Logic and Computing," "Physics and Mathematics," "Life Sciences," "Economics and Social Sciences," "Neuroscience and Philosophy of Mind" - are devoted to the discussion of cutting-edge problems that arise from current-day scientific research, while the remaining section on "General Philosophy of Science" is focused on foundational and methodological questions that are common to all areas.
In spite of the relevance of a scientific representation of the world for naturalism, it is surprising that philosophy of science is less involved in the debate on naturalism than might be expected. Had the viewpoint of philosophy of science been duly considered, naturalism could not have overlooked the established lesson according to which there is no well-defined recipe for what science must or must not be. In the present paper I address some implications of this lesson for naturalism, arguing that a radically naturalistic outlook fails to pay sufficient attention to some of the main lessons that philosophy of science has taught us concerning the nature of scientific theories. One of these lessons is that real scientific theories are far more normative than ordinary scientific naturalism is ready to accept, a circumstance that at a minimum is bound to force most naturalization strategies to re-define their significance.
In recent years, a number of research projects have been proposed whose goal is to build large-scale simulations of brain mechanisms at unprecedented levels of biological accuracy. Here it is argued that the roles these simulations are expected to play in neuroscientific research go beyond the "synthetic method" extensively adopted in Artificial Intelligence and biorobotics. In addition, we show that, over and above the common goal of simulating brain mechanisms, these projects pursue various modelling ambitions that can be sharply distinguished from one another, and that correspond to conceptually different interpretations of the notion of "biological accuracy". They include the ambition to reach extremely deep levels in the mechanistic decomposition hierarchy, to simulate networks composed of extremely large numbers of neural units, to build systems able to generate rich behavioural repertoires, to simulate "complex" neuron models, and to implement the "best" theories available on brain structure and function. Some questions are raised concerning the significance of each of these modelling ambitions with respect to the various roles played by simulations in the study of the brain.
In spite of the relevance of a scientific representation of the world for naturalism, it is surprising that philosophy of science is less involved in the debate on naturalism than might be expected. Had the viewpoint of philosophy of science been duly considered, naturalism could not have overlooked the established lesson according to which there is no well-defined recipe for what science must or must not be. The present paper addresses some implications of this lesson for naturalism. First I question the very significance of the distinction between ontological and epistemic naturalism, by defending a conceptual priority of the latter over the former. Then I focus on the implications of this priority for naturalization strategies, claiming that these strategies underestimate the normativity of scientific theories themselves. Finally, on the basis of the above points, I take a critical look at an especially 'aggressive' variant of naturalism, according to which all epistemic facts are natural facts.
The paper focuses on the Humean origins of contemporary philosophical naturalism and attempts to address fundamental issues such as the following: to what extent is the naturalistic interpretation of Humean philosophy influenced by contemporary interpretations of naturalism itself? Can we really make Humean naturalism consistent with contemporary naturalism? Is the former really relevant to the latter, and in what sense? The analysis is not meant simply as an exercise in Humean scholarship, but also as a contribution to the understanding of the philosophical foundations of naturalism, involving two main general claims. First, the image of science that shapes Humean naturalism differs from the corresponding contemporary one, making it far from obvious that contemporary naturalism is inspired by the Humean one. Second, science itself presupposes a sort of intrinsic explanatory normativity, which makes the naturalistic claim about the non-normative character of scientific explanations highly controversial.
It is usually held that the standard collapse model of a quantum measurement process grounds a kind of fundamental time asymmetry. The question whether and how it should be possible to reconstruct uniquely one's own history in an Everett no-collapse interpretation of quantum theory is investigated. A particular approach to the Everett interpretation, due to John S. Bell, is considered, according to which one of the chief claims of the Everett quantum theory is precisely that it allows us to do without the notion of history.