This book offers a thorough technical elaboration and philosophical defense of an objectivist informational interpretation of quantum mechanics according to which its novel content is located in its kinematical framework, that is, in how the theory describes systems independently of the specifics of their dynamics.

It will be of interest to researchers and students in the philosophy of physics and in theoretical physics with an interest in the foundations of quantum mechanics. Additionally, parts of the book may be used as the basis for courses introducing non-physics majors to quantum mechanics, or for self-study by those outside of the university with an interest in quantum mechanics.
Although computation and the science of physical systems would appear to be unrelated, there are a number of ways in which computational and physical concepts can be brought together so as to illuminate both. This volume examines fundamental questions that connect scholars from both disciplines: is the universe a computer? Can a universal computing machine simulate every physical process? What is the source of the computational power of quantum computers? Are computational approaches to solving physical problems and paradoxes always fruitful? Contributors from multiple perspectives, reflecting the diversity of thought regarding these interconnections, address many of the most important developments and debates within this exciting area of research. Both a reference to the state of the art and a valuable and accessible entry point to interdisciplinary work, the volume will interest researchers and students working in physics, computer science, and the philosophy of science and mathematics.
According to the Gottesman–Knill theorem, quantum algorithms that utilize only the operations belonging to a certain restricted set are efficiently simulable classically. Since some of the operations in this set generate entangled states, it is commonly concluded that entanglement is insufficient to enable quantum computers to outperform classical computers. I argue in this article that this conclusion is misleading. First, the statement of the theorem is, on reflection, already evident when we consider Bell's and related inequalities in the context of a discussion of computational machines. This, in turn, helps us to understand that the appropriate conclusion to draw from the Gottesman–Knill theorem is not that entanglement is insufficient to enable a quantum performance advantage, but rather that if we limit ourselves to the operations referred to in the Gottesman–Knill theorem, we will not have used the resources provided by an entangled quantum system to their full potential.
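For orientation, the Bell-type result alluded to here is standardly expressed via the CHSH inequality, which bounds the correlations achievable by any locally causal model; quantum mechanics exceeds the classical bound, but only up to the Tsirelson bound:

\[
\bigl| E(a,b) + E(a,b') + E(a',b) - E(a',b') \bigr| \;\leq\;
\begin{cases}
2 & \text{(local hidden variables)} \\
2\sqrt{2} & \text{(quantum mechanics)}
\end{cases}
\]

where $E(a,b)$ is the expectation of the product of the $\pm 1$-valued outcomes obtained when the two parties choose measurement settings $a$ and $b$.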
Combining physics, mathematics and computer science, quantum computing and its sister discipline of quantum information have developed in the past few decades from visionary ideas into two of the most fascinating areas of quantum theory. General interest and excitement in quantum computing were initially triggered by Peter Shor (1994), who showed how a quantum algorithm could exponentially “speed up” classical computation and factor large numbers into primes far more efficiently than any (known) classical algorithm. Shor's algorithm was soon followed by several other algorithms that aimed to solve combinatorial and algebraic problems, and in the years since, the theoretical study of quantum systems serving as computational devices has achieved tremendous progress. Common belief has it that the implementation of Shor's algorithm on a large-scale quantum computer would have devastating consequences for current cryptography protocols, which rely on the premise that all known classical worst-case algorithms for factoring take time exponential in the length of their input (see, e.g., Preskill 2005). Consequently, experimentalists around the world are engaged in attempts to tackle the technological difficulties that prevent the realisation of a large-scale quantum computer. But regardless of whether these technological problems can be overcome (Unruh 1995; Ekert and Jozsa 1996; Haroche and Raimond 1996), it is noteworthy that no proof exists yet for the general superiority of quantum computers over their classical counterparts.

The philosophical interest in quantum computing is manifold. From a social-historical perspective, quantum computing is a domain where experimentalists find themselves ahead of their fellow theorists. Indeed, quantum mysteries such as entanglement and nonlocality were historically considered a philosophical quibble, until physicists discovered that these mysteries might be harnessed to devise new efficient algorithms. But while the technology for harnessing the power of 50–100 qubits (the qubit being the basic unit of information in the quantum computer) is now within reach (Preskill 2018), only a handful of quantum algorithms exist, and the question of whether these can truly outperform any conceivable classical alternative is still open. From a more philosophical perspective, advances in quantum computing may yield foundational benefits. For example, it may turn out that the technological capabilities that allow us to isolate quantum systems, by shielding them from the effects of decoherence for a period of time long enough to manipulate them, will also allow us to make progress on some fundamental problems in the foundations of quantum theory itself. Indeed, the development and implementation of efficient quantum algorithms may help us better understand the border between classical and quantum physics (Cuffaro 2017, 2018a; cf. Pitowsky 1994, 100), and perhaps even illuminate fundamental concepts such as measurement and causality. Finally, the idea that abstract mathematical concepts such as computability and complexity may not only be translated into physics, but also re-written by physics bears directly on the autonomous character of computer science and the status of its theoretical entities—the so-called “computational kinds”. As such it is also relevant to the long-standing philosophical debate on the relationship between mathematics and the physical world.
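To make the factoring claim concrete, here is a minimal sketch of the classical reduction at the heart of Shor's algorithm. The quantum speedup lives entirely in the order-finding subroutine, which is replaced below by an exponential-time brute-force stand-in; the function names are mine, and this is an illustrative sketch rather than the algorithm as Shor presents it:

```python
import math
import random

def find_order(a: int, n: int) -> int:
    """Least r > 0 with a**r = 1 (mod n). Brute force takes time
    exponential in the bit-length of n; Shor's quantum subroutine finds
    r efficiently, and this is the entire source of the speedup."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def factor(n: int) -> int:
    """Return a non-trivial factor of an odd composite n (not a prime
    power) via the order-finding reduction."""
    while True:
        a = random.randrange(2, n)
        g = math.gcd(a, n)
        if g > 1:
            return g               # lucky: a already shares a factor with n
        r = find_order(a, n)
        if r % 2 == 1:
            continue               # need an even order; pick a new a
        y = pow(a, r // 2, n)
        if y == n - 1:
            continue               # a**(r/2) = -1 (mod n) gives no factor
        return math.gcd(y - 1, n)  # otherwise gcd(a**(r/2) - 1, n) divides n

print(factor(15))  # prints 3 or 5
```

Breaking an RSA-style protocol amounts to running `factor` on a 2048-bit modulus, which is hopeless with the brute-force `find_order` but feasible, in principle, with its quantum replacement.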
I argue that our judgements regarding the locally causal models that are compatible with a given constraint implicitly depend, in part, on the context of inquiry. It follows from this that certain quantum no-go theorems, which are particularly striking in the traditional foundational context, have no force when the context switches to a discussion of the physical systems we are capable of building with the aim of classically reproducing quantum statistics. I close with a general discussion of the possible implications of this for our understanding of the limits of classical description, and for our understanding of the fundamental aim of physical investigation.

1 Introduction
2 No-Go Results
2.1 The CHSH inequality
2.2 The GHZ equality
3 Classically Simulating Quantum Statistics
3.1 GHZ statistics
3.2 Singlet statistics
4 What Is a Classical Computer Simulation?
5 Comparing the All-or-Nothing GHZ with Statistical Equalities
6 General Discussion
7 Conclusion
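For readers unfamiliar with the GHZ equality of Section 2.2, here is the all-or-nothing argument in outline, in one common sign convention (the paper's own presentation may differ). The state $|\mathrm{GHZ}\rangle = (|000\rangle + |111\rangle)/\sqrt{2}$ is a joint eigenstate of four product observables:

\[
X \otimes Y \otimes Y \;=\; Y \otimes X \otimes Y \;=\; Y \otimes Y \otimes X \;=\; -1,
\qquad
X \otimes X \otimes X \;=\; +1 .
\]

A local assignment of definite values $x_i, y_i \in \{\pm 1\}$ must satisfy $(x_1 y_2 y_3)(y_1 x_2 y_3)(y_1 y_2 x_3) = x_1 x_2 x_3$, since each $y_i$ appears twice; the left-hand side equals $(-1)^3 = -1$, whereas quantum mechanics requires $x_1 x_2 x_3 = +1$. Unlike the CHSH case, the contradiction arises in a single ideal run rather than in the statistics of many runs.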
A primary goal of quantum computer science is to explain why quantum computers are more powerful than classical computers. In this paper I argue that to answer this question is to compare algorithmic processes of various kinds and to describe the possibility spaces associated with these processes. By doing this, we explain how it is possible for one process to outperform its rival. Further, in this and similar examples little is gained by subsequently asking a how-actually question. Once one has explained how-possibly, there is little left to do.
The principle of 'information causality' can be used to derive an upper bound---known as the 'Tsirelson bound'---on the strength of quantum mechanical correlations, and has been conjectured to be a foundational principle of nature. In this paper, however, I argue that the principle has not to date been sufficiently motivated to play this role; the motivations that have so far been given are either unsatisfactorily vague or else amount to little more than an appeal to intuition. I then consider how one might begin to successfully motivate the principle. I argue that a compelling way of so doing is to understand it as a generalisation of Einstein's principle of the mutually independent existence---the 'being-thus'---of spatially distant things, interpreted as a special methodological principle. More specifically: I describe an argument, due to Demopoulos, to the effect that the quantum-mechanical no-signalling condition can be viewed as a generalisation, appropriate to an irreducibly statistical theory such as quantum mechanics, of the Einsteinian principle. And I then argue that a compelling way to motivate information causality is to consider it, in turn, as a further generalisation of the Einsteinian principle that is appropriate to a theory of communication. I nevertheless describe important obstacles that must yet be overcome if the project of establishing information causality as a foundational principle of nature is to succeed.
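For concreteness, the principle in its usual quantitative form (roughly as formulated by Pawłowski et al. in 2009; the paraphrase is mine): if Alice holds $n$ independent bits $a_1, \ldots, a_n$ and may send Bob only $m$ classical bits, then whatever pre-shared correlations they exploit, Bob's guesses $\beta_k$ about the individual bits must satisfy

\[
\sum_{k=1}^{n} I(a_k : \beta_k) \;\leq\; m .
\]

The no-signalling condition is recovered as the special case $m = 0$, which is one way of seeing information causality as a generalisation of the Einsteinian principle discussed in the paper; correlations stronger than Tsirelson's bound would permit violations of the inequality.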
This chapter is about Grete Hermann, a philosopher-mathematician whose interactions with the founders of quantum mechanics, during the early period of that theory's elaboration, were productive and mutually beneficial. Hermann was a neo-Kantian philosopher. At the heart of Immanuel Kant's critical philosophy lay the question of the conditions under which we can be said to know something objectively, a question Hermann found to be particularly pressing in quantum mechanics. Hermann's own approach to neo-Kantianism was neo-Friesian. Jakob Friedrich Fries, like Kant, had understood critical philosophy to be an essentially epistemic project. Fries departed from Kant in his account of the elements involved in our cognition. This chapter discusses how, beginning from a neo-Friesian understanding of critical philosophy, Hermann was led to conclude that quantum mechanics shows us that physical knowledge is fundamentally split; that the objects of quantum mechanics are only objects from a particular perspective and in the context of a particular physical interaction. It will be seen how Hermann's solution to the problem of objectivity in quantum mechanics is a natural one from a neo-Friesian point of view, even though it disagrees with those offered by more orthodox versions of Kantian doctrine.
A growing number of commentators have, in recent years, noted the important affinities in the views of Immanuel Kant and Niels Bohr. While these commentators are correct, the picture they present of the connections between Bohr and Kant is painted in broad strokes; it is open to the criticism that these affinities are merely superficial. In this essay, I provide a closer, structural analysis of both Bohr's and Kant's views that makes these connections more explicit. In particular, I demonstrate the similarities between Bohr's argument, on the one hand, that neither the wave nor the particle description of atomic phenomena picks out an object in the ordinary sense of the word, and Kant's requirement, on the other hand, that both 'mathematical' (having to do with magnitude) and 'dynamical' (having to do with an object's interaction with other objects) principles must be applicable to appearances in order for us to determine them as objects of experience. I argue that Bohr's 'complementarity interpretation' of quantum mechanics, which views atomic objects as idealizations, and which licenses the repeal of the principle of causality for the domain of atomic physics, is perfectly compatible with, and indeed follows naturally from, a broadly Kantian epistemological framework.
I argue that the many worlds explanation of quantum computation is not licensed by, and in fact is conceptually inferior to, the many worlds interpretation of quantum mechanics from which it is derived. I argue, further, that the many worlds explanation of quantum computation is incompatible with the recently developed cluster state model of quantum computation. Based on these considerations I conclude that we should reject the many worlds explanation of quantum computation.
Computational complexity theory is a branch of computer science dedicated to classifying computational problems in terms of their difficulty. While computability theory tells us what we can compute in principle, complexity theory informs us regarding our practical limits. In this chapter I argue that the science of quantum computing illuminates complexity theory by emphasising that its fundamental concepts are not model-independent, but that this does not, as some suggest, force us to radically revise the foundations of the theory. For model-independence has never been essential to those foundations. The fundamental aim of complexity theory is to describe what is achievable in practice under various models of computation for our various practical purposes. Reflecting on quantum computing illuminates complexity theory by reminding us of this too often under-emphasised fact.
The philosophical tradition of liberal political thought has come to see tolerance as a crucial element of a liberal political order. However, while much has been made of the value of toleration, little work has been done on individual-level motivations for tolerant behavior. In this article, we seek to develop an account of the rational motivations for toleration and of where the limits of toleration lie. We first present a very simple model of rational motivations for toleration. Key to this model is an application of David Ricardo's model of trade to thinking about toleration. This model supports the claim that we always have reasons to be as tolerant as possible. We then explore why we do not always see tolerant attitudes in the actual world, and point to some potential preconditions for toleration that the initial model does not capture. Subsequently, we examine a more detailed model that allows us to investigate more carefully the conditions under which tolerant behavior can be rewarded. We conclude by arguing that a consideration of self-interested motivations for toleration is essential to the success of a robust theory of toleration for a diverse society, but that even this approach has its limitations.
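Since the first model turns on Ricardo's comparative-advantage arithmetic, a minimal sketch may help. The production numbers below are hypothetical, chosen only to make the arithmetic visible; this is not the authors' own model:

```python
HOURS = 40.0
# Units produced per hour. Agent A is better at BOTH goods (absolute
# advantage), yet gains from exchange still exist because the agents'
# relative costs differ -- the core of Ricardo's point.
RATE = {"A": {"bread": 6.0, "cloth": 4.0},
        "B": {"bread": 1.0, "cloth": 2.0}}

def produce(agent: str, bread_hours: float) -> dict:
    """Output when `agent` spends `bread_hours` on bread, the rest on cloth."""
    r = RATE[agent]
    return {"bread": r["bread"] * bread_hours,
            "cloth": r["cloth"] * (HOURS - bread_hours)}

def combined(x: dict, y: dict) -> dict:
    return {good: x[good] + y[good] for good in x}

# Autarky: each agent splits its time evenly and consumes its own output.
autarky = combined(produce("A", 20), produce("B", 20))

# Specialisation: a unit of bread costs A only 4/6 of a unit of cloth but
# costs B 2 units of cloth, so A shifts toward bread, B makes only cloth.
specialised = combined(produce("A", 30), produce("B", 0))

print("autarky totals:    ", autarky)      # {'bread': 140.0, 'cloth': 120.0}
print("specialised totals:", specialised)  # {'bread': 180.0, 'cloth': 120.0}
# Strictly more bread, no less cloth: some division of the surplus makes
# both agents better off -- the self-interested reason for tolerating,
# and trading with, a differently-abled neighbour.
```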
There has been a long-standing and sometimes passionate debate among physicists over whether a dynamical framework for quantum systems should incorporate not completely positive (NCP) maps in addition to completely positive (CP) maps. Despite the reasonableness of the arguments for complete positivity, we argue that NCP maps should be allowed, with a qualification: these should be understood, not as reflecting 'not completely positive' evolution, but as linear extensions, to a system's entire state space, of CP maps that are only partially defined. Beyond the domain of definition of a partial-CP map, we argue, much may be permitted.
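The textbook example behind the debate is the transpose map, which is positive but not completely positive: extended to half of an entangled pair it yields an operator with a negative eigenvalue, as a few lines of numpy make visible (a standard illustration, not the paper's own construction):

```python
import numpy as np

# Maximally entangled two-qubit state |Phi+> = (|00> + |11>)/sqrt(2).
phi = np.zeros(4)
phi[0] = phi[3] = 1 / np.sqrt(2)
rho = np.outer(phi, phi)

# Apply (id (x) T): transpose the second qubit's indices only.
rho_pt = rho.reshape(2, 2, 2, 2).transpose(0, 3, 2, 1).reshape(4, 4)

print(np.linalg.eigvalsh(rho_pt))  # [-0.5  0.5  0.5  0.5]: not a valid state
```

On the proposal sketched in the abstract, the negative eigenvalue signals not 'NCP evolution' but the application of a partially defined CP map outside its domain of definition.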
The aim of this dissertation is to clarify the debate over the explanation of quantum speedup and to submit, for the reader's consideration, a tentative resolution to it. In particular, I argue that the physical explanation for quantum speedup is precisely the fact that the phenomenon of quantum entanglement enables a quantum computer to fully exploit the representational capacity of Hilbert space. This is impossible for classical systems, joint states of which must always be representable as product states. I begin the dissertation by considering, in Chapter 2, the most popular of the candidate physical explanations for quantum speedup: the many worlds explanation of quantum computation. I argue that, although it is inspired by the neo-Everettian interpretation of quantum mechanics, unlike the latter it does not have the conceptual resources required to overcome objections such as the so-called 'preferred basis objection'. I further argue that the many worlds explanation, at best, can serve as a good description of the physical process which takes place in so-called network-based computation, but that it is incompatible with other models of computation such as cluster state quantum computing. I next consider, in Chapter 3, a common component of most other candidate explanations of quantum speedup: quantum entanglement. I investigate whether entanglement can be said to be a necessary component of any explanation for quantum speedup, and I consider two major purported counter-examples to this claim. I argue that neither of these, in fact, shows that entanglement is unnecessary for speedup, and that, on the contrary, we should conclude that it is necessary. In Chapters 4 and 5 I then ask whether entanglement can be said to be sufficient as well. In Chapter 4 I argue that despite a result that seems to indicate the contrary, entanglement, considered as a resource, can be seen as sufficient to enable quantum speedup. Finally, in Chapter 5 I argue that entanglement is sufficient to explain quantum speedup as well.
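The representational point in the second sentence is easy to make precise. A two-qubit product state has amplitudes of the factorised form

\[
(a|0\rangle + b|1\rangle) \otimes (c|0\rangle + d|1\rangle)
= ac\,|00\rangle + ad\,|01\rangle + bc\,|10\rangle + bd\,|11\rangle ,
\]

so matching the Bell state $(|00\rangle + |11\rangle)/\sqrt{2}$ would require $ad = bc = 0$ while $ac$ and $bd$ are both non-zero, which is impossible. Entangled states thus occupy regions of Hilbert space that no product state, and hence no classical joint state, can reach.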
I argue that Immanuel Kant's critical philosophy—in particular the doctrine of transcendental idealism which grounds it—is best understood as an 'epistemic' or 'metaphilosophical' doctrine. As such it aims to show how one may engage in the natural sciences and in metaphysics under the restriction that certain conditions are imposed on our cognition of objects. Underlying Kant's doctrine, however, is an ontological posit, of a sort, regarding the fundamental nature of our cognition. This posit, sometimes called the 'discursivity thesis', while considered to be completely obvious and uncontroversial by some, has nevertheless been denied by thinkers both before and after Kant. One such thinker is Jakob Friedrich Fries, an early neo-Kantian thinker who, despite his rejection of discursivity, also advocated for a metaphilosophical understanding of critical philosophy. As I will explain, a consequence for Fries of the denial of discursivity is a radical reconceptualisation of the method of critical philosophy; whereas this method is a priori for Kant, for Fries it is in general empirical. I discuss these issues in the context of quantum theory, and I focus in particular on the views of the physicist Niels Bohr and the neo-Friesian philosopher Grete Hermann. I argue that Bohr's understanding of quantum mechanics can be seen as a natural extension of an orthodox Kantian viewpoint in the face of the challenges posed by quantum theory, and I compare this with the extension of Friesian philosophy that is represented by Hermann's view.
I argue that Kant's and Frege's refutations of the ontological argument are more similar than has generally been acknowledged. As I clarify, for both Kant and Frege, to say that something exists is to assert of a concept that it is instantiated. With such an assertion one expresses that there is a particular relation between the instantiating object and a rational subject - a particular mode of presentation for the object in question. By its very nature such a relation cannot be the property of an object and thus cannot be included in the concept of that object. Thus the ontological argument, which takes existence to be a part of the concept of the supreme being, cannot, according to Kant and Frege, succeed. A secondary goal of the paper is to illuminate what I take to be an important affinity between Kant's and Frege's views more generally: that Frege's fundamental distinction between the sense and the referent of a proposition echoes, in an important way, Kant's distinction between concepts and the formal principles for their application to experience.
Wittgenstein did not write very much on the topic of probability. The little we have comes from a few short pages of the Tractatus, some 'remarks' from the 1930s, and the informal conversations which went on during that decade with the Vienna Circle. Nevertheless, Wittgenstein's views were highly influential in the later development of the logical theory of probability. This paper will attempt to clarify and defend Wittgenstein's conception of probability against some oft-cited criticisms that stem from a misunderstanding of his views. Max Black, for instance, criticises Wittgenstein for formulating a theory of probability that is capable of being used only against the backdrop of the ideal language of the Tractatus. I argue that on the contrary, by appealing to the 'hypothetical laws of nature', Wittgenstein is able to make sense of probability statements involving propositions that have not been completely analysed. G.H. von Wright criticises Wittgenstein's characterisation of these very hypothetical laws. He argues that by introducing them Wittgenstein makes what is distinctive about his theory superfluous, for the hypothetical laws are directly inspired by statistical observations and hence these observations indirectly determine the mechanism by which the logical theory of probability operates. I argue that this is not the case at all, and that while statistical observations play a part in the formation of the hypothetical laws, these observations are necessary, but not sufficient, conditions for the introduction of these hypotheses.
There is a deeply entrenched view in philosophy and physics, the closed systems view, according to which isolated systems are conceived of as fundamental. On this view, when a system is under the influence of its environment, this is described in terms of a coupling between it and a separate system, where the two taken together are isolated. We argue against this view, and in favor of the alternative open systems view, on which systems interacting with their environment are conceived of as fundamental, and the environment's influence is represented via the dynamical equations that govern the system's evolution. Taking quantum theories of closed and open systems as our case study, and considering three alternative notions of fundamentality, namely (i) ontic fundamentality, (ii) epistemic fundamentality, and (iii) explanatory fundamentality, we argue that the open systems view is fundamental, and that this has important implications for the philosophy of physics, the philosophy of science, and for metaphysics.
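The contrast can be anchored in the standard dynamics of open quantum systems. In the GKLS (Lindblad) master equation, given here in its textbook form for orientation, the environment's influence enters only through the operators $L_k$ appearing in the system's own equation of motion, rather than through a second, explicitly modelled system:

\[
\frac{d\rho}{dt} \;=\; -\frac{i}{\hbar}\,[H, \rho]
\;+\; \sum_k \Bigl( L_k \rho L_k^{\dagger} \;-\; \tfrac{1}{2} \bigl\{ L_k^{\dagger} L_k ,\; \rho \bigr\} \Bigr) .
\]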
It is not clear, on the face of it, whether Thomas Hobbes's legal philosophy should be considered to be an early example of legal positivism or continuous with the natural-law tradition. On the one hand, Hobbes's command theory of law seems characteristically positivistic. On the other hand, his conception of the "law of nature," as binding on both sovereign and subject, seems to point more naturally toward a natural-law reading of his philosophy. Yet despite this seeming ambiguity, Hobbes scholars, for the most part, have placed him within the legal-positivist tradition. Indeed, Hobbes is usually regarded as the father of legal positivism. Recently, however, a growing number of commentators have begun to question this traditional classification. Although it is clear that Hobbes is not a natural lawyer of the same mold as Thomas Aquinas, it is nevertheless increasingly evident that the traditional characterization of Hobbes as a positivist in the same vein as Jeremy Bentham or John Austin is also incorrect. There are important natural-law aspects of Hobbes's view that one ignores only at the cost of a proper understanding of his theory of law.
Kant's arguments for the synthetic a priori status of geometry are generally taken to have been refuted by the development of non-Euclidean geometries. Recently, however, some philosophers have argued that, on the contrary, the development of non-Euclidean geometry has confirmed Kant's views, for since a demonstration of the consistency of non-Euclidean geometry depends on a demonstration of its equi-consistency with Euclidean geometry, one need only show that the axioms of Euclidean geometry have 'intuitive content' in order to show that both Euclidean and non-Euclidean geometry are bodies of synthetic a priori truths. Michael Friedman has argued that this defence presumes a polyadic conception of logic that was foreign to Kant. According to Friedman, Kant held that geometrical reasoning itself relies essentially on intuition, and that this precludes the very possibility of non-Euclidean geometry. While Friedman's characterization of Kant's views on geometrical reasoning is correct, I argue that Friedman's conclusion that non-Euclidean geometries are logically impossible for Kant is not. I argue that Kant is best understood as a proto-constructivist and that modern constructive axiomatizations (unlike Hilbert-style axiomatizations) of both Euclidean and non-Euclidean geometry capture Kant's views on the essentially constructive nature of geometrical reasoning well.
Of the many and varied applications of quantum information theory, perhaps the most fascinating is the sub-field of quantum computation. In this sub-field, computational algorithms are designed that utilise the resources available in quantum systems in order to compute solutions to computational problems with, in some cases, exponentially fewer resources than any known classical algorithm. While the fact of quantum computational speedup is almost beyond doubt, the source of quantum speedup is still a matter of debate. In this paper I argue that entanglement is a necessary component for any explanation of quantum speedup, and I address some purported counter-examples that some claim show that the contrary is true. In particular, I address Cleve et al.'s solution to Deutsch's problem, Biham et al.'s mixed-state version of the Deutsch-Jozsa algorithm, and Knill & Laflamme's 'deterministic quantum computation with one qubit' model of quantum computation. I argue that these examples do not demonstrate that entanglement is unnecessary for the explanation of quantum speedup, but that they rather illuminate and clarify the role that entanglement does play.
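To fix ideas about the algorithms under discussion, here is a minimal state-vector simulation of Deutsch's problem, the simplest case in which one quantum query does the work of two classical evaluations. This is an illustrative sketch with names of my choosing, not the constructions analysed in the paper:

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

def oracle(f) -> np.ndarray:
    """The unitary U_f |x>|y> = |x>|y XOR f(x)>, as a 4x4 permutation matrix."""
    U = np.zeros((4, 4))
    for x in (0, 1):
        for y in (0, 1):
            U[2 * x + (y ^ f(x)), 2 * x + y] = 1
    return U

def deutsch(f) -> str:
    """Decide whether f: {0,1} -> {0,1} is constant or balanced, using ONE query."""
    state = np.kron(H @ np.array([1.0, 0.0]),   # first qubit:  |0> -> |+>
                    H @ np.array([0.0, 1.0]))   # second qubit: |1> -> |->
    state = oracle(f) @ state                   # the single query to f
    state = np.kron(H, np.eye(2)) @ state       # interfere the first qubit
    prob_first_is_0 = state[0] ** 2 + state[1] ** 2
    return "constant" if np.isclose(prob_first_is_0, 1.0) else "balanced"

print(deutsch(lambda x: 0))      # constant
print(deutsch(lambda x: 1))      # constant
print(deutsch(lambda x: x))      # balanced
print(deutsch(lambda x: 1 - x))  # balanced
```

The trick is phase kickback: the query writes $(-1)^{f(x)}$ onto each branch $|x\rangle$, and the final Hadamard converts that relative phase into a deterministically measurable difference.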
We use Bub's (2016) correlation arrays and Pitowsky's (1989b) correlation polytopes to analyze an experimental setup due to Mermin (1981) for measurements on the singlet state of a pair of spin-1/2 particles. The class of correlations allowed by quantum mechanics in this setup is represented by an elliptope inscribed in a non-signaling cube. The class of correlations allowed by local hidden-variable theories is represented by a tetrahedron inscribed in this elliptope. We extend this analysis to pairs of particles of arbitrary spin. The class of correlations allowed by quantum mechanics is still represented by the elliptope; the subclass of those allowed by local hidden-variable theories by polyhedra with increasing numbers of vertices and facets that get closer and closer to the elliptope. We use these results to advocate for an interpretation of quantum mechanics like Bub's. Probabilities and expectation values are primary in this interpretation. They are determined by inner products of vectors in Hilbert space. Such vectors do not themselves represent what is real in the quantum world. They encode families of probability distributions over values of different sets of observables. As in classical theory, these values ultimately represent what is real in the quantum world. Hilbert space puts constraints on possible combinations of such values, just as Minkowski space-time puts constraints on possible spatio-temporal constellations of events. Illustrating how generic such constraints are, the equation for the elliptope derived in this paper is a general constraint on correlation coefficients that can be found in older literature on statistics and probability theory. Yule (1896) already stated the constraint. De Finetti (1937) already gave it a geometrical interpretation.
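The constraint in question is the requirement that a $3 \times 3$ correlation matrix have non-negative determinant:

\[
1 \;+\; 2\,\rho_{12}\,\rho_{13}\,\rho_{23}
\;-\; \rho_{12}^{2} \;-\; \rho_{13}^{2} \;-\; \rho_{23}^{2} \;\geq\; 0 ,
\]

which must hold for the pairwise correlation coefficients $\rho_{ij}$ of any three random variables; the boundary of the region this inequality carves out of the cube $[-1, 1]^3$ is the elliptope.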
I give a defense of the Massive Modularity hypothesis: the view that the mind is composed of discrete, encapsulated, informationally isolated computational structures dedicated to particular problem domains. This view contrasts with Psychological Rationalism: the view that mental structures take the form of unencapsulated representational items, all available as inputs to one domain-general computational processor. I argue that although Psychological Rationalism is in principle able to overcome the 'intractability objection', it must borrow many features of a massively modular architecture in order to do so; that although it can, in principle, overcome the 'optimality objection', the way it does so does not correlate with the way we think; and that although it can, in principle, respond to the 'argument from biology', it cannot do so without advancing an unrealistic and unsupported account of cognitive evolution.
In this paper I explain and defend the content and justification of John Rawls's conception of human rights, as he outlines it in his major work, The Law of Peoples. I focus, in particular, on the criticisms of Allen Buchanan. Buchanan distinguishes four lines of argument that Rawls uses to derive what, according to Buchanan, is a 'lean' list of human rights: the Political Conception Argument, the Associationist Argument, the Cooperation Argument, and finally the Functionalist Argument. In each case Buchanan proceeds to show how the premises of Rawls's argument lead to absurd consequences if taken to their logical conclusion. It can be shown, however, that the reason these consequences follow is that Buchanan misunderstands and misrepresents Rawls's premises.