One can give a strong sense to the idea that a relation does not 'reduce' to non-relational properties by saying that a relation does not supervene upon the non-relational properties of its relata. That there are such inherent relations I call the doctrine of relational holism, a doctrine which seems to conflict with traditional ideas about physicalism. At least parts of classical physics seem to be free of relational holism, but quantum mechanics, on at least some interpretations, incorporates the doctrine in an all pervasive way.
Quantum mechanics is a subject that has captured the imagination of a surprisingly broad range of thinkers, including many philosophers of science. Quantum field theory, however, is a subject that has been discussed mostly by physicists. This is the first book to present quantum field theory in a manner that makes it accessible to philosophers. Because it presents a lucid view of the theory and the debates that surround it, An Interpretive Introduction to Quantum Field Theory will interest students of physics as well as students of philosophy.

Paul Teller presents the basic ideas of quantum field theory in a way that is understandable to readers who are familiar with non-relativistic quantum mechanics. He provides information about the physics of the theory without calculational detail, and he shows readers how to think about the theory physically. Along the way, he dismantles some popular myths and clarifies the novel ways in which quantum field theory is both a theory about fields and a theory about particles. His goal is to raise questions about the philosophical implications of the theory and to offer some tentative interpretive views of his own. This provocative and thoughtful book challenges philosophers to extend their thinking beyond the realm of quantum mechanics, and it challenges physicists to consider the philosophical issues that their explorations have encouraged.
We extend the work of French and Redhead [1988] further examining the relation of quantum statistics to the assumption that quantum entities have the sort of identity generally assumed for physical objects, more specifically an identity which makes them susceptible to being thought of as conceptually individuatable and labelable even though they cannot be experimentally distinguished. We also further examine the relation of such hypothesized identity of quantum entities to the Principle of the Identity of Indiscernibles. We conclude that although such an assumption of identity is consistent with the facts of quantum statistics, methodological considerations show that we should take quantum entities to be entirely unindividuatable, in the way suggested by a Fock space description.
Nature is complex, exceedingly so. A repercussion of this “complex world constraint” is that it is, in practice, impossible to connect words to the world in a foolproof manner. In this paper I explore the ways in which the complex world constraint makes vagueness, or more generally imprecision, in language in practice unavoidable, illuminates what vagueness comes to, and guides us to a sensible way of thinking about truth. Along the way we see that the problem of ceteris paribus laws is exactly the problem of vagueness and susceptible to similar treatment.
In this paper I will set out my understanding of Bas van Fraassen’s constructive empiricism, some of the difficulties which I believe beset the current version, and, very briefly, some valuable lessons I believe are nonetheless to be learned by considering this view. We’ll need to begin with a review of how van Fraassen conceives of this kind of discussion.
The practice of describing multiparticle quantum systems in terms of labeled particles indicates that we think of quantum entities as individuatable. The labels, together with particle indistinguishability, create the need for symmetrization or antisymmetrization (or, in principle, higher-order symmetries), which in turn results in “surplus formal structure” in the formalism, formal structure which corresponds to nothing in the real world. We argue that these facts show quanta to be unindividuatable entities, things in principle incapable of supporting labels, and so things which support no factual difference if two of them are thought of as being switched. When thinking of the metaphysics of quanta, we should eschew the misleading labels of the tensor product Hilbert space formalism and prefer the ontologically more faithful description of the Fock space formalism. This conception eliminates puzzles about the quantum statistics of bosons.
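The counting difference that the Fock space description makes vivid can be sketched with a small enumeration (an illustrative aside of my own, not part of the original abstract): with two particles and two available single-particle states, labeled-particle counting yields four distinct states, while occupation-number counting, as in Fock space, yields three bosonic states (and one fermionic state).

```python
from itertools import product, combinations, combinations_with_replacement

states = ["a", "b"]  # two available single-particle states
n = 2                # two particles

# Labeled particles: each ordered assignment counts as a distinct state
labeled = list(product(states, repeat=n))

# Fock-space (occupation-number) counting: only how many particles
# occupy each state matters, so ("a", "b") and ("b", "a") coincide
bosons = list(combinations_with_replacement(states, n))

# Fermions: no two particles may occupy the same state
fermions = list(combinations(states, n))

print(len(labeled), len(bosons), len(fermions))  # prints: 4 3 1
```

On the labeled picture the mixed configuration is counted twice, which is why classical (Maxwell–Boltzmann) statistics differ from Bose–Einstein statistics; the occupation-number description builds the "no factual difference under switching" point directly into the state space.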
This paper challenges “traditional measurement-accuracy realism”, according to which there are in nature quantities of which concrete systems have definite values. An accurate measurement outcome is one that is close to the value for the quantity measured. For a measurement of the temperature of some water to be accurate in this sense requires that there be this temperature. But there isn’t. Not because there are no quantities “out there in nature” but because the term ‘the temperature of this water’ fails to refer owing to idealization and failure of specificity in picking out concrete cases. The problems can be seen as an artifact of vagueness, and so doing facilitates applying Eran Tal’s robustness account of measurement accuracy to suggest an attractive way of understanding vagueness in terms of the function of idealization, a way that sidesteps the problems of higher order vagueness and that shows how idealization provides a natural generalization of what it is to be vague.
I hope to show that supervenience and determination, as I have here intuitively characterized them, are really different expressions of the same core idea, which one may make more precise in a great number of different ways, depending on the interpretation one puts on the catchall parameters “cases”, “truth of kind P”, and “truth of kind S”.
Ronald Giere has argued that at its best science gives us knowledge only from different “perspectives,” but that this knowledge still counts as scientific realism. Others have noted that his “perspectival realism” is in tension with scientific realism as traditionally understood: how can different, even conflicting, perspectives give us knowledge of what there really is? This essay outlines a program that makes good on Giere’s idea with a fresh understanding of “realism” that eases this tension.
This essay endorses the conclusion of Sklar’s “Dappled Theories in a Uniform World” that he announces in his abstract, that notwithstanding recent attacks foundational theories are universal in their scope. But Sklar’s rejection of a “pluralist ontology” is questioned. It is concluded that so-called “foundational” and “phenomenological” theories are on a much more equal footing as sources of knowledge than Sklar would allow, that “giving an ontology” generally involves dealing in idealizations, and that a transfigured “fictionalism” provides an (in many respects) better model of scientific knowledge than the model of “foundational truths.”
There are few, perhaps no known, exact, true, general laws. Some of the work of generalization is carried by ceteris paribus generalizations. I suggest that many models continue such work in more complex form, with the idea of ceteris paribus conditions thought of as extended to more general conditions of application. I use the term regularity guide to refer collectively to cp‐generalizations and such regularity‐purveying models. Laws in the traditional sense can then be thought of as idealizations, which idealize away from the conditions of application of regularity guides. If we keep clearly in mind the status of laws as such idealizations, problems surrounding traditional topics—such as lawlikeness, corresponding counterfactuals and modality—no longer look to be intractable.
The book is drawn from the Tarner lectures, delivered in Cambridge in 1993. It is concerned with the ultimate nature of reality, and how this is revealed by modern physical theories such as relativity and quantum theory. The objectivity and rationality of science are defended against the views of relativists and social constructionists. It is claimed that modern physics gives us a tentative and fallible, but nevertheless rational, approach to the nature of physical reality. The role of subjectivity in science is examined in the fields of relativity theory, statistical mechanics and quantum theory, and recent claims of an essential role for human consciousness in physics are rejected. Prospects for a 'Theory of Everything' are considered, and the related question of how to assess scientific progress is carefully examined.
The science of metrology characterizes the concept of precision in exceptionally loose and open terms. That is because the details of the concept must be filled in—what I call narrowing of the concept—in ways that are sensitive to the details of a particular measurement or measurement system and its use. Since these details can never be filled in completely, the concept of the actual precision of an instrument system must always retain some of the openness of its general characterization. The idea that there is something that counts as the actual precision of a measurement system must therefore always remain an idealization, a conclusion that would appear to hold very broadly for terms and the concepts they express.
Many in philosophy understand truth in terms of precise semantic values, true propositions. Following Braun and Sider, I say that in this sense almost nothing we say is, literally, true. I take the stand that this account of truth nonetheless constitutes a vitally useful idealization in understanding many features of the structure of language. The Fregean problem discussed by Braun and Sider concerns issues about application of language to the world. In understanding these issues I propose an alternative modeling tool summarized in the idea that inaccuracy of statements can be accommodated by their imprecision. This yields a pragmatist account of truth, but one not subject to the usual counterexamples. The account can also be viewed as an elaborated error theory. The paper addresses some prima facie objections and concludes with implications for how we address certain problems in philosophy.
This paper examines the so-called "gauge argument" sometimes used by physicists to motivate the introduction of gauge fields, here facilitated by an informal exposition of the fiber bundle formalism. The discussion suggests some preliminary ways of understanding the connection between gauge fields and interactions.
I modify and generalize Carnap’s notion of frameworks as a way of unpacking Goodman’s metaphor of “making worlds with symbols”. My frameworks provide, metaphorically, a way of making worlds out of symbols inasmuch as all our framework-bound access to the world is through frameworks that always stand to be improved in accuracy, precision, and usually both. Such improvement is characterized in pragmatist terms.
Knowledge requires truth, and truth, we suppose, involves unflawed representation. Science does not provide knowledge in this sense but rather provides models, representations that are limited in their accuracy, precision, or, most often, both. Truth as we usually think of it is an idealization, one that serves wonderfully in most ordinary applications, but one that can terribly mislead for certain issues in philosophy. This article sketches how this happens for five important issues, thereby showing how philosophical method must take into account the idealized nature of our familiar conception of truth.
This paper examines and finds wanting the arguments against van Fraassen's voluntarism, the view that the only constraint of rationality is consistency. Foundationalists claim that if we have no grounds or rationale for a belief or rule, rationality demands that we suspend it. But that begs the question by assuming that there have to be grounds or a rationale. Instead of asking why we should hold a basic belief or rule, the question has to be: why should we not be committed as we are? Within a system we can sometimes find internal reasons. But, short of assuming foundationalism from the outset, when it comes to our evolving system as a whole there are no grounds for abandoning the commitments that we experience so strongly. Along the way the paper develops a systematic way of talking about terms that cause confusion because of variation in usage: foundationalism, relativism, basic beliefs and rules, voluntarism, etc.
In the contemporary discussion of hidden variable interpretations of quantum mechanics, much attention has been paid to the “no hidden variable” proof contained in an important paper of Kochen and Specker. It is a little noticed fact that Bell published a proof of the same result the preceding year, in his well-known 1966 article, where it is modestly described as a corollary to Gleason's theorem. We want to bring out the great simplicity of Bell's formulation of this result and to show how it can be extended in certain respects.
Bogen and Woodward argued for an indirect connection between data and theory in terms of their conception of “phenomena.” I outline and elaborate on their presentation. To illuminate the connection with contemporary thinking in terms of models, I distinguish between phenomena tokens, representations of which can be identified with data models, and phenomena types that can be identified with relatively low-lying models or aspects of models in the model hierarchy. Throughout I stress the role of idealization in these considerations.
In quantum field theory divergent expressions are "discarded", leaving finite expressions which provide the best predictions anywhere in science. In fact, this "renormalization procedure" involves no mystery or illegitimate operations. This paper explains, in terms accessible to non-experts, how the procedure really works and explores some different ways in which physicists have suggested that one understand it.
If we take the state function of quantum mechanics to describe belief states, arguments by Stairs and Friedman-Putnam show that the projection postulate may be justified as a kind of minimal change. But if the state function takes on a physical interpretation, it provides no more than what I call a fortuitous approximation of physical measurement processes, that is, an unsystematic form of approximation which should not be taken to correspond to some one univocal "measurement process" in nature. This fact suggests that the projection postulate does not provide a proper locus for interpretive investigation. Readers will also find section 3's analysis of fortuitous approximations of independent interest and presented without the perils of quantum mechanics.
Darrin Belousek has argued that the indistinguishability of quantum particles is conventional “in the Duhemian–Einsteinian sense,” in part by critically examining prior arguments given by Redhead and Teller. Belousek's discussion provides a useful occasion to clarify some of those arguments, acknowledge respects in which they were misleading, and comment on how they can be strengthened. We also comment briefly on the relevant sense of “conventional.”