The science of metrology characterizes the concept of precision in exceptionally loose and open terms. That is because the details of the concept must be filled in—what I call narrowing of the concept—in ways that are sensitive to the details of a particular measurement or measurement system and its use. Since these details can never be filled in completely, the concept of the actual precision of an instrument system must always retain some of the openness of its general characterization. The idea that there is something that counts as the actual precision of a measurement system must therefore always remain an idealization, a conclusion that would appear to hold very broadly for terms and the concepts they express.
Knowledge requires truth, and truth, we suppose, involves unflawed representation. Science does not provide knowledge in this sense but rather provides models, representations that are limited in their accuracy, precision, or, most often, both. Truth as we usually think of it is an idealization, one that serves wonderfully in most ordinary applications, but one that can terribly mislead for certain issues in philosophy. This article sketches how this happens for five important issues, thereby showing how philosophical method must take into account the idealized nature of our familiar conception of truth.
This paper examines and finds wanting the arguments against van Fraassen's voluntarism, the view that the only constraint of rationality is consistency. Foundationalists claim that if we have no grounds or rationale for a belief or rule, rationality demands that we suspend it. But that begs the question by assuming that there have to be grounds or a rationale. Instead of asking why we should hold a basic belief or rule, the question has to be: why should we not be committed as we are? Within a system we can sometimes find internal reasons. But, short of assuming foundationalism from the outset, when it comes to our evolving system as a whole there are no grounds for abandoning the commitments that we experience so strongly. Along the way the paper develops a systematic way of talking about terms that cause confusion because of variation in usage: foundationalism, relativism, basic beliefs and rules, voluntarism, etc.
The traditional way of thinking about science goes back to the corpuscular philosophy with its micro-reductive mechanism and metaphor of reading God's Book of Nature. This "story-1," with its rhetoric of exact truths, contrasts with "story-2," which describes science as a continuation of the always imperfect powers of representation given to us by evolution. On story-2, reduction is one among other knowledge-fashioning strategies and shares the imperfections of all human knowledge. When we appreciate that human knowledge always admits of refinement, what appear as "emergent properties" no longer seem mysterious.
This essay endorses the conclusion of Sklar’s “Dappled Theories in a Uniform World” that he announces in his abstract: that, notwithstanding recent attacks, foundational theories are universal in their scope. But Sklar’s rejection of a “pluralist ontology” is questioned. It is concluded that so-called “foundational” and “phenomenological” theories are on a much more equal footing as sources of knowledge than Sklar would allow, that “giving an ontology” generally involves dealing in idealizations, and that a transfigured “fictionalism” provides an (in many respects) better model of scientific knowledge than the model of “foundational truths.”
There are few, perhaps no known, exact, true, general laws. Some of the work of generalization is carried by ceteris paribus generalizations. I suggest that many models continue such work in more complex form, with the idea of ceteris paribus conditions thought of as extended to more general conditions of application. I use the term regularity guide to refer collectively to cp‐generalizations and such regularity‐purveying models. Laws in the traditional sense can then be thought of as idealizations, which idealize away from the conditions of application of regularity guides. If we keep clearly in mind the status of laws as such idealizations, problems surrounding traditional topics—such as lawlikeness, corresponding counterfactuals and modality—no longer look to be intractable.
In this paper I will set out my understanding of Bas van Fraassen’s constructive empiricism, some of the difficulties which I believe beset the current version, and, very briefly, some valuable lessons I believe are nonetheless to be learned by considering this view. We’ll need to begin with a review of how van Fraassen conceives of this kind of discussion.
This paper examines the so-called "gauge argument" sometimes used by physicists to motivate the introduction of gauge fields, here facilitated by an informal exposition of the fiber bundle formalism. The discussion suggests some preliminary ways of understanding the connection between gauge fields and interactions.
Darrin Belousek has argued that the indistinguishability of quantum particles is conventional “in the Duhemian–Einsteinian sense,” in part by critically examining prior arguments given by Redhead and Teller. Belousek's discussion provides a useful occasion to clarify some of those arguments, acknowledge respects in which they were misleading, and comment on how they can be strengthened. We also comment briefly on the relevant sense of “conventional.”
Huggett and Weingard's critical review provides an opportunity to continue the interpretive examination of quantum field theory in terms of some specific issues as well as comparison of alternative approaches to the subject. This note recasts their example of inequivalent Fock spaces in an effort to further clarify what it illustrates. Questions are addressed about the role of analogy in developing quantum field theory and about the conflict between formal vs. concrete methods in both physics and its interpretation, continuing the well-known historical debate between Pierre Duhem and James Clerk Maxwell. Huggett and Weingard's examination very usefully occasions clarification on some points of exposition which, it is hoped, will make An Interpretive Introduction to Quantum Field Theory a more useful resource for understanding this subject.
We extend the work of French and Redhead by further examining the relation of quantum statistics to the assumption that quantum entities have the sort of identity generally assumed for physical objects, more specifically an identity which makes them susceptible to being thought of as conceptually individuatable and labelable even though they cannot be experimentally distinguished. We also further examine the relation of such hypothesized identity of quantum entities to the Principle of the Identity of Indiscernibles. We conclude that although such an assumption of identity is consistent with the facts of quantum statistics, methodological considerations show that we should take quantum entities to be entirely unindividuatable, in the way suggested by a Fock space description.
The practice of describing multiparticle quantum systems in terms of labeled particles indicates that we think of quantum entities as individuatable. The labels, together with particle indistinguishability, create the need for symmetrization or antisymmetrization (or, in principle, higher-order symmetries), which in turn results in “surplus formal structure” in the formalism, formal structure which corresponds to nothing in the real world. We argue that these facts show quanta to be unindividuatable entities, things in principle incapable of supporting labels, and so things which support no factual difference if two of them are thought of as being switched. When thinking of the metaphysics of quanta, we should eschew the misleading labels of the tensor product Hilbert space formalism and prefer the ontologically more faithful description of the Fock space formalism. This conception eliminates puzzles about the quantum statistics of bosons.
This paper digests technical commonplaces of quantum field theory to present an informal interpretation of the theory by emphasizing its connections with the harmonic oscillator. The resulting "harmonic oscillator interpretation" enables newcomers to the subject to get some intuitive feel for the theory. The interpretation clarifies how the theory relates to observation and to quantum mechanical problems connected with observation. Finally the interpretation moves some way towards helping us see what the theory comes to physically. The paper also argues that, in important respects, interpretive problems of quantum field theory are problems we know well from conventional quantum mechanics. An important exception concerns extending the puzzles surrounding the superposition of properties in conventional quantum mechanics to an exactly parallel notion of superposition of particles. Conventional quantum mechanics seems incompatible with a classical notion of property on which all quantities always have definite values. Quantum field theory presents an exactly analogous problem with saying that the number of "particles" is always definite.
In quantum field theory divergent expressions are "discarded", leaving finite expressions which provide the best predictions anywhere in science. In fact, this "renormalization procedure" involves no mystery or illegitimate operations. This paper explains, in terms accessible to non-experts, how the procedure really works and explores some different ways in which physicists have suggested that one understand it.
Previous work has shown that the problem of measurement in quantum mechanics is not correctly seen as one of understanding some allegedly univocal process of measurement in nature which corresponds to the projection postulate. The present paper introduces a new perspective by showing that how we are to understand the nature of the change of quantum mechanical state on measurement depends very sensitively on the interpretation of the state function, and by showing how attention to this dependence can greatly sharpen the problems and relations between them. In particular, the problems take a form resembling their traditional formulation only on an inexact value interpretation, according to which the state function attributes inexact values of quantities to systems. On other interpretations we can apply (with various drawbacks) the subensemble idea, according to which a discontinuous change of quantum mechanical description results on measurement simply because we need a new state function to describe a new object.
If we take the state function of quantum mechanics to describe belief states, arguments by Stairs and Friedman-Putnam show that the projection postulate may be justified as a kind of minimal change. But if the state function takes on a physical interpretation, it provides no more than what I call a fortuitous approximation of physical measurement processes, that is, an unsystematic form of approximation which should not be taken to correspond to some one univocal "measurement process" in nature. This fact suggests that the projection postulate does not provide a proper locus for interpretive investigation. Readers will also find section 3's analysis of fortuitous approximations of independent interest and presented without the perils of quantum mechanics.
In response to Cushing it is urged that the vicissitudes of quantum field theory do not press towards a nonrealist attitude towards the theory as strongly as he suggests. A variety of issues which Redhead raises are taken up, including photon localizability, the wave-particle distinction in the classical limit, and the interpretation of quantum statistics, vacuum fluctuations, virtual particles, and creation and annihilation operators. It is urged that quantum field theory harbors an unacknowledged inconsistency connected with the fact that the zero point energy has observable consequences, while to avoid infinities it must be "thrown away". Finally, Redhead's conception of ephemerals is pressed and the paper concludes with the suggestion that the particle concept largely drops out of quantum field theory.
This article explains why Bohr does not need to discuss the projection postulate or the "problem of measurement". Beginning with a thumbnail sketch of Bohr's general views (which should serve as an efficient introduction to Bohr for newcomers), it is argued that Bohr interprets the state function as giving a statistical summary of experimental outcomes. Against the objection that Bohr was too much a microrealist to endorse such an instrumentalist statistical interpretation it is suggested that he rejected the issue of microrealism as not well formed. It is shown that on his statistical interpretation Bohr does not need the projection postulate or face the usual problem of measurement.
In the contemporary discussion of hidden variable interpretations of quantum mechanics, much attention has been paid to the “no hidden variable” proof contained in an important paper of Kochen and Specker. It is a little noticed fact that Bell published a proof of the same result the preceding year, in his well-known 1966 article, where it is modestly described as a corollary to Gleason's theorem. We want to bring out the great simplicity of Bell's formulation of this result and to show how it can be extended in certain respects.