Recent work by Robert Batterman and Alexander Rueger has brought attention to cases in physics in which governing laws at the base level “break down” and singular limit relations obtain between base- and upper-level theories. As a result, they claim, these are cases with emergent upper-level properties. This paper contends that this inference—from singular limits to explanatory failure, novelty or irreducibility, and then to emergence—is mistaken. The van der Pol nonlinear oscillator is used to show that there can be a full explanation of upper-level properties entirely in base-level terms even when singular limits are present. Whether upper-level properties are emergent depends not on the presence of a singular limit but rather on details of the ampliative approximation methods used. The paper suggests that focusing on explanatory deficiency at the base level is key to understanding emergence in physics.
Many explanations in physics rely on idealized models of physical systems. These explanations fail to satisfy the conditions of standard normative accounts of explanation. Recently, some philosophers have claimed that idealizations can be used to underwrite explanation nonetheless, but only when they are what have variously been called representational, Galilean, controllable or harmless idealizations. This paper argues that such a half-measure is untenable and that idealizations not of this sort can have explanatory capacities.
A common methodological adage holds that diverse evidence better confirms a hypothesis than does the same amount of similar evidence. Proponents of Bayesian approaches to scientific reasoning such as Horwich, Howson and Urbach, and Earman claim to offer both a precise rendering of this maxim in probabilistic terms and an explanation of why the maxim should be part of the methodological canon of good science. This paper contends that these claims are mistaken and that, at best, Bayesian accounts of diverse evidence are crucially incomplete. This failure should lend renewed force to a long-neglected global worry about Bayesian approaches.
Quantum field theory provides the framework for many fundamental theories in modern physics, and over the last few years there has been growing interest in its historical and philosophical foundations. This anthology on the foundations of QFT brings together 15 essays by well-known researchers in physics, the philosophy of physics, and analytic philosophy. Many of these essays were first presented as papers at the conference “Ontological Aspects of Quantum Field Theory”, held at the Zentrum für interdisziplinäre Forschung, Bielefeld, Germany. The essays contain cutting-edge work on ontological aspects of QFT, including: the role of measurement and experimental evidence, corpuscular versus field-theoretic interpretations of QFT, the interpretation of gauge symmetry, and localization. This book is ideally suited to anyone with an interest in the foundations of quantum physics, including physicists, philosophers and historians of physics, as well as general readers interested in philosophy or science.
The goal of this paper is to show how scientific explanation functions in the context of idealized models. It argues that the aspect of explanation most urgently requiring investigation is the nature of the connection between global theories and explanatory local models. This aspect is neglected in traditional accounts of explanation. The paper examines causal, minimal model, and structural accounts of model-based explanation. It argues that they too fail to offer an account of the connection with global theory that can justify the explanatory power of an idealized local model, and consequently these accounts are unable effectively to distinguish explanatory from non-explanatory models. On the account proposed here, scientific explanation requires theoretical integration between the local model described in the explanation and a global theory with independent explanatory power.
Many philosophers now regard causal approaches to explanation as highly promising, even in physics. This is due in large part to James Woodward's influential argument that a wide variety of scientific explanations are causal, based on his interventionist approach to causation. This article argues that some derivations describing causal relations and satisfying Woodward's criteria for causal explanation fail to be explanatory. Further, causal relations are unnecessary for a range of explanations, widespread in physics, involving highly idealized models. These constitute significant limitations on the scope of causal explanation. We have good reason to doubt that causal explanation is as widespread or important in physics as Woodward and other proponents maintain.
What does it mean to say that a scientific theory is unified? Prominent attempts by John Watkins, Philip Kitcher, and Margaret Morrison to answer this question face serious difficulties, and many analysts of science remain pessimistic about the possibility of ever rendering precise or explaining what theoretical unity consists in. This paper gives grounds for optimism, offering a novel account of the concept of unification. This account is tested against a detailed study of the standard model in contemporary high-energy physics, a theory for which a high degree of unity has repeatedly been claimed.
This paper begins by tracing interest in emergence in physics to the work of condensed matter physicist Philip Anderson. It provides a selective introduction to contemporary philosophical approaches to emergence. It surveys two exciting areas of current work that give good reason to re-evaluate our views about emergence in physics. One area focuses on physical systems wherein fundamental theories appear to break down. The other area is the quantum-to-classical transition, where some have claimed that a complete explanation of the behaviors and features of the objects of classical physics entirely in quantum terms is now within our grasp. We suggest that the most useful way to approach the emergent/non-emergent distinction is in epistemic terms, and more specifically that the failure of reductive explanation is constitutive of emergence in physics.
Field theories have been central to physics over the last 150 years, and there are several theories in contemporary physics in which physical fields play key causal and explanatory roles. This paper proposes a novel field trope-bundle (FTB) ontology, on which fields are composed of bundles of particularized property instances, called tropes, and goes on to describe some virtues of this ontology. It begins with a critical examination of the dominant view about the ontology of fields, that fields are properties of a substantial substratum.
This paper explores the role of physically impossible idealizations in model-based explanation. We do this by examining the explanation of gravitational waves from distant stellar objects using models that contain point-particle idealizations. Like infinite idealizations in thermodynamics, biology and economics, the point-particle idealization in general relativity is physically impossible. What makes this case interesting is that there are two very different kinds of models used for predicting the same gravitational wave phenomena, post-Newtonian models and effective field theory models. The paper contends that post-Newtonian models are explanatory while effective field theory models are not, because only in the former can we eliminate the physically impossible point-particle idealization. This suggests that, in some areas of science at least, models invoking ineliminable infinite idealizations cannot have explanatory power.
The Reeh-Schlieder theorem asserts the vacuum and certain other states to be spacelike superentangled relative to local quantum fields. This motivates an inquiry into the physical status of various concepts of localization. It is argued that a covariant generalization of Newton-Wigner localization is a physically illuminating concept. When analyzed in terms of nonlocally covariant quantum fields, creating and annihilating quanta in Newton-Wigner localized states, the vacuum is seen to not possess the spacelike superentanglement that the Reeh-Schlieder theorem displays relative to local fields, and to be locally empty as well as globally empty. Newton-Wigner localization is then shown to be physically interpretable in terms of a covariant generalization of the center of energy, the two localizations being identical if the system has no internal angular momentum. Finally, some of the counterintuitive features of Newton-Wigner localization are shown to have close analogues in classical special relativity.
This paper presents two interpretations of the fiber bundle formalism that is applicable to all gauge field theories. The constructionist interpretation yields a substantival spacetime. The analytic interpretation yields a structural spacetime, a third option besides the familiar substantivalism and relationalism. That the same mathematical formalism can be derived in two different ways leading to two different ontological interpretations reveals the inadequacy of pure formal arguments.
A signal development in contemporary physics is the widespread use, in explanatory contexts, of highly idealized models. This paper argues that some highly idealized models in physics have genuine explanatory power, and it extends the explanatory role for such idealizations beyond the scope of previous philosophical work. It focuses on idealizations of nonlinear oscillator systems.
Recent work on emergence in physics has focused on the presence of singular limit relations between basal and upper-level theories as a criterion for emergence. However, over-emphasis on the role of singular limit relations has somewhat obscured what it means to say that a property or behaviour is emergent. This paper argues that singular limits are not central to emergence and develops an alternative account of emergence in terms of the failure of basal explainability. As a consequence, emergence and reduction, long held to be two sides of the same coin in the emergentist tradition, are largely decoupled.
Philosophers of science have long been concerned with questions about the unity of science. In the 1980s, influential work by Clark Glymour, Michael Friedman, John Watkins, and Philip Kitcher articulated general accounts of theory unification that attempted to underwrite a connection between unification, truth, and understanding. According to the ‘unifiers,’ as we may call them, a theory is unified to the extent that it has a small theoretical structure relative to the domain of phenomena it covers, and there are general syntactic criteria that allow one to determine how unified a theory is. The explanatory power of a theory, and the understanding of nature it gives us, is a direct consequence of this unity. Moreover, the more unified a theory is, the better confirmed it will be, and under some conditions a theory’s unity can justify realism about unobservable entities posited by it. In the 1990s disunity became the dominant theme, with books such as John Dupré’s The Disorder of Things and the Galison and Stump anthology arguing that it is a mistake to view science as a unified practice and that rather than an epistemic virtue, unification in science is a metaphysical vice.
This discussion provides a brief commentary on each of the papers presented in the symposium on the conceptual foundations of field theories in physics. In Section 2 I suggest an alternative to Paul Teller's (1999) reading of the gauge argument that may help to solve, or dissolve, its puzzling aspects. In Section 3 I contend that Sunny Auyang's (1999) arguments against substantivalism and for "objectivism" in the context of gauge field theories face serious worries. Finally, in Section 4 I claim that Gordon Fleming's (1999) proposal for hyperplane-dependent Newton-Wigner fields differs importantly from his previous arguments about hyperplane-dependent properties in quantum mechanics.
Nick Huggett and Robert Weingard (1994) have recently proposed a novel approach to interpreting field theories in physics, one which makes central use of the fact that a field generally has an infinite number of degrees of freedom in any finite region of space it occupies. Their characterization, they argue, (i) reproduces our intuitive categorizations of fields in the classical domain and thereby (ii) provides a basis for arguing that the quantum field is a field. Furthermore, (iii) it accomplishes these tasks better than does a well-known rival approach due to Paul Teller (1990, 1995). This paper contends that all three of these claims are mistaken, and suggests that Huggett and Weingard have not shown how counting degrees of freedom provides any insight into the interpretation or the formal properties of field theories in physics.