Scientific models invariably involve some degree of idealization, abstraction, or fictionalization of their target system. Nonetheless, I argue that there are circumstances under which such false models can offer genuine scientific explanations. After reviewing three different proposals in the literature for how models can explain, I shall introduce a more general account of what I call model explanations, which specifies the conditions under which models can be counted as explanatory. I shall illustrate this new framework by applying it to the case of Bohr's model of the atom, and conclude by drawing some distinctions between phenomenological models, explanatory models, and fictional models.
Despite widespread evidence that fictional models play an explanatory role in science, resistance remains to the idea that fictions can explain. A central source of this resistance is a particular view about what explanations are, namely, the ontic conception of explanation. According to the ontic conception, explanations just are the concrete entities in the world. I argue this conception is ultimately incoherent and that even a weaker version of the ontic conception fails. Fictional models can succeed in offering genuine explanations by correctly capturing relevant patterns of counterfactual dependence and licensing correct inferences. Using the example of Newtonian force explanations of the tides, I show how, even in science, fiction can be a vehicle for truth.
Classical mechanics and quantum mechanics are two of the most successful scientific theories ever discovered, and yet how they can describe the same world is far from clear: one theory is deterministic, the other indeterministic; one theory describes a world in which chaos is pervasive, the other a world in which chaos is absent. Focusing on the exciting field of 'quantum chaos', this book reveals that there is a subtle and complex relation between classical and quantum mechanics. It challenges the received view that classical and quantum mechanics are incommensurable, and revives another, largely forgotten tradition due to Niels Bohr and Paul Dirac. By artfully weaving together considerations from the history of science, philosophy of science, and contemporary physics, this book offers a new way of thinking about intertheory relations and scientific explanation. It will be of particular interest to historians and philosophers of science, philosophically inclined physicists, and interested non-specialists.
There is a growing recognition that fictions have a number of legitimate functions in science, even when it comes to scientific explanation. However, the question then arises, what distinguishes an explanatory fiction from a nonexplanatory one? Here I examine two cases—one in which there is a consensus in the scientific community that the fiction is explanatory and another in which the fiction is not explanatory. I shall show how my account of "model explanations" is able to explain this asymmetry, and argue that realism—of a more subtle form—does have a role in distinguishing explanatory from nonexplanatory fictions.
In 2012, the Geological Time Scale, which sets the temporal framework for studying the timing and tempo of all major geological, biological, and climatic events in Earth’s history, had one-quarter of its boundaries moved in a widespread revision of radiometric dates. The philosophy of metrology helps us understand this episode, and it, in turn, elucidates the notions of calibration, coherence, and consilience. I argue that coherence testing is a distinct activity preceding calibration and consilience, and I highlight the value of discordant evidence and trade-offs scientists face in calibration. The iterative nature of calibration, moreover, raises the problem of legacy data.
In semiclassical mechanics one finds explanations of quantum phenomena that appeal to classical structures. These explanations are prima facie problematic insofar as the classical structures they appeal to do not exist. Here I defend the view that fictional structures can be genuinely explanatory by introducing a model-based account of scientific explanation. Applying this framework to the semiclassical phenomenon of wavefunction scarring, I argue that not only can the fictional classical trajectories explain certain aspects of this quantum phenomenon, but also that an explanation that does not make reference to these classical structures is, in a certain sense, deficient.
The ontic conception of explanation, according to which explanations are "full-bodied things in the world," is fundamentally misguided. I argue instead for what I call the eikonic conception, according to which explanations are the product of an epistemic activity involving representations of the phenomena to be explained. What is explained in the first instance is a particular conceptualization of the explanandum phenomenon, contextualized within a given research program or explanatory project. I conclude that this eikonic conception has a number of benefits, including making better sense of scientific practice and allowing for the full range of normative constraints on explanation.
We clarify Bohr’s interpretation of quantum mechanics by demonstrating the central role played by his thesis that quantum theory is a rational generalization of classical mechanics. This thesis is essential for an adequate understanding of his insistence on the indispensability of classical concepts, his account of how the quantum formalism gets its meaning, and his belief that hidden variable interpretations are impossible.
Model-data symbiosis is the view that there is an interdependent and mutually beneficial relationship between data and models, whereby models are not only data-laden, but data are also model-laden or model-filtered. In this paper I elaborate and defend the second, more controversial, component of the symbiosis view. In particular, I construct a preliminary taxonomy of the different ways in which theoretical and simulation models are used in the production of data sets. These include data conversion, data correction, data interpolation, data scaling, data fusion, data assimilation, and synthetic data. Each is defined and briefly illustrated with an example from the geosciences. I argue that model-filtered data are typically more accurate and reliable than the so-called raw data, and hence beneficially serve the epistemic aims of science. By illuminating the methods by which raw data are turned into scientifically useful data sets, this taxonomy provides a foundation for developing a more adequate philosophy of data.
Simulations using idealized numerical models can often generate behaviors or patterns that are visually very similar to the natural phenomenon being investigated and to be explained. The question arises, when should these model simulations be taken to provide an explanation for why the natural phenomena exhibit the patterns that they do? An important distinction for answering this question is that between ‘how-possibly’ explanations and ‘how-actually’ explanations. Despite the importance of this distinction there has been surprisingly little agreement over how exactly this distinction should be drawn. I shall argue that inadequate attention has been paid to the different contexts in which an explanation can be given and the different levels of abstraction at which the explanandum phenomenon can be framed. By tracing how scientists are using model simulations to explain a striking periodic banding of vegetation known as tiger bush, I will show how our understanding of the distinction between how-possibly and how-actually model explanations needs to be revised.
In the spirit of explanatory pluralism, this chapter argues that causal and noncausal explanations of a phenomenon are compatible, each being useful for bringing out different sorts of insights. After reviewing a model-based account of scientific explanation, which can accommodate causal and noncausal explanations alike, an important core conception of noncausal explanation is identified. This noncausal form of model-based explanation is illustrated using the example of how Earth scientists in a subfield known as aeolian geomorphology are explaining the formation of regularly spaced sand ripples. The chapter concludes that even when it comes to everyday "medium-sized dry goods" such as sand ripples, where there is a complete causal story to be told, one can find examples of noncausal scientific explanations.
Despite an enormous philosophical literature on models in science, surprisingly little has been written about data models and how they are constructed. In this paper, I examine the case of how paleodiversity data models are constructed from the fossil data. In particular, I show how paleontologists are using various model-based techniques to correct the data. Drawing on this research, I argue for the following related theses: First, the 'purity' of a data model is not a measure of its epistemic reliability. Instead it is the fidelity of the data that matters. Second, the fidelity of a data model in capturing the signal of interest is a matter of degree. Third, the fidelity of a data model can be improved 'vicariously', such as through the use of post hoc model-based correction techniques. And, fourth, data models, like theoretical models, should be assessed as adequate (or inadequate) for particular purposes.
Detailed examinations of scientific practice have revealed that the use of idealized models in the sciences is pervasive. These models play a central role in not only the investigation and prediction of phenomena, but in their received scientific explanations as well. This has led philosophers of science to begin revising the traditional philosophical accounts of scientific explanation in order to make sense of this practice. These new model-based accounts of scientific explanation, however, raise a number of key questions: Can the fictions and falsehoods inherent in the modeling practice do real explanatory work? Do some highly abstract and mathematical models exhibit a noncausal form of scientific explanation? How can one distinguish an exploratory "how-possibly" model explanation from a genuine "how-actually" model explanation? Do modelers face tradeoffs such that a model that is optimized for yielding explanatory insight, for example, might fail to be the most predictively accurate, and vice versa? This chapter explores the various answers that have been given to these questions.
It has frequently been suggested that quantum mechanics may provide a genuine case of ontic vagueness or metaphysical indeterminacy. However, discussions of quantum theory in the vagueness literature are often cursory and, as I shall argue, have in some respects been misguided. Hitherto much of the debate over ontic vagueness and quantum theory has centered on the “indeterminate identity” construal of ontic vagueness, and whether the quantum phenomenon of entanglement produces particles whose identity is indeterminate. I argue that this way of framing the debate is mistaken. A more thorough examination of quantum theory and the phenomenon of entanglement reveals that quantum mechanics is best interpreted as supporting what I call the “vague property” construal of ontic vagueness, where vague properties are understood in terms of determinable properties without the corresponding determinates.
This paper describes a long-standing, though little-known, debate between Paul Dirac and Werner Heisenberg over the nature of scientific methodology, theory change, and intertheoretic relations. Following Heisenberg’s terminology, their disagreements can be summarized as a debate over whether the classical and quantum theories are “open” or “closed.” A close examination of this debate sheds new light on the philosophical views of two of the great founders of quantum theory.
The fact that the same equations or mathematical models reappear in the descriptions of what are otherwise disparate physical systems can be seen as yet another manifestation of Wigner's “unreasonable effectiveness of mathematics.” James Clerk Maxwell famously exploited such formal similarities in what he called the “method of physical analogy.” Both Maxwell and Hermann von Helmholtz appealed to the physical analogies between electromagnetism and hydrodynamics in their development of these theories. I argue that a closer historical examination of the different ways in which Maxwell and Helmholtz each deployed this analogy gives further insight into debates about the representational and explanatory power of mathematical models.
An examination of two thought experiments in contemporary physics reveals that the same thought experiment can be reanalyzed from the perspective of different and incompatible theories. This fact undermines those accounts of thought experiments that claim their justificatory power comes from their ability to reveal the laws of nature. While thought experiments do play a genuine evaluative role in science, they do so by testing the nonempirical virtues of a theory, such as consistency and explanatory power. I conclude that, while their interpretation presupposes a whole set of background theories and putative laws, thought experiments nonetheless can evolve and be retooled for different theories and ends.
Hempel's Dilemma is the claim that physicalism is an ill-formed thesis because it can offer no account of the physics that it refers to: current physics will be discarded in the future, and we don't yet know the nature of future physics. This article confronts the first horn of the dilemma, and argues that our knowledge of current physics is sufficient for offering a physicalist ontology of the mind. We have good scientific evidence that future physics will be irrelevant to the mind-body problem because mental processes lie safely within the well-understood domains of applicability of current physical theories.
The geosciences include a wide spectrum of disciplines ranging from paleontology to climate science, and involve studies of a vast range of spatial and temporal scales, from the deep-time history of microbial life to the future of a system no less immense and complex than the entire Earth. Modeling is thus a central and indispensable tool across the geosciences. Here, we review both the history and current state of model-based inquiry in the geosciences. Research in these fields makes use of a wide variety of models, such as conceptual, physical, and numerical models, and more specifically cellular automata, artificial neural networks, agent-based models, coupled models, and hierarchical models. We note the increasing demands to incorporate biological and social systems into geoscience modeling, challenging the traditional boundaries of these fields. Understanding and articulating the many different sources of scientific uncertainty – and finding tools and methods to address them – has been at the forefront of most research in geoscience modeling. We discuss not only structural model uncertainties, parameter uncertainties, and solution uncertainties, but also the diverse sources of uncertainty arising from the complex nature of geoscience systems themselves. Without an examination of the geosciences, our philosophies of science and our understanding of the nature of model-based science are incomplete.
At the center of quantum chaos research is a particular family of models known as quantum maps. These maps illustrate an important “horizontal” dimension to model construction that has been overlooked in the literature on models. Three ways in which quantum maps are being used to clarify the relationship between classical and quantum mechanics are examined. This study suggests that horizontal models may provide a new and fruitful framework for exploring intertheoretic relations.
The 2006 decision by the International Astronomical Union to strip Pluto of its status as a planet generated considerable uproar not only in scientific circles, but among the lay public as well. After all, how can a vote by 424 scientists in a conference room in Prague undermine what every well-educated second grader knows is a scientific fact? The Pluto controversy provides a new and fertile ground in which to revisit the traditional philosophical problems of natural kinds and scientific change. Before engaging these philosophical problems, however, there are two misguided reactions to the Pluto controversy worth dispelling from the start. The first misguided reaction is that this sort of classificatory ...
Although predictive power and explanatory insight are both desiderata of scientific models, these features are often in tension with each other and cannot be simultaneously maximized. In such situations, scientists may adopt what I term a ‘division of cognitive labor’ among models, using different models for the purposes of explanation and prediction, respectively, even for the exact same phenomenon being investigated. Adopting this strategy raises a number of issues, however, which have received inadequate philosophical attention. More specifically, while one implication may be that it is inappropriate to judge explanatory models by the same standards of quantitative accuracy as predictive models, there still needs to be some way of either confirming or rejecting these model explanations. Here I argue that robustness analyses have a central role to play in testing highly idealized explanatory models. I illustrate these points with two examples of explanatory models from the field of geomorphology.
"Entanglement can be understood as an extraordinary degree of correlation between states of quantum systems - a correlation that cannot be given an explanation ...
Niels Bohr famously argued that a consistent understanding of quantum mechanics requires a new epistemic framework, which he named complementarity. This position asserts that even in the context of quantum theory, classical concepts must be used to understand and communicate measurement results. The apparent conflict between certain classical descriptions is avoided by recognizing that their application now crucially depends on the measurement context.

Recently it has been argued that a new form of complementarity can provide a solution to the so-called information loss paradox. Stephen Hawking argues that the evolution of black holes cannot be described by standard unitary quantum evolution, because such evolution always preserves information, while the evaporation of a black hole will imply that any information that fell into it is irrevocably lost—hence a "paradox." Some researchers in quantum gravity have argued that this paradox can be resolved if one interprets certain seemingly incompatible descriptions of events around black holes as instead being complementary.

In this dissertation I assess the extent to which this black hole complementarity can be undergirded by Bohr's account of the limitations of classical concepts. I begin by offering an interpretation of Bohr's complementarity and the role that it plays in his philosophy of quantum theory. After clarifying the nature of classical concepts, I offer an account of the limitations these concepts face, and argue that Bohr's appeal to disturbance is best understood as referring to these conceptual limits.

Following preparatory chapters on issues in quantum field theory and black hole mechanics, I offer an analysis of the information loss paradox and various responses to it. I consider the three most prominent accounts of black hole complementarity and argue that they fail to offer sufficient justification for the proposed incompatibility between descriptions.

The lesson that emerges from this dissertation is that we have as much to learn from the limitations facing our scientific descriptions as we do from the successes they enjoy. Because all of our scientific theories offer at best limited, effective accounts of the world, an important part of our interpretive efforts will be assessing the borders of these domains of description.
In 1962, the publication of Thomas Kuhn’s Structure ‘revolutionized’ the way one conducts philosophical and historical studies of science. Through the introduction of both memorable and controversial notions, such as paradigms, scientific revolutions, and incommensurability, Kuhn argued against the traditionally accepted notion of scientific change as a progression towards the truth about nature, substituting instead the idea that science is a puzzle-solving activity operating under paradigms, which are discarded when they fail to respond adequately to anomalous challenges and a rival paradigm emerges. Kuhn’s Structure has sold over 1.4 million copies and the Times Literary Supplement named it one of the “Hundred Most Influential Books since the Second World War.” Now, fifty years after this groundbreaking work was published, this volume offers a timely reappraisal of the legacy of Kuhn’s book and an investigation into what Structure offers philosophical, historical, and sociological studies of science in the future.
Although Dirac rarely participated in the interpretational debates over quantum theory, it is traditionally assumed that his views were aligned with Heisenberg and Bohr in the so-called Copenhagen-Göttingen camp. However, an unpublished—and apparently unknown—lecture of Dirac's reveals that this view is mistaken; in the famous debate between Einstein and Bohr, Dirac sided with Einstein. Surprisingly, Dirac believed that quantum mechanics was not complete, that the uncertainty principle would not survive in the future physics, and that a deterministic description of the microworld would be recovered. In this paper I show how we can make sense of this unpublished lecture in the context of Dirac's broader philosophy of quantum mechanics, and how our present understanding of Dirac's philosophical views must be revised.
This article addresses the question whether supertasks are possible within the context of non-relativistic quantum mechanics. The supertask under consideration consists of performing an infinite number of quantum mechanical measurements in a finite amount of time. Recent arguments in the physics literature claim to show that continuous measurements, understood as N discrete measurements in the limit where N goes to infinity, are impossible. I show that there are certain kinds of measurements in quantum mechanics for which these arguments break down. This suggests that there is a new context in which quantum mechanics, in principle, permits the performance of a supertask.
Belot, Earman, and Ruetsche dismiss the black hole remnant proposal as an inadequate response to the Hawking information loss paradox. I argue that their criticisms are misplaced and that, properly understood, remnants do offer a substantial reply to the argument against the possibility of unitary evolution in spacetimes that contain evaporating black holes. The key to understanding these proposals lies in recognizing that the question of where and how our current theories break down is at the heart of these debates in quantum gravity. I also argue that the controversial nature of assessing the limits of general relativity and quantum field theory illustrates the significance of attempts to establish the proper borders of our effective theories.
Niels Bohr’s “correspondence principle” is typically believed to be the requirement that in the limit of large quantum numbers (n→∞) there is a statistical agreement between the quantum and classical frequencies. A closer reading of Bohr’s writings on the correspondence principle, however, reveals that this interpretation is mistaken. Specifically, Bohr makes the following three puzzling claims: First, he claims that the correspondence principle applies to small quantum numbers as well as large (while the statistical agreement of frequencies is only for large n); second, he claims that the correspondence principle is a law of quantum theory; and third, Bohr argues that the formal apparatus of matrix mechanics (the new quantum theory) can be thought of as a precise formulation of the correspondence principle. With further textual evidence, I offer an alternative interpretation of the correspondence principle in terms of what I call Bohr’s selection rule. I conclude by showing how this new interpretation of the correspondence principle readily makes sense of Bohr’s three puzzling claims.
A proper understanding of black hole complementarity as a response to the information loss paradox requires recognizing the essential role played by arguments for the applicability and limitations of effective semiclassical theories. I argue that this perspective sheds important light on the arguments advanced by Susskind, Thorlacius, and Uglum—although ultimately I argue that their position is unsatisfactory. I also consider the argument offered by ’t Hooft for the breakdown of microcausality around black holes, and conclude that it relies on a mistaken treatment of measurement collapse. There is, however, a legitimate argumentative role for black hole complementarity, exemplified by the position of Kiem, Verlinde, and Verlinde, that calls for a more subtle analysis of the limitations facing our effective theories.
While everyone knows of Einstein’s brilliant work on relativity theory and many know of his later opposition to quantum theory as immortalized in his remark “He [God] does not play dice,” few outside of limited academic circles know of Einstein’s many seminal contributions to the development of quantum theory. In this highly accessible and enjoyable popular science book, Douglas Stone seeks to revise our popular conception of Einstein and bring the story of his profound and revolutionary insights into quantum theory to a broader audience.
1. Petkov assumes that the standard relativistic interpretations of measurement procedures are to be respected, but this is precisely what 3D-er (the 3-dimensionalist) will deny. Petkov’s apparent contradictions are due to the fact that he considers an inconsistent mixture of 3D ontology and the standard interpretation of special relativity.
We critically engage two traditional views of scientific data and outline a novel philosophical view that we call the pragmatic-representational (PR) view of data. On the PR view, data are representations that are the product of a process of inquiry, and they should be evaluated in terms of their adequacy or fitness for particular purposes. Some important implications of the PR view for data assessment, related to misrepresentation, context-sensitivity, and complementary use, are highlighted. The PR view provides insight into the common but little-discussed practices of iteratively reusing and repurposing data, which result in many datasets' having a phylogeny—an origin and complex evolutionary history—that is relevant to their evaluation and future use. We relate these insights to the open-data and data-rescue movements, and highlight several future avenues of research that build on the PR view of data.
Presentations of black hole complementarity by van Dongen and de Haro, as well as by 't Hooft, suffer from a mistaken claim that interactions between matter falling into a black hole and the emitted Hawking-like radiation should lead to a failure of commutativity between space-like-related observables localized inside and outside the black hole. I show that this conclusion is not supported by our standard understanding of quantum interactions. We have no reason to believe that near-horizon interactions will threaten microcausality. If these interactions reliably transfer information to the outgoing radiation, then this response to Hawking's information loss argument should amount to a version of the bleaching scenario. I argue that the challenge facing black hole complementarity is that of reconciling this commitment to bleaching with the expectation that the event horizon will be locally unremarkable. This challenge is most promisingly met by proposals that postulate a consistent account of the limitations of our local semi-classical theories, but no support is added to these postulates by appeals to verificationism or to the interactions considered by 't Hooft.
Traditionally Ψ is used to stand in for both the mathematical wavefunction (the representation) and the quantum state (the thing in the world). This elision has been elevated to a metaphysical thesis by advocates of the view known as wavefunction realism. My aim in this paper is to challenge the hegemony of the wavefunction by calling attention to a little-known formulation of quantum theory that does not make use of the wavefunction in representing the quantum state. This approach, called Lagrangian quantum hydrodynamics (LQH), is not an approximation scheme, but rather a full alternative formulation of quantum theory. I argue that a careful consideration of alternative formalisms is an essential part of any realist project that attempts to read the ontology of a theory off of the mathematical formalism. In particular, I show that LQH undercuts the central presumption of wavefunction realism and falsifies the claim that one must represent the many-body quantum state as living in a 3n-dimensional configuration space. I conclude by briefly sketching three different realist approaches one could take toward LQH, and argue that both models of the quantum state should be admitted. When exploring quantum realism, regaining sight of the proverbial forest of quantum representations beyond the Ψ is just the first step.
At the intersection of taxonomy and nomenclature lies the scientific practice of typification. This practice occurs in biology with the use of holotypes (type specimens), in geology with the use of stratotypes, and in metrology with the use of measurement prototypes. In this paper I develop the first general definition of a scientific type and outline a new philosophical theory of types inspired by Pierre Duhem. I use this general framework to resolve the necessity-contingency debate about type specimens in philosophy of biology, to advance the debate over the myth of the absolute accuracy of standards in metrology, and to address the definition-correlation debate in geology. I conclude that just as there has been a productive synergy between philosophical accounts of natural kinds and scientific taxonomic practices, so too there is much to be gained from developing a deeper understanding of the practices and philosophy of scientific types.