Most scientific models are not physical objects, and this raises important questions. What sort of entities are models, what is truth in a model, and how do we learn about models? In this paper I argue that models share important aspects in common with literary fiction, and that theories of fiction can therefore be brought to bear on these questions. In particular, I argue that the pretence theory as developed by Walton has the resources to answer these questions. I introduce this account, outline the answers that it offers, and develop a general picture of scientific modelling based on it.
We reconsider the Nagelian theory of reduction and argue that, contrary to a widely held view, it is the right analysis of intertheoretic reduction. The alleged difficulties of the theory either vanish upon closer inspection or turn out to be substantive philosophical questions rather than knock-down arguments.
GRW theory postulates a stochastic mechanism assuring that every so often the wave function of a quantum system is 'hit', which leaves it in a localised state. How are we to interpret the probabilities built into this mechanism? GRW theory is a firmly realist proposal, and it is therefore clear that these probabilities are objective probabilities (i.e. chances). A discussion of the major theories of chance leads us to the conclusion that GRW probabilities can be understood only as either single-case propensities or Humean objective chances. Although single-case propensities have some intuitive appeal in the context of GRW theory, on balance Humean objective chances seem preferable on conceptual grounds, because single-case propensities suffer from various well-known problems such as unlimited frequency tolerance and the lack of a rationalisation of the Principal Principle.
It is now part and parcel of the official philosophical wisdom that models are essential to the acquisition and organisation of scientific knowledge. It is also generally accepted that most models represent their target systems in one way or another. But what does it mean for a model to represent its target system? I begin by introducing three conundrums that a theory of scientific representation has to come to terms with and then address the question of whether the semantic view of theories, which is currently the most widely accepted account of theories and models, provides us with adequate answers to these questions. After having argued in some detail that it does not, I conclude by pointing out in what direction a tenable account of scientific representation might be sought.
Computer simulations are an exciting tool that plays important roles in many scientific disciplines. This has attracted the attention of a number of philosophers of science. The main tenor in this literature is that computer simulations not only constitute interesting and powerful new science, but that they also raise a host of new philosophical issues. The protagonists in this debate claim no less than that simulations call into question our philosophical understanding of scientific ontology, the epistemology and semantics of models and theories, and the relation between experimentation and theorising, and submit that simulations demand a fundamentally new philosophy of science in many respects. The aim of this paper is to critically evaluate these claims. Our conclusion will be sober. We argue that these claims are overblown and that simulations, far from demanding a new metaphysics, epistemology, semantics and methodology, raise few if any new philosophical problems. The philosophical problems that do come up in connection with simulations are not specific to simulations, and most of them are variants of problems that have been discussed in other contexts before.
Classical statistical mechanics posits probabilities for various events to occur, and these probabilities seem to be objective chances. This does not seem to sit well with the fact that the theory’s time evolution is deterministic. We argue that the tension between the two is only apparent. We present a theory of Humean objective chance and show that chances thus understood are compatible with underlying determinism and provide an interpretation of the probabilities we find in Boltzmannian statistical mechanics.
Models are of central importance in many scientific contexts. The roles that models such as the billiard ball model of a gas, the Bohr model of the atom, the MIT bag model of the nucleon, the Gaussian-chain model of a polymer, the Lorenz model of the atmosphere, the Lotka-Volterra model of predator-prey interaction, the double helix model of DNA, agent-based and evolutionary models in the social sciences, or general equilibrium models of markets play in their respective domains are cases in point. Scientists spend a great deal of time building, testing, comparing and revising models, and much journal space is dedicated to introducing, applying and interpreting these valuable tools. In short, models are one of the principal instruments of modern science.
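To make one of these examples concrete, here is a minimal sketch (not from the text; parameter values are arbitrary illustrative choices) of the Lotka-Volterra predator-prey model, integrated with a simple Euler scheme:

```python
# Illustrative sketch: Euler integration of the Lotka-Volterra model.
# All parameter values below are arbitrary choices for illustration.

def lotka_volterra(prey, pred, alpha=1.0, beta=0.1, delta=0.075, gamma=1.5,
                   dt=0.001, steps=20_000):
    """Integrate dx/dt = alpha*x - beta*x*y, dy/dt = delta*x*y - gamma*y."""
    xs, ys = [prey], [pred]
    for _ in range(steps):
        x, y = xs[-1], ys[-1]
        xs.append(x + dt * (alpha * x - beta * x * y))
        ys.append(y + dt * (delta * x * y - gamma * y))
    return xs, ys

xs, ys = lotka_volterra(prey=10.0, pred=5.0)
# Both populations remain positive and oscillate rather than settling down.
print(min(xs) > 0 and min(ys) > 0)
```

The cyclical rise and fall of the two populations is the qualitative behaviour the model is famous for; a finer integrator would be needed for quantitative work.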
In this paper we explore the constraints that our preferred account of scientific representation places on the ontology of scientific models. Pace the Direct Representation view associated with Arnon Levy and Adam Toon we argue that scientific models should be thought of as imagined systems, and clarify the relationship between imagination and representation.
Various processes are often classified as both deterministic and random or chaotic. The main difficulty in analysing the randomness of such processes is the apparent tension between the notions of randomness and determinism: what type of randomness could exist in a deterministic process? Ergodic theory seems to offer a particularly promising theoretical tool for tackling this problem by positing a hierarchy, the so-called 'ergodic hierarchy' (EH), which is commonly assumed to provide a hierarchy of increasing degrees of randomness. However, that notion of randomness requires clarification. The mathematical definition of EH does not make explicit appeal to randomness; nor does the usual way of presenting EH involve a specification of the notion of randomness that is supposed to underlie the hierarchy. In this paper we argue that EH is best understood as a hierarchy of random behaviour if randomness is explicated in terms of unpredictability. We then show that, contrary to common wisdom, EH is useful in characterising the behaviour of Hamiltonian dynamical systems.
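The lowest level of the hierarchy, ergodicity, can be illustrated with a toy example (ours, not the paper's): for an ergodic map, the time average of an observable along almost any trajectory equals its space average. The irrational rotation of the circle is a standard case:

```python
import math

# Toy illustration of ergodicity: the rotation x -> (x + a) mod 1 with
# irrational a is ergodic, so the time average of f(x) = x along a
# trajectory approaches the space average 1/2.

def time_average(x0, a, n):
    x, total = x0, 0.0
    for _ in range(n):
        total += x
        x = (x + a) % 1.0
    return total / n

avg = time_average(x0=0.2, a=(math.sqrt(5) - 1) / 2, n=100_000)
print(abs(avg - 0.5) < 0.01)  # time average matches space average
```

Higher rungs of the hierarchy (mixing, Kolmogorov, Bernoulli) impose progressively stronger conditions on how trajectories forget their past; the rotation above satisfies none of them, which is one way of seeing that ergodicity alone is a weak notion of randomness.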
Everything you always wanted to know about structural realism but were afraid to ask. Roman Frigg (Department of Philosophy, Logic and Scientific Method, London School of Economics and Political Science) and Ioannis Votsis (Philosophisches Institut, Heinrich-Heine-Universität Düsseldorf). European Journal for Philosophy of Science 1(2): 227-276. DOI: 10.1007/s13194-011-0025-7.
Scientific discourse is rife with passages that appear to be ordinary descriptions of systems of interest in a particular discipline. Equally, the pages of textbooks and journals are filled with discussions of the properties and the behavior of those systems. Students of mechanics investigate at length the dynamical properties of a system consisting of two or three spinning spheres with homogeneous mass distributions gravitationally interacting only with each other. Population biologists study the evolution of one species procreating at a constant rate in an isolated ecosystem. And when studying the exchange of goods, economists consider a situation in which there are only two goods, two perfectly rational agents, no restrictions on available information, no transaction costs, no money, and dealings are done immediately. Their surface structure notwithstanding, no competent scientist would mistake descriptions of such systems for descriptions of an actual system: we know very well that there are no such systems. These descriptions are descriptions of a model-system, and scientists use model-systems to represent parts or aspects of the world they are interested in. Following common practice, I refer to those parts or aspects as target-systems. What are we to make of this? Is discourse about such models merely a picturesque and ultimately dispensable façon de parler? This was the view of some early twentieth-century philosophers. Duhem (1906) famously guarded against confusing model building with scientific theorizing and argued that model building has no real place in science beyond a minor heuristic role. The aim of science was, instead, to construct theories, with theories understood as classificatory or representative structures systematically presented and formulated in precise symbolic…
The United Kingdom Climate Impacts Program's UKCP09 project makes high-resolution forecasts of climate during the 21st century using state-of-the-art global climate models. The aim of this paper is to introduce and analyze the methodology used and then urge some caution. Given the acknowledged systematic errors in all current climate models, treating model outputs as decision-relevant probabilistic forecasts can be seriously misleading. This casts doubt on our ability, today, to make trustworthy, high-resolution predictions out to the end of this century.
Gases reach equilibrium when left to themselves. Why do they behave in this way? The canonical answer to this question, originally proffered by Boltzmann, is that the systems have to be ergodic. This answer has been criticised on different grounds and is now widely regarded as flawed. In this paper we argue that some of the main arguments against Boltzmann's answer, in particular, arguments based on the KAM-theorem and the Markus-Meyer theorem, are beside the point. We then argue that something close to Boltzmann's original proposal is true for gases: gases behave thermodynamic-like if they are epsilon-ergodic, i.e., ergodic on the entire accessible phase space except for a small region of measure epsilon. This answer is promising because there are good reasons to believe that relevant systems in statistical mechanics are epsilon-ergodic.
The United Kingdom Climate Impacts Programme's UKCP09 project makes high-resolution projections of the climate out to 2100 by post-processing the outputs of a large-scale global climate model. The aim of this paper is to describe and analyse the methodology used and then urge some caution. Given the acknowledged systematic, shared shortcomings in all current climate models, treating model outputs as decision-relevant projections can be significantly misleading. In extrapolatory situations, such as projections of future climate change impacts, there is little reason to expect that post-processing of model outputs can correct for the consequences of such errors. This casts doubt on our ability, today, to make trustworthy, high-resolution probabilistic projections out to the end of this century.
On the face of it 'deterministic chance' is an oxymoron: either an event is chancy or deterministic, but not both. Nevertheless, the world is rife with events that seem to be exactly that: chancy and deterministic at once. Simple gambling devices like coins and dice are cases in point. On the one hand they are governed by deterministic laws -- the laws of classical mechanics -- and hence given the initial condition of, say, a coin toss it is determined whether it will land heads or tails. On the other hand, we commonly assign probabilities to the different outcomes of a coin toss, and doing so has proven successful in guiding our actions. The same dilemma also emerges in less mundane contexts. Classical statistical mechanics (which is still an important part of modern physics) assigns probabilities to the occurrence of certain events -- for instance to the spreading of a gas that is originally confined to the left half of a container -- but at the same time assumes that the relevant systems are deterministic. How can this apparent conflict be resolved?
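The coexistence of determinism and stable frequencies can be caricatured in a few lines (our toy example, not the paper's argument): the logistic map is fully deterministic, yet a typical trajectory spends about half its time below 1/2, behaving statistically like a fair coin.

```python
# Toy "deterministic coin": the logistic map x -> 4x(1-x) is deterministic,
# but a typical orbit yields a stable relative frequency of ~1/2 for the
# event x < 1/2 -- the kind of frequency that invites a chance reading.

def heads_frequency(x0, n):
    x, heads = x0, 0
    for _ in range(n):
        if x < 0.5:
            heads += 1
        x = 4.0 * x * (1.0 - x)  # deterministic update rule
    return heads / n

freq = heads_frequency(x0=0.123, n=100_000)
print(abs(freq - 0.5) < 0.05)  # the frequency stabilises near 1/2
```

Nothing chancy happens in the code: the "randomness" lives entirely in the statistics of the deterministic orbit, which is precisely the puzzle the paper addresses.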
The sensitive dependence on initial conditions (SDIC) associated with nonlinear models imposes limitations on the models' predictive power. We draw attention to an additional limitation that has been under-appreciated, namely structural model error (SME). A model has SME if the model-dynamics differ from the dynamics in the target system. If a nonlinear model has even the slightest SME, then its ability to generate decision-relevant predictions is compromised. Given a perfect model, we can take the effects of SDIC into account by substituting probabilistic predictions for point predictions. This route is foreclosed in the case of SME.
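A minimal sketch of both limitations (our illustration; the paper itself does not use this example): the logistic map exhibits SDIC, and perturbing its parameter mimics a structural model error.

```python
# SDIC: two trajectories of the logistic map starting 1e-10 apart
# diverge to order-one separation within a few dozen iterations.

def trajectory(x0, n, r=4.0):
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = trajectory(0.3, 60)
b = trajectory(0.3 + 1e-10, 60)
gaps = [abs(x - y) for x, y in zip(a, b)]
print(gaps[0] < 1e-9 and max(gaps) > 0.1)

# SME analogue: same initial condition, slightly perturbed dynamics
# (r = 3.999 instead of 4.0) -- the trajectories also diverge.
model = trajectory(0.3, 60, r=3.999)
truth = trajectory(0.3, 60, r=4.0)
print(max(abs(x - y) for x, y in zip(model, truth)) > 0.1)
```

The first divergence can in principle be handled by ensembles over initial conditions; the second cannot, since no ensemble of initial conditions corrects for running the wrong dynamics.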
Boltzmannian statistical mechanics partitions the phase space of a system into macro-regions, and the largest of these is identified with equilibrium. What justifies this identification? Common answers focus on Boltzmann's combinatorial argument, the Maxwell-Boltzmann distribution, and maximum entropy considerations. We argue that they fail and present a new answer. We characterise equilibrium as the macrostate in which a system spends most of its time and prove a new theorem establishing that equilibrium thus defined corresponds to the largest macro-region. Our derivation is completely general in that it does not rely on assumptions about a system's dynamics or internal interactions.
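The combinatorial intuition behind the largest macro-region can be sketched with a toy system (ours, not the theorem of the paper): N two-state particles, with the macrostate given by the number k of particles in state 1, so the macro-region for k contains C(N, k) microstates.

```python
import math

# Toy macro-region counting for N two-state particles: the macrostate
# k (number of particles in state 1) has C(N, k) microstates.
N = 100
sizes = {k: math.comb(N, k) for k in range(N + 1)}
largest = max(sizes, key=sizes.get)                     # biggest macro-region
near_eq = sum(sizes[k] for k in range(40, 61)) / 2**N   # weight near k = N/2

print(largest)         # the "half-and-half" macrostate k = 50
print(near_eq > 0.95)  # near-equilibrium states dominate the state space
```

This counting argument alone, however, says nothing about how long the system spends in each region, which is why a dynamics-free characterisation of equilibrium requires the kind of theorem the abstract describes.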
A gas prepared in a non-equilibrium state will approach equilibrium and stay there. An influential contemporary approach to statistical mechanics explains this behaviour in terms of typicality. However, this explanation has been criticised as mysterious as long as no connection with the dynamics of the system is established. We take this criticism as our point of departure. Our central claim is that Hamiltonians of gases which are epsilon-ergodic are typical with respect to the Whitney topology. Because equilibrium states are typical, the desired conclusion follows, we argue: typical initial conditions approach equilibrium and stay there.
In Boltzmannian statistical mechanics macro-states supervene on micro-states. This leads to a partitioning of the state space of a system into regions of macroscopically indistinguishable micro-states. The largest of these regions is singled out as the equilibrium region of the system. What justifies this association? We review currently available answers to this question and find them wanting both for conceptual and for technical reasons. We propose a new conception of equilibrium and prove a mathematical theorem which establishes in full generality -- i.e. without making any assumptions about the system's dynamics or the nature of the interactions between its components -- that the equilibrium macro-region is the largest macro-region. We then turn to the question of the approach to equilibrium, for which no satisfactory general answer exists so far. In our account, this question is replaced by the question of when an equilibrium state exists. We prove another -- again fully general -- theorem providing necessary and sufficient conditions for the existence of an equilibrium state. This theorem changes the way in which the question of the approach to equilibrium should be discussed: rather than launching a search for a crucial factor, the focus should be on finding triplets of macro-variables, dynamical conditions, and effective state spaces that satisfy the conditions of the theorem.
Consider a gas that is adiabatically isolated from its environment and confined to the left half of a container. Then remove the wall separating the two parts. The gas will immediately start spreading and soon be evenly distributed over the entire available space. The gas has approached equilibrium. Thermodynamics (TD) characterizes this process in terms of an increase of thermodynamic entropy, which attains its maximum value at equilibrium. The second law of thermodynamics captures the irreversibility of this process by positing that in an isolated system such as the gas entropy cannot decrease. The aim of statistical mechanics (SM) is to explain the behavior of the gas and, in particular, its conformity with the second law, in terms of the dynamical laws governing the individual molecules of which the gas is made up. In what follows these laws are assumed to be those of Hamiltonian classical mechanics. We should not, however, ask for an explanation of the second law literally construed. This law is a universal law and as such cannot be explained by a statistical theory. But this is not a problem because we…
In two recent papers Barry Loewer (2001, 2004) has suggested interpreting probabilities in statistical mechanics as Humean chances in David Lewis' (1994) sense. I first give a precise formulation of this proposal, then raise two fundamental objections, and finally conclude that these can be overcome only at the price of interpreting these probabilities epistemically.
Climate change adaptation is largely a local matter, and adaptation planning can benefit from local climate change projections. Such projections are typically generated by accepting climate model outputs in a relatively uncritical way. We argue, based on the IPCC’s treatment of model outputs from the CMIP5 ensemble, that this approach is unwarranted and that subjective expert judgment should play a central role in the provision of local climate change projections intended to support decision-making.
Science provides us with representations of atoms, elementary particles, polymers, populations, genetic trees, economies, rational decisions, aeroplanes, earthquakes, forest fires, irrigation systems, and the world's climate. It's through these representations that we learn about the world. This entry explores various different accounts of scientific representation, with a particular focus on how scientific models represent their target systems. As philosophers of science are increasingly acknowledging the importance, if not the primacy, of scientific models as representational units of science, it's important to stress that how they represent plays a fundamental role in how we are to answer other questions in the philosophy of science. This entry begins by disentangling 'the' problem of scientific representation, before critically evaluating the current options available in the literature.
The last decade and a half has seen an ardent development of self-organised criticality (SOC), a new approach to complex systems which has become important in many domains of natural as well as social science, such as geology, biology, astronomy, and economics, to mention just a few. This has led many to adopt a generalist stance towards SOC, which is now repeatedly claimed to be a universal theory of complex behaviour. The aim of this paper is twofold. First, I provide a brief and non-technical introduction to SOC. Second, I critically discuss the various bold claims that have been made in connection with it. Throughout, I will adopt a rather sober attitude and argue that some people have been too readily carried away by fancy contentions. My overall conclusion will be that none of these bold claims can be maintained. Nevertheless, stripped of exaggerated expectations and daring assertions, many SOC models are interesting vehicles for promising scientific research. Keywords: self-organised criticality; scaling law; models; formal analogy.
The aim of this article is twofold. Recently, Lewis has presented an argument, now known as the "counting anomaly", that the spontaneous localization approach to quantum mechanics, suggested by Ghirardi, Rimini, and Weber, implies that arithmetic does not apply to ordinary macroscopic objects. I will take this argument as the starting point for a discussion of the property structure of realist collapse interpretations of quantum mechanics in general. At the end of this discussion I present a proof that the composition principle, which holds true in standard quantum mechanics, fails in all realist collapse interpretations. On the basis of this result I reconsider the counting anomaly and show that what lies at the heart of the anomaly is the failure to appreciate the peculiarities of the property structure of such interpretations. Once this flaw is uncovered, the anomaly vanishes.
On an influential account, chaos is explained in terms of random behaviour; and random behaviour in turn is explained in terms of having positive Kolmogorov-Sinai entropy (KSE). Though intuitively plausible, the association of the KSE with random behaviour needs justification, since the definition of the KSE does not make reference to any notion that is connected to randomness. I provide this justification for the case of Hamiltonian systems by proving that the KSE is equivalent to a generalized version of Shannon's communication-theoretic entropy under certain plausible assumptions. I then discuss the consequences of this equivalence for randomness in chaotic dynamical systems.
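A hedged numerical illustration (our sketch; the paper's result concerns Hamiltonian systems in general): the KSE of the logistic map at r = 4 is log 2, i.e. one bit per iteration, and estimating the Shannon entropy rate of a coarse-grained trajectory recovers roughly that value.

```python
from collections import Counter
import math

# Coarse-grain a logistic-map trajectory at x = 1/2 into a binary string,
# then estimate the Shannon entropy rate from empirical block frequencies.
# For r = 4 the KS entropy is log 2, i.e. ~1 bit per symbol.

def symbol_sequence(x0, n):
    x, syms = x0, []
    for _ in range(n):
        syms.append('0' if x < 0.5 else '1')
        x = 4.0 * x * (1.0 - x)
    return ''.join(syms)

def entropy_rate(seq, block=10):
    counts = Counter(seq[i:i + block] for i in range(len(seq) - block + 1))
    total = sum(counts.values())
    h = -sum(c / total * math.log2(c / total) for c in counts.values())
    return h / block  # bits per symbol

rate = entropy_rate(symbol_sequence(0.123, 200_000))
print(0.9 < rate < 1.05)  # close to the 1 bit/iteration KS entropy
```

This is exactly the bridge the paper makes rigorous: the dynamical quantity (KSE) shows up as the communication-theoretic entropy rate of the symbol sequences the system generates.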
Models occupy a central role in the scientific endeavour. Among the many purposes they serve, representation is of great importance. Many models are representations of something else; they stand for, depict, or imitate a selected part of the external world (often referred to as target system, parent system, original, or prototype). Well-known examples include the model of the solar system, the billiard ball model of a gas, the Bohr model of the atom, the Gaussian-chain model of a polymer, the MIT bag model of quark confinement, the Lorenz model of the atmosphere, the Lotka-Volterra model of the predator-prey interaction, or the hydraulic model of an economy, to mention just a few. All these models represent their target systems (or selected parts of them) in one way or another.
This is the second of three parts of an introduction to the philosophy of climate science. This second part, on modelling climate change, discusses climate modelling, the confirmation of climate models, the limits of climate projections, uncertainty, and model ensembles.
Entropy is ubiquitous in physics, and it plays important roles in numerous other disciplines ranging from logic and statistics to biology and economics. However, a closer look reveals a complicated picture: entropy is defined differently in different contexts, and even within the same domain different notions of entropy are at work. Some of these are defined in terms of probabilities, others are not. The aim of this chapter is to arrive at an understanding of some of the most important notions of entropy and to clarify the relations between them. After setting the stage by introducing the thermodynamic entropy, we discuss notions of entropy in information theory, statistical mechanics, dynamical systems theory and fractal geometry.
This volume is a serious attempt to open up the subject of European philosophy of science to real thought, and to provide the structural basis for the interdisciplinary development of its specialist fields, but also to provoke reflection on the idea of 'European philosophy of science'. This effort should foster a contemporaneous reflection on what might be meant by philosophy of science in Europe and European philosophy of science, and how awareness of it could help philosophers interpret and motivate their research through a stronger collective identity. The overarching aim is to set the background for a collaborative project organising, systematising, and ultimately forging an identity for European philosophy of science by creating research structures and developing research networks across Europe to promote its development.
An important contemporary version of Boltzmannian statistical mechanics explains the approach to equilibrium in terms of typicality. The problem with this approach is that it comes in different versions, which are, however, not recognized as such and not clearly distinguished. This article identifies three different versions of typicality-based explanations of thermodynamic-like behavior and evaluates their respective successes. The conclusion is that the first two are unsuccessful because they fail to take the system's dynamics into account. The third, however, is promising. I give a precise formulation of the proposal and present an argument in support of its central contention.
Special issue. With contributions by Anouk Barberousse, Sara Franceschelli and Cyrille Imbert, Robert Batterman, Roman Frigg and Julian Reiss, Axel Gelfert, Till Grüne-Yanoff, Paul Humphreys, James Mattingly and Walter Warwick, Matthew Parker, Wendy Parker, Dirk Schlimm, and Eric Winsberg.
The so-called ergodic hierarchy (EH) is a central part of ergodic theory. It is a hierarchy of properties that dynamical systems can possess. Its five levels are ergodicity, weak mixing, strong mixing, Kolmogorov, and Bernoulli. Although EH is a mathematical theory, its concepts have been widely used in the foundations of statistical physics, accounts of randomness, and discussions about the nature of chaos. We introduce EH and discuss its applications in these fields.
Many scientific models are representations. Building on Goodman and Elgin's notion of representation-as we analyse what this claim involves by providing a general definition of what makes something a scientific model and formulating a novel account of how models represent. We call the result the DEKI account of representation, which offers a complex kind of representation involving an interplay of denotation, exemplification, keying up of properties, and imputation. Throughout we focus on material models, and we illustrate our claims with the Phillips-Newlyn machine. In the conclusion we suggest that, mutatis mutandis, the DEKI account can be carried over to other kinds of models, notably fictional and mathematical models.
Let us begin with a characteristic example. Consider a gas that is confined to the left half of a box. Now we remove the barrier separating the two halves of the box. As a result, the gas quickly disperses, and it continues to do so until it homogeneously fills the entire box. This is illustrated in Figure 1.
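The spreading can be caricatured with a simple simulation (our sketch, not the text's model): particles start in the left half of the box and take small random steps, and the fraction in the left half quickly relaxes to about 1/2.

```python
import random

# Crude caricature of a gas spreading through a box [0, 1]:
# particles start in the left half and take small random steps.
random.seed(0)
positions = [random.uniform(0.0, 0.5) for _ in range(500)]  # left half only

for _ in range(1500):
    for i, x in enumerate(positions):
        x += random.uniform(-0.05, 0.05)      # small random kick
        positions[i] = min(max(x, 0.0), 1.0)  # stay inside the box

left_fraction = sum(x < 0.5 for x in positions) / len(positions)
print(abs(left_fraction - 0.5) < 0.1)  # roughly half the gas on each side
```

Unlike a real gas, this sketch injects randomness by hand; the point of statistical mechanics is to recover the same relaxation from deterministic molecular dynamics.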
Models are of central importance in many scientific contexts. The roles the MIT bag model of the nucleon, the billiard ball model of a gas, the Bohr model of the atom, the Gaussian-chain model of a polymer, the Lorenz model of the atmosphere, the Lotka-Volterra model of predator-prey interaction, agent-based and evolutionary models of social interaction, or general equilibrium models of markets play in their respective domains are cases in point.
In its most common use, the term 'model' refers to a simplified and stylised version of the so-called target system, the part or aspect of the world that we are interested in. For instance, in order to determine the orbit of a planet moving around the sun we model the planet and the sun as perfect homogeneous spheres that gravitationally interact with each other but nothing else in the universe, and then apply Newtonian mechanics to this system, which reveals that the planet moves on an elliptical orbit. Views diverge about what sort of entity such a model is. Those focussing on the formal aspects of models regard them either as equations or set-theoretical structures, while those opposed to such an approach take them to be descriptions or abstract (yet non-mathematical) entities. A further question concerns the relation of models and theories. In some cases models can be derived from theory simply by specifying the relevant determinables in a theory's general equations. But many models cannot be obtained from theory in this straightforward way, and some even involve assumptions that contradict the fundamental theory. The relation of models to their respective target systems is equally complex and fraught with controversy. Two influential proposals take the relation between a model and its target to be isomorphism or similarity, respectively. This, however, has been criticised as too restrictive, as many models do not seem to fit this mould.
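A minimal sketch of the planetary model described here (our illustration, in arbitrary units with GM = 1 and made-up initial conditions): integrating Newtonian gravity for a point planet around a fixed sun yields a bounded, roughly elliptical orbit.

```python
import math

# Planet around a fixed sun under Newtonian gravity, GM = 1 (arbitrary
# units). Symplectic Euler: update velocity from the force, then position.

def orbit(x, y, vx, vy, dt=0.0002, steps=80_000):
    radii = []
    for _ in range(steps):
        r = math.hypot(x, y)
        vx -= x / r**3 * dt  # gravitational acceleration toward the origin
        vy -= y / r**3 * dt
        x += vx * dt
        y += vy * dt
        radii.append(math.hypot(x, y))
    return radii

radii = orbit(1.0, 0.0, 0.0, 0.8)  # sub-circular speed: a bound ellipse
print(0.3 < min(radii) < max(radii) < 1.5)  # stays between peri- and aphelion
```

The distance to the sun oscillates between two fixed values, which is the numerical signature of the elliptical orbit the idealised model predicts.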
…this study is to analyse the biomechanical influence of the clinical data on stability of the ankle joint. A two-dimensional… been studied extensively, little is known how it is influenced by the osseous joint configuration. Based on lateral…
Consider a gas confined to the left half of a container. Then remove the wall separating the two parts. The gas will start spreading and soon be evenly distributed over the entire available space. The gas has approached equilibrium. Why does the gas behave in this way? The canonical answer to this question, originally proffered by Boltzmann, is that the system has to be ergodic for the approach to equilibrium to take place. This answer has been criticised on different grounds and is now widely regarded as flawed. In this paper we argue that these criticisms have dismissed Boltzmann's answer too quickly and that something almost like Boltzmann's answer is true: the approach to equilibrium takes place if the system is epsilon-ergodic, i.e. ergodic on the entire accessible phase space except for a small region of measure epsilon. We introduce epsilon-ergodicity and argue that relevant systems in statistical mechanics are indeed epsilon-ergodic.
This is the first of three parts of an introduction to the philosophy of climate science. This first part, on observing climate change, discusses definitions of climate and climate change, data sets and data models, the detection of climate change, and the attribution of climate change.
The received wisdom in statistical mechanics is that isolated systems, when left to themselves, approach equilibrium. But under what circumstances does an equilibrium state exist and an approach to equilibrium take place? In this paper we address these questions from the vantage point of the long-run fraction of time definition of Boltzmannian equilibrium that we developed in two recent papers. After a short summary of Boltzmannian statistical mechanics and our definition of equilibrium, we state an existence theorem which provides general criteria for the existence of an equilibrium state. We first illustrate how the theorem works with a toy example, which allows us to illustrate the various elements of the theorem in a simple setting. After a look at the ergodic programme, we discuss equilibria in a number of different gas systems: the ideal gas, the dilute gas, the Kac gas, the stadium gas, the mushroom gas and the multi-mushroom gas. In the conclusion we briefly summarise the main points and highlight open questions.
Why do systems prepared in a non-equilibrium state approach, and eventually reach, equilibrium? An important contemporary version of the Boltzmannian approach to statistical mechanics answers this question by an appeal to the notion of typicality. The problem with this approach is that it comes in different versions, which are, however, not recognised as such, much less clearly distinguished, and we often find different arguments pursued side by side. The aim of this paper is to disentangle different versions of typicality-based explanations of thermodynamic behaviour and evaluate their respective success. My conclusion will be that the boldest version fails for technical reasons, while more prudent versions leave unanswered essential questions.