Whereas computer simulations involve no direct physical interaction between the machine they are run on and the physical systems they are used to investigate, they are often used as experiments and yield data about these systems. It is commonly argued that they do so because they are implemented on physical machines. We claim that physicality is not necessary for their representational and predictive capacities and that the explanation of why computer simulations generate desired information about their target system is only to be found in the detailed analysis of their semantic levels. We provide such an analysis and we determine the actual consequences of physical implementation for simulations.
Computer simulations are widely used in current scientific practice, as a tool to obtain information about various phenomena. Scientists accordingly rely on the outputs of computer simulations to make statements about the empirical world. In that sense, simulations seem to enable scientists to acquire empirical knowledge. The aim of this paper is to assess whether computer simulations actually allow for the production of empirical knowledge, and how. It provides an epistemological analysis of present-day empirical science, to which the traditional epistemological categories cannot apply in any simple way. Our strategy consists in acknowledging the complexity of scientific practice, and trying to assess its rationality. Hence, while we are careful not to abstract away from the details of scientific practice, our approach is not strictly descriptive: our goal is to state in what conditions empirical science can rely on computer simulations. In order to do so, we need to adopt a renewed epistemological framework, whose categories would enable us to give a finer-grained, and better-fitted analysis of the rationality of scientific practice.
We propose a philosophical theory of scientific models. Our main claim is that they should be understood as fictions. We support this claim with examples drawn from the history of science, and we propose a typology.
Empirical agreement is often used as an important criterion when assessing the validity of scientific models. However, it is by no means a sufficient criterion, as a model can be so adjusted as to fit available data even though it is based on hypotheses whose plausibility is known to be questionable. Our aim in this paper is to investigate the uses of empirical agreement within the process of model validation.
In 1907 Borel published a remarkable essay on the paradox of the Heap (“Un paradoxe économique: le sophisme du tas de blé et les vérités statistiques”), in which he proposes what is likely the first statistical account of vagueness ever written, and where he discusses the practical implications of the sorites paradox, including in economics. Borel’s paper was integrated in his book Le Hasard, published in 1914, but has gone mostly unnoticed since its publication. One of the originalities of Borel’s essay is that it puts forward a model of vagueness as imprecision, making particular use of the Gaussian law of measurement errors to model categorization. The aim of our paper is to give a presentation of the historical context of Borel’s essay, to spell out the mathematical details of his model, and to provide a critical assessment of his theory. Three aspects of Borel’s account are particularly discussed: the first concerns the comparison between Borel’s statistical account and later degree-theoretic accounts of vagueness. The second concerns the anti-epistemicist flavor of Borel’s approach, whereby the idea of statistical fluctuation is used to undermine the notion of a sharp boundary for vague predicates. The third concerns the problematic link between Borel’s model of vagueness as imprecision and the notion of semantic indeterminacy. An English translation of Borel’s original essay is appended to this paper (Erkenntnis, this issue).
What is a natural kind? This old yet lasting philosophical question has recently received new competing answers. We show that the main ingredients of an encompassing and coherent account of natural kinds are actually on the table, but in need of the right articulation. It is by adopting a non-reductionist, naturalistic and non-conceptualist approach that, in this paper, we elaborate a new synthesis of all these ingredients. Our resulting proposal is a multiple-compartment theory of natural kinds that defines them in purely ontological terms, clearly distinguishes and relates ontological and epistemological issues (more precisely, two grains of ontological descriptions and two grains of explanatory success of natural kinds), and which sheds light on why natural kinds play an epistemic role both within science and in everyday life.
This volume is the best available tool to compare and appraise the different approaches of today’s biology and their conceptual frameworks, serving as a springboard for new research on a clarified conceptual basis. It is expected to constitute a key reference work for biologists and philosophers of biology, as well as for all scientists interested in understanding what is at stake in the present transformations of biological models and theories. The volume is distinguished by including, for the first time, self-reflections and exchanges of views on practice and theoretical attitudes by important participants in recent biological debates. The book asks how biological models and theories are constructed, how concepts are chosen, and how different models can be articulated. It then explores some of the convergences between different models or theoretical frameworks. Confronting views on adaptive complexity are investigated, as well as the role of self-organization in evolution; niche construction meets developmental biology; the promises of the emergent field of ecological-evolutionary-developmental biology are examined. In sum, this book is a marvellous account of the dynamism of today’s theoretical biology.
Contents:
Foreword: Carving Nature at its Joints? (Richard Lewontin)
Chapter 1: Introduction (Anouk Barberousse, Michel Morange, Thomas Pradeu)
Chapter 2: Articulating Different Modes of Explanation: The Present Boundary in Biological Research (Michel Morange)
Chapter 3: Compromising Positions: The Minding of Matter (Susan Oyama)
Chapter 4: Abstractions, Idealizations, and Evolutionary Biology (Peter Godfrey-Smith)
Chapter 5: The Adequacy of Model Systems for Evo-Devo: Modeling the Formation of Organisms / Modeling the Formation of Society (Scott F. Gilbert)
Chapter 6: Niche Construction in Evolution, Ecosystems and Developmental Biology (John Odling-Smee)
Chapter 7: Novelty, Plasticity and Niche Construction: The Influence of Phenotypic Variation on Evolution (Kim Sterelny)
Chapter 8: The Evolution of Complexity (Mark A. Bedau)
Chapter 9: Self-Organization, Self-Assembly, and the Origin of Life (Evelyn Fox Keller)
Chapter 10: Self-Organization and Complexity in Evolutionary Theory, or, In this Life the Bread Always Falls Jammy Side Down (Michael Ruse)
Whereas experiments and computer simulations seem very different at first sight because the former, but not the latter, involve interactions with material properties, we argue that this difference is not so important with respect to validation, as far as epistemology is concerned. Major differences remain nevertheless from the methodological point of view. We present and defend this distinction between epistemology and methodology. We illustrate this distinction and related claims by comparing how experiments and simulations are validated in evolutionary studies, a domain in which both experiments in the lab and computer simulations are relatively new but mutually reinforcing.
Scientific models need to be investigated if they are to provide valuable information about the systems they represent. Surprisingly, the epistemological question of what enables this investigation has hardly been addressed. Even authors who consider the inferential role of models as central, like Hughes or Bueno and Colyvan, content themselves with claiming that models contain mathematical resources that provide inferential power. We claim that these notions require further analysis and argue that mathematical formalisms contribute to this inferential role. We characterize formalisms, illustrate how they extend our mathematical resources, and highlight how distinct formalisms offer various inferential affordances.
We analyze the effects of the introduction of new mathematical tools on an old branch of physics by focusing on lattice fluids, which are cellular automata-based hydrodynamical models. We examine the nature of these discrete models, the type of novelty they bring about within scientific practice and the role they play in the field of fluid dynamics. We critically analyze Rohrlich's, Fox Keller's and Hughes' claims about CA-based models. We distinguish between different senses of the predicates “phenomenological” and “theoretical” for scientific models and argue that it is erroneous to conclude, as they do, that CA-based models are necessarily phenomenological in any sense of the term. We conversely claim that CA-based models of fluids, though at first sight blatantly misrepresenting fluids, are in fact conservative as far as the basic laws of statistical physics are concerned, and no less theoretical than more traditional models in the field. Based on our case study, we propose a general discussion of the prospects of CA for modeling in physics. We finally emphasize that lattice fluids are not just exotic oddities but do bring about new advantages in the investigation of fluids' behavior.
Recent years have seen a notable increase in the production of scientific expertise by large multidisciplinary groups. The issue we address is how reports may be written by such groups in spite of their size and of formidable obstacles: complexity of subject matter, uncertainty, and scientific disagreement. Our focus is on the Intergovernmental Panel on Climate Change, unquestionably the best-known case of such collective scientific expertise. What we show is that the organization of work within the IPCC aims to make it possible to produce documents that are indeed expert reports. To do so, we first put forward the epistemic norms that apply to expert reports in general, that is, the properties that reports should have in order to be useful and to help decision-making. Section 2 claims that these properties are: intelligibility, relevance and accuracy. Based on this analysis, section 3 points to the difficulties of having IPCC reports satisfy these norms. We then show how the organization of work within the IPCC aims at, and to a large extent secures, intelligibility, relevance and accuracy, with the result that IPCC reports can be relied on for decision-making. Section 4 focuses on the fundamentals of the IPCC's work organization, that is, the division of labour within the IPCC, while section 5 investigates three frameworks that were introduced over the course of the functioning of the IPCC: the reviewing procedure of IPCC reports, the language that IPCC authors use to express uncertainty, and the Coupled Model Intercomparison Project. Concluding remarks are offered in section 6.
Convergence of model projections is often considered by climate scientists to be an important objective insofar as it may indicate the robustness of the models’ core hypotheses. Consequently, the range of climate projections from a multi-model ensemble, called “model spread”, is often expected to reduce as climate research moves forward. However, the successive Assessment Reports of the Intergovernmental Panel on Climate Change indicate no reduction in model spread, whereas it is indisputable that climate science has made improvements in its modelling. In this paper, after providing a detailed explanation of the situation, we describe an epistemological setting in which a steady model spread is not doomed to be seen as negative, and is indeed compatible with a desirable evolution of climate models taken individually. We further argue that, from the perspective of collective progress, as far as the improvement of the products of a multi-model ensemble is concerned, reduction of model spread is of lower priority than model independence.
Why are some models, like the harmonic oscillator, the Ising model, a few Hamiltonian equations in quantum mechanics, the Poisson equation, or the Lotka-Volterra equations, repeatedly used within and across scientific domains, whereas theories allow for many more modeling possibilities? Some historians and philosophers of science have already proposed plausible explanations. For example, Kuhn and Cartwright point to a tendency toward conservatism in science, and Humphreys emphasizes the importance of the intractability of what he calls “templates.” This paper investigates more systematically the reasons for this remarkable interdisciplinary recurrence. To this aim, the authors describe in more detail the phenomenon they focus on and review competing potential explanations. The authors disentangle the various assumptions underlying these explanations, develop the explanation based on sensitivity to computational constraints, and assess its relationships with the other analyzed explanations.
The use of diagrams is pervasive in theoretical physics. Together with mathematical formulae and natural language, diagrams play a major role in theoretical modeling. They enrich the expressive power of physicists and help them to explore new theoretical ideas. Diagrams are not only heuristic or pedagogical tools, but also tools that make it possible to develop the content of models into novel implications.
Physical theories today are highly mathematized, and what scientists manipulate in order to describe, predict and control phenomena are (among other things) equations containing numerous mathematical symbols. These mathematical objects have no physical meaning in themselves: they do not “speak” about the phenomena by themselves. An interpretation is required. What interests us in this article is thus the interpretation that a physical theory must receive in order to fulfill its role. We begin by making explicit a traditional distinction: the “poor” interpretation (a simple instrument for assigning to the symbols of the theory a physical meaning strictly limited to the results of experiments) differs from the “rich” interpretation (which composes a picture of the world compatible with the way the theory mathematically describes the results of experiments). Our aim in this article is to show that this distinction must be amended. We rely on the example of quantum mechanics, but the distinction is meant to hold for any physical theory.
Developmental Systems Theory (DST), presented by its proponents as a challenging approach in biology, is aimed at transforming the workings of the life sciences from both a theoretical and experimental point of view (see, in particular, Oyama 2000; Oyama et al. 2001). Even though some may have the impression that the enthusiasm surrounding DST has faded in very recent years, some of the key concepts, ideas, and visions of DST have in fact pervaded biology and philosophy of biology. It seems crucial to us both to establish which of these ideas are truly specific to DST, and to sift through these ideas in order to determine the criticisms they have drawn, or may draw (e.g., Sterelny et al. 1996; Griesemer 2000; Sterelny 2000; Kitcher 2001; Keller 2005; Waters 2007).
Cellular automata (CA)-based simulations are widely used in a great variety of domains, from statistical physics to social science. They allow for spectacular displays and numerical predictions. Are they for all that a revolutionary modeling tool, allowing for “direct simulation”, or for the simulation of “the phenomenon itself”? Or are they merely models “of a phenomenological nature rather than of a fundamental one”? How do they compare to other modeling techniques? In order to answer these questions, we present a systematic exploration of CA’s various uses.
Scientific inquiry may share the same evolutionary determinants as people's ordinary understanding, and the affect-laden intuitions that shape moral judgments also play a decisive role in decision-making, planning, and scientific reasoning. Therefore, if ordinary understanding does differ from scientific inquiry, the reason does not reside in the fact that the former (but not the latter) is endowed with moral considerations.
Were Maxwell and Boltzmann irrational to develop statistical mechanics even though it was empirically refuted by the specific heats problem? My analysis of this historical episode departs from current proposals about belief change. I first give a detailed description of Maxwell's and Boltzmann's epistemic states in the years they were working on statistical mechanics and then make some methodological proposals in epistemology that would account for the complexity of this case.
In order to study the evolution of global climate, computer simulation is required. The use of computer simulation has been criticized since its results are uncertain. In this paper, I present the various components of the simulations used in climate studies and I analyze those components from an epistemological point of view. I conclude by showing that these simulations obey the scientific requirements usually governing contemporary science.
How should statistical and probabilistic statements be understood? This question concerns science as much as our everyday life. We constantly attribute degrees of possibility to the events we envisage, and we draw inferences from these attributions. How are these inferences justified? What is the meaning of their conclusions? This book attempts to give a philosophical answer to these questions by exploring various aspects of the philosophy of probability. Probability has, moreover, played a considerable role in scientific activity for nearly two centuries. An inquiry into its meaning must therefore take into account its scientific uses as well as its everyday uses. To this end, the example of statistical mechanics is studied in detail, from both a historical and a philosophical point of view. This study makes it possible to present the solutions that have been given, in this particular area of physics, to the problems of the philosophy of probability.
Bayesian methods are currently undergoing a deep transformation due to the use of computing power. The aim of this chapter is to analyze this transformation by examining a specific example: the use of Bayesian methods in climate science.