The purpose of this paper is to examine in detail a particularly interesting pair of first-order theories. In addition to clarifying the overall geography of notions of equivalence between theories, this simple example yields two surprising conclusions about the relationships that theories might bear to one another. In brief, we see that theories lack both the Cantor-Bernstein and co-Cantor-Bernstein properties.
This is a chapter of the planned monograph "Out of Nowhere: The Emergence of Spacetime in Quantum Theories of Gravity", co-authored by Nick Huggett and Christian Wüthrich and under contract with Oxford University Press. (More information at www.beyondspacetime.net.) This chapter investigates the meaning and significance of string theoretic dualities, arguing they reveal a surprising physical indeterminateness to spacetime.
This paper represents a philosophical experiment inspired by the formalist philosophy of mathematics. In the formalist picture of cognition, the principal act of knowledge generation is represented as tentative postulation – as the introduction of a new knowledge construct followed by exploration of the consequences that can be derived from it. Depending on the result, the new construct may be accepted as normative, rejected, modified, etc. Languages and means of reasoning are generated and selected in a similar process. In the formalist picture, all kinds of “truth” are detected intra-theoretically. Some knowledge construct may be considered “true” if it is accepted in a particular normative knowledge system. Some knowledge construct may be considered persistently true if it remains invariant during the evolution of some knowledge system for a sufficiently long time. And, if you wish, you may consider some knowledge construct absolutely true if you do not intend to abandon it in your knowledge system. And finally, in the formalist picture, all kinds of ontologies generated by humans can be demystified by reconstructing them within the basic solipsist ontology simply as hypothetical branches of it.
First, I propose a new argument in favor of the Dappled World perspective introduced by Nancy Cartwright. There are systems for which detailed models cannot exist in the natural world, and this has nothing to do with the limitations of human minds or technical resources. The limitation is built into the very principle of modeling: we are trying to replace one system with another. In full detail, this may be impossible. Secondly, I try to refine the Dappled World perspective by applying the correct distinction between models and theories. At the level of models, because of the above-mentioned limitations, we will always have only a patchwork of models, each very restricted in its scope of application. And at the level of theories, we will never have a single complete Theory of Everything allowing us, without additional postulates, to generate all the models we may need for surviving in this world.
Much debate about scientific realism concerns the issue of whether it is compatible with theory change over time. Certain forms of ‘selective realism’ have been suggested with this in mind. Here I consider a closely related challenge for realism: that of articulating how a theory should be interpreted at any given time. In a crucial respect the challenges posed by diachronic and synchronic interpretation are the same; in both cases, realists face an apparent dilemma. The thinner their interpretations, the easier realism is to defend, but at the cost of less substantial commitment. The more substantial their interpretations, the more difficult they are to defend. I consider this worry in the context of the Standard Model of particle physics. Examining some selective realist attempts at interpretation, I argue that realism is, in fact, compatible with different commitments on the spectrum of thinner to more substantial, thus mitigating the dilemma.
Normative political theorists frequently compare hypothetical scenarios for the purpose of identifying reasons to prefer one kind of institution to alternatives. We examine three types of "unfair" comparisons and the reasoning errors associated with each. A theorist makes an _obscure comparison_ when one (or more) of the alternatives under consideration is underspecified; a theorist makes a _mismatched comparison_ when they fail to hold fixed the relevant contextual factors while comparing alternatives; and a theorist makes an _irrelevant comparison_ when they compare alternatives assuming contextual factors that differ in important respects from those they "should" assume given their theoretical aims. We then introduce the notion of a modeling mindset and show how this mindset can help theorists detect and avoid the three types of error. We conclude with a reconstruction of Cohen's (2009) camping trip thought experiment to illustrate the approach.
It is often claimed that one can avoid the kind of underdetermination that is a typical consequence of symmetries in physics by stipulating that symmetry-related models represent the same state of affairs (Leibniz Equivalence). But recent commentators (Dasgupta 2011; Pooley 2021; Pooley and Read 2021; Teitel 2021a) have responded that claims about the representational capacities of models are irrelevant to the issue of underdetermination, which concerns possible worlds themselves. In this paper I distinguish two versions of this objection: (1) that a theory’s formalism does not (fully) determine the space of physical possibilities, and (2) that the relevant notion of possibility is not physical possibility. I offer a refutation of each.
Although the idiom “genesis and structure” is usually associated with the rise of structuralism in the late 1950s and early 1960s, the two notions are arguably among the most persistent methods in the history of modern philosophy. This article outlines the emergence of “genetic epistemology” in the seventeenth century, when the seemingly antithetical character of the conceptual pair was reworked into a productive epistemological theory, especially in Descartes, Hobbes, Spinoza, and Leibniz, who increasingly used diachronic (genetic) narratives to explain the synchronic (structural) features in their theories. Against Cassirer, I argue that it was Descartes rather than Hobbes who first presented structural issues genetically. In Descartes’ natural philosophy, his frequent claims that showing how a thing is produced reveals its true nature foreshadow precisely what Hobbes and Isaac Barrow later describe as causal definitions of geometric figures, in which the process of ideal generation by motion is what constitutes the very essence of a figure. I link this method to the historicizing discourse on origins in the Enlightenment and conclude by suggesting that there is a trace of Platonic idealism in genetic epistemology.
This paper presents a simple pair of first-order theories that are not definitionally (nor Morita) equivalent, yet are mutually conservatively translatable and mutually 'surjectively' translatable. We use these results to clarify the overall geography of standards of equivalence and to show that the structural commitments that theories make behave in a more subtle manner than has been recognized.
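To fix ideas, here is a schematic statement of the property at issue (the notation is ours, not the paper's): writing $T_1 \trianglelefteq T_2$ for "$T_1$ is conservatively translatable into $T_2$" and $\equiv$ for definitional (or Morita) equivalence, the Cantor–Bernstein property for theories would read

$$T_1 \trianglelefteq T_2 \ \text{ and } \ T_2 \trianglelefteq T_1 \ \Longrightarrow\ T_1 \equiv T_2.$$

The pair of theories presented here is a counterexample: each is translatable into the other in the stated senses, yet the two are not definitionally (nor Morita) equivalent.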
Previously, I (Boesch 2017) described a notion called “representational licensing”—the set of activities of scientific practice by which scientists establish the intended representational use of a vehicle. In this essay, I expand and develop this concept of representational licensing. I begin by showing how the concept is of value for both pragmatic and substantive approaches to scientific representation. Then, through the examination of a case study of the Mississippi River Basin Model, I point out and explain some of the activities of representational licensing that help to establish the representational nature of this model. Throughout the exploration of the case study, I pause to identify some important lessons which apply more generally about the nature of representational licensing in science.
It is a common view among philosophers of science that theoretical virtues (also known as epistemic or cognitive values), such as simplicity and consistency, play an important role in scientific practice. In this article, I set out to study the role that theoretical virtues play in scientific practice empirically. I apply the methods of data science, such as text mining and corpus analysis, to study large corpora of scientific texts in order to uncover patterns of usage. These patterns of usage, in turn, might shed some light on the role that theoretical virtues play in scientific practice. Overall, the results of this empirical study suggest that scientists invoke theoretical virtues explicitly, albeit rather infrequently, when they talk about models (less than 30%), theories (less than 20%), and hypotheses (less than 15%) in their published works. To the extent that they are mentioned in scientific publications, the results of this study suggest that accuracy, consistency, and simplicity are the theoretical virtues that scientists invoke more frequently than the other theoretical virtues tested in this study. Interestingly, however, depending on whether they talk about hypotheses, theories, or models, scientists may invoke one of those theoretical virtues more than the others.
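As a rough sketch of the kind of corpus analysis described, the snippet below counts sentence-level co-occurrences of virtue terms with "model", "theory" and "hypothesis"; the corpus file name, term lists and sentence splitter are illustrative assumptions, not the study's actual protocol.

```python
# Illustrative corpus pass: count sentences in which a theoretical-virtue term
# co-occurs with "model", "theory" or "hypothesis". Term lists are assumptions.
import re
from collections import Counter

VIRTUES = {"accuracy", "consistency", "simplicity", "scope", "fruitfulness"}
CONSTRUCTS = {"model", "theory", "hypothesis"}

def cooccurrence_counts(text: str) -> Counter:
    counts = Counter()
    for sentence in re.split(r"(?<=[.!?])\s+", text.lower()):
        words = set(re.findall(r"[a-z]+", sentence))
        for construct in CONSTRUCTS & words:
            for virtue in VIRTUES & words:
                counts[(construct, virtue)] += 1
    return counts

with open("corpus.txt", encoding="utf-8") as f:  # hypothetical corpus file
    for (construct, virtue), n in cooccurrence_counts(f.read()).most_common():
        print(f"{construct} + {virtue}: {n}")
```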
Recently, the distinction between nature and technology has been increasingly questioned. We are told that the changes to nature made possible by technology have reached such dimensions that it is no longer possible to clearly differentiate between nature and technology. Against the critical voices, I argue for the possibility and necessity of applying a version of the distinction that I call “classical”. I begin by examining selected historical origins of this distinction, thereby discussing some of its present-day critics – namely Hans Blumenberg, Bruno Latour, Donna Haraway and Philippe Descola. In the next section, I introduce a concept of the hybrid of nature and technology, which is suitable for delimiting and differentiating the scope of the classical distinction. I also need the concept of the hybrid in order to assess the current tendencies in the development of the nature-technology relation, which are the subject of the last section. Against the background of these tendencies, I conclude by compiling some arguments in favour of maintaining the classical distinction between nature and technology.
We critically engage two traditional views of scientific data and outline a novel philosophical view that we call the pragmatic-representational view of data. On the PR view, data are representations that are the product of a process of inquiry, and they should be evaluated in terms of their adequacy or fitness for particular purposes. Some important implications of the PR view for data assessment, related to misrepresentation, context-sensitivity, and complementary use, are highlighted. The PR view provides insight into the common but little-discussed practices of iteratively reusing and repurposing data, which result in many datasets’ having a phylogeny—an origin and complex evolutionary history—that is relevant to their evaluation and future use. We relate these insights to the open-data and data-rescue movements, and highlight several future avenues of research that build on the PR view of data.
Abstraction and generalization are two processes of reasoning that have a special role in the construction of scientific theories and models. They have been important parts of the scientific method ever since the nineteenth century. A philosophical and historical analysis of scientific practices shows how abstraction and generalization found their way into the theory of the logic of science of the nineteenth-century philosopher Charles S. Peirce. Our case studies include the scientific practices of Francis Galton and John Herschel, who introduced composite photographs and graphical methods, respectively, as technologies of generalization and thereby influenced Peirce’s logic of abstraction. Herschel’s account of generalization is further supported by William Whewell, who was very influential on Peirce. By connecting Herschel’s scientific technology of abstraction to Peirce’s logical technology of abstraction—namely, diagrams—we highlight the role of judgments in scientific observation by hypostatic abstractions. We also relate Herschel’s discovery-driven logic of science and Peirce’s open-ended diagrammatic logic to the use of models in science. Ultimately, Peirce’s theory of abstraction is a case of showing how logic applies to reality.
Since the 1970s, the application of microprocessors in industrial machinery and the development of computer systems have transformed the manufacturing landscape. The rapid integration and automation of production systems have outpaced the development of suitable human design criteria, creating a deepening gap between humans and systems in which the human was seen as an important source of errors and disruptions. Today, the situation seems different: the scientific and public debate about the concept of Industry 4.0 has raised awareness about the central role humans have to play in manufacturing systems, the design of which must be considered from the very beginning. The future of industrial systems, as represented by Industry 4.0, will rely on the convergence of several research fields such as Intelligent Manufacturing Systems (IMS), Cyber-Physical Systems (CPS), and the Internet of Things (IoT), but also socio-technical fields such as social approaches within technical systems. This article deals with different human social dimensions associated with CPS and IoT and focuses on their conceptual evolution regarding automated production systems’ sociability, notably by bringing humans back into the loop. This paper thereby aims to take stock of current research trends to show the importance of integrating human operators as part of a socio-technical system based on autonomous and intelligent products or resources. Consequently, different models of sociability as a way to integrate humans in the broad sense and/or to develop future automated production systems have been identified from the literature and analysed.
The philosophy of science community mourns the loss of Margaret Catherine Morrison, who passed away on January 9, 2021, after a long battle with cancer. Margie, as she was known to all who knew her, was highly regarded for her influential contributions to the philosophy of science, particularly her studies of the role of models and simulations in the natural and social sciences. These contributions made her a world-leading philosopher of science, instrumental in shifting philosophers' attention from the structure of scientific theories to the practice of science. Her sophisticated studies of the function of models in scientific practice drew on detailed knowledge of the theories and experiments of physics as well as the history of physics. In emphasizing the autonomy of scientific models and their interventional character, her insights had some affinity with Cartwright's and Hacking’s views on phenomenological laws, entity realism, the instrumentalist interpretation of scientific theories, and the disunity of science. But Morrison’s approach was distinguished by the conviction that the existence of unobservable entities cannot be defended independently of the theories that support their evidence, and that scientific practice cannot be adequately understood without examining the reasons for theory unification.
In this chapter, we argue that in order to understand the interdisciplinary and transdisciplinary dialectics in sustainability science, it is useful to see sustainability science as a kind of management science, and then to highlight the hard-soft distinction in systems thinking. First, we argue that the commonly made natural-social science dichotomy is relatively unimportant and unhelpful. We then outline the differences between soft and hard systems thinking as a more relevant and helpful distinction, mainly as a difference between the perspectives taken toward models in systemic modeling. We also illustrate that the distinction is methodologically useful to advance sustainability science by enabling us (i) to suggest novel ways of using existing theoretical, experimental, and computational resources of the sciences for renewable resource management, and (ii) to disentangle disciplinary disagreements in climate science.
This paper contains a detailed exposition and analysis of The Philosophy of “As If” proposed by Hans Vaihinger in his book published in 1911. However, the principal chapters of the book (Part I) reproduce Vaihinger’s Habilitationsschrift, which was written during the autumn and winter of 1876. Part I is extended by Part II, based on texts written during 1877–1878, when Vaihinger began preparing the book. The project was interrupted, resuming only in the 1900s. My conclusion is based exclusively on the texts written in 1876–1878: Vaihinger was, decades ahead of his time, a philosopher of modeling in the modern sense – a brilliant achievement for the 1870s! And, in the demystification of such principal aspects of cognition as truth, understanding and causality, is he not still ahead of many of us? According to Vaihinger, what we set beyond sensations is our invention (fiction), the correspondence of which with reality cannot (and need not) be verified in the mystical, absolute sense many people expect.
In this paper I challenge two widespread convictions about unification in physics: unification is an aim of physics and unification is driven by metaphysical or metatheoretical presuppositions. I call these external explanations of why there is unification in physics. Against this, I claim that unification is a by-product of physical research and unification is driven by basic methodological strategies of physics alone. I call this an internal explanation of why there is unification in physics. To support my claims, I will investigate the actual practice undertaken in physics in paradigmatic examples of unification.
Formal criteria of theoretical equivalence are mathematical mappings between specific sorts of mathematical objects, notably including those objects used in mathematical physics. Proponents of formal criteria claim that results involving these criteria have implications that extend beyond pure mathematics. For instance, they claim that formal criteria bear on the project of using our best mathematical physics as a guide to what the world is like, and also have deflationary implications for various debates in the metaphysics of physics. In this paper, I investigate whether there is a defensible view according to which formal criteria have significant non-mathematical implications, of these sorts or any other, reaching a chiefly negative verdict. Along the way, I discuss various foundational issues concerning how we use mathematical objects to describe the world when doing physics, and how this practice should inform metaphysics. I diagnose the prominence of formal criteria as stemming from contentious views on these foundational issues, and endeavor to motivate some alternative views in their stead.
Born on October 26, 1946, Carlos Ulises Moulines studied Physics, Philosophy and Psychology at the University of Barcelona, obtaining a Degree in Philosophy from that same University and a PhD in Philosophy from the University of Munich, supervised by Wolfgang Stegmüller. He is one of the most outstanding contemporary philosophers of science and one of the most prominent exponents of metascientific structuralism. In this interview he talks about biographical aspects, philosophy of science in general and specific topics, as well as presenting his impressions on some topics related to academic life.
The epistemic probability of A given B is the degree to which B evidentially supports A, or makes A plausible. This paper is a first step in answering the question of what determines the values of epistemic probabilities. I break this question into two parts: the structural question and the substantive question. Just as an object’s weight is determined by its mass and gravitational acceleration, some probabilities are determined by other, more basic ones. The structural question asks what probabilities are not determined in this way—these are the basic probabilities which determine values for all other probabilities. The substantive question asks how the values of these basic probabilities are determined. I defend an answer to the structural question on which basic probabilities are the probabilities of atomic propositions conditional on potential direct explanations. I defend this against the view, implicit in orthodox mathematical treatments of probability, that basic probabilities are the unconditional probabilities of complete worlds. I then apply my answer to the structural question to clear up common confusions in expositions of Bayesianism and shed light on the “problem of the priors.”
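The weight analogy can be put in symbols (a standard illustration of "determination", not the paper's own formalism): just as $w = mg$ fixes an object's weight from its mass and gravitational acceleration, the law of total probability fixes the probability of a proposition $A$ from more basic conditional and unconditional ones,

$$P(A) \;=\; \sum_i P(A \mid E_i)\,P(E_i),$$

where, on the view defended here, the basic quantities would be the probabilities of atomic propositions conditional on their potential direct explanations $E_i$.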
Many phenomena in the natural world are complex, so scientists study them through simplified and idealised models. Philosophers of science have sought to explain how these models relate to the world. On most accounts, models do not represent the world directly, but through target systems. However, our knowledge of target systems is incomplete. First, what is the process by which target systems come about? Second, what types of entity are they? I argue that the basic conception of target systems, on which other conceptions depend, is as parts of the world. I outline the process of target system specification and show that it is a crucial step in modelling. I also develop an account of target system evaluation, based on aptness. Paying close attention to target system specification and evaluation can help scientists minimise the frequency and extent of mistakes when they are using models to investigate phenomena in complex real-world systems.
Three metascientific concepts that have been the object of philosophical analysis are the concepts of law, model and theory. The aim of this article is to present the explication of these concepts, and of their relationships, made within the framework of Sneedean or Metatheoretical Structuralism (Balzer et al. 1987), and of their application to a case from the realm of biology: Population Dynamics. The analysis carried out will make it possible to support, contrary to what some philosophers of science in general and of biology in particular hold, the following claims: a) there are "laws" in biological sciences, b) many of the heterogeneous and different "models" of biology can be accommodated under some "theory", and c) this is exactly what confers great unifying power on biological theories.
A crucial aspect of scientific realism is what we mean by “true”. In Luk’s theory and model of scientific study, a theory can be believed to be “true” but a model is only accurate. Therefore, what do we mean by a “true” theory in scientific realism? Here, we focus on exploring the notion of truth through some thought experiments, and we come up with the idea that truth is related to what we mean by “the same”. This has repercussions for the repeatability of experiments and the predictive power of scientific knowledge. Apart from sameness, we also find that truth is related to the granularity of observation, the limit of detection, the distinguishability of the objects in the theory, the simultaneous measurement of objects/processes, the consistency of the theory, and the one-to-one correspondence between terms/events and objects/processes, respectively. While there is no guarantee that we can arrive at the final “true” theory, we have a process/procedure, with more and more experiments together with our own ingenuity, to direct us towards such a “true” theory. For quantum mechanics, since a particle is also regarded as a wave, quantum mechanics cannot be considered a true theory based on the correspondence theory of truth. Failing this, truth may be defined by the coherence theory of truth, which is similar to the coherence of beliefs. However, quantum mechanics may not be believed to be a true theory based on the coherence theory of truth either, because wave properties and particle properties may contradict each other. Further research is needed to address this problem if we want to regard quantum mechanics as a “true” theory.
Fisher criticised the Neyman-Pearson approach to hypothesis testing by arguing that it relies on the assumption of “repeated sampling from the same population.” The present article considers the responses to this criticism provided by Pearson and Neyman. Pearson interpreted alpha levels in relation to imaginary replications of the original test. This interpretation is appropriate when test users are sure that their replications will be equivalent to one another. However, by definition, scientific researchers do not possess sufficient knowledge about the relevant and irrelevant aspects of their tests and populations to be sure that their replications will be equivalent to one another. Pearson also interpreted the alpha level as a personal rule that guides researchers’ behavior during hypothesis testing. However, this interpretation fails to acknowledge that the same researcher may use different alpha levels in different testing situations. Addressing this problem, Neyman proposed that the average alpha level adopted by a particular researcher can be viewed as an indicator of that researcher’s typical Type I error rate. Researchers’ average alpha levels may be informative from a metascientific perspective. However, they are not useful from a scientific perspective. Scientists are more concerned with the error rates of specific tests of specific hypotheses, rather than the error rates of their colleagues. It is concluded that neither Neyman nor Pearson adequately rebutted Fisher’s “repeated sampling” criticism. Fisher’s significance testing approach is briefly considered as an alternative to the Neyman-Pearson approach.
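For concreteness, here is a minimal simulation of the "repeated sampling" picture under dispute (the test, sample size and alpha level are arbitrary illustrative choices): when the null hypothesis is true and replications really are equivalent, a test run at alpha = 0.05 rejects in roughly 5% of replications.

```python
# Illustrative long-run Type I error rate of a two-sided z-test at alpha = 0.05,
# assuming a true null (mean 0, known sigma 1) and equivalent replications.
import random
import statistics

Z_CRIT = 1.96                 # two-sided critical value for alpha = 0.05
N, REPLICATIONS = 30, 10_000

rejections = 0
for _ in range(REPLICATIONS):
    sample = [random.gauss(0.0, 1.0) for _ in range(N)]  # null is true
    z = statistics.mean(sample) * N ** 0.5               # sigma = 1
    if abs(z) > Z_CRIT:
        rejections += 1

print(f"Empirical Type I error rate: {rejections / REPLICATIONS:.3f}")  # ~0.05
```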
Whereas experiments and computer simulations seem very different at first view because the former, but not the latter, involve interactions with material properties, we argue that this difference is not so important with respect to validation, as far as epistemology is concerned. Major differences nevertheless remain from the methodological point of view. We present and defend this distinction between epistemology and methodology. We illustrate this distinction and related claims by comparing how experiments and simulations are validated in evolutionary studies, a domain in which both experiments in the lab and computer simulations are relatively new but mutually reinforcing.
In this essay, I examine the role of dissimilarity in scientific representation. After briefly reviewing some of the philosophical literature which places a strong emphasis on the role of similarity, I turn to examine some work from Carroll and Borges which demonstrates that perfect similarity is not valuable in the representational use of maps. Expanding on this insight, I go on to argue that this shows that dissimilarity is an important part of the representational use of maps—a point I then extend to the case of scientific representation. Relying on some work from Latour, I argue that dissimilarity plays an essential role in representational practice, by providing novel forms of manipulation and use which afford the achievement of various epistemic and nonepistemic aims. After showing how this point connects to some other literature on scientific representation, I discuss some examples of the value of dissimilarity in the use of representational vehicles. Overall, I argue that to understand scientific representation, we will need to consider more than just similarity. We will need to explore dissimilarities as well.
This peer-reviewed philosophy of science book examines the scope, purpose and methodology of science, and areas of the universe, reality and knowledge that lie beyond its scope. Science itself and scientists themselves say that there are important areas, topics and questions, including within and about science, that cannot be answered, and often cannot even be addressed, by science’s tools of sensory observation, empirical testing and logic.
In recent years, several authors have called to ground descriptive and normative decision theory on neuro-psychological measures of utility. In this paper, I combine insights from the best available neuro-psychological findings, leading philosophical conceptions of welfare and contemporary decision theory to rebut these prominent calls. I argue for two claims of general interest to philosophers, choice modellers and policy makers. First, severe conceptual, epistemic and evidential problems plague ongoing attempts to develop accurate and reliable neuro-psychological measures of utility. And second, even if these problems are solved, neuro-psychological measures of utility lack the potential to inform welfare analyses and policy evaluations.
Scientific theories are used for a variety of purposes. For example, physical theories such as classical mechanics and electrodynamics have important applications in engineering and technology, and we trust that this results in useful machines, stable bridges, and the like. Similarly, theories such as quantum mechanics and relativity theory have many applications as well. Beyond that, these theories provide us with an understanding of the world and address fundamental questions about space, time, and matter. Here we trust that the answers scientific theories give are reliable and that we have good reason to believe that the features of the world are similar to what the theories say about them. But why do we trust scientific theories, and what counts as evidence in favor of them?
Some experiments are risky in that they cannot repeatedly produce a certain phenomenon at will for study, because the scientific knowledge of the process generating the uncertain phenomenon is poorly understood or may directly contradict existing scientific knowledge. These experiments may have great impact not just on the scientific community but on mankind in general. Banning them from study may impose a great opportunity cost on societies, but accepting them runs the risk that scientists are doing junk science. How to make an informed decision to accept/reject such studies scientifically, for the mainstream scientific community, is of great importance to mankind. Here, we propose a statistical methodology to handle the situation. Specifically, we consider the likelihood of not observing the phenomenon after n trials, so that it is statistically significant to have a nil result. Consequently, we reject the hypothesis that there is some probability that we observe the phenomenon.
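A minimal sketch of the proposed test (the per-trial probability and significance level below are illustrative assumptions): if the phenomenon would occur with probability at least p on each independent trial, the chance of observing it in none of n trials is at most (1 − p)^n, so once that bound falls below the significance level the nil result is statistically significant and the hypothesis is rejected.

```python
# Sketch: how many consecutive null trials make a nil result significant?
# Under H0 ("the phenomenon occurs with probability >= p per trial"),
# P(no observation in n independent trials) <= (1 - p)**n.
import math

def trials_for_significant_nil(p: float, alpha: float = 0.05) -> int:
    """Smallest n with (1 - p)**n <= alpha, i.e. enough null trials to reject H0."""
    return math.ceil(math.log(alpha) / math.log(1.0 - p))

for p in (0.5, 0.1, 0.01):   # illustrative per-trial probabilities
    n = trials_for_significant_nil(p)
    print(f"p = {p}: reject H0 after {n} null trials ((1-p)^n = {(1 - p) ** n:.4f})")
```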
Was there a concept of data before the so-called ‘data revolution’? This paper contributes to the history of the concept of data by investigating uses of the term ‘data’ in texts of the Royal Society's Philosophical Transactions for the period 1665–1886. It surveys how the notion enters the journal as a technical term in mathematics, and charts how over time it expands into various other scientific fields, including Earth sciences, physics and chemistry. The paper argues that in these texts the notion of data is not used merely as a rhetorical category, and also cannot strictly be identified with the category of evidence. Instead, the notion comes with an associated epistemic structure, one that is in line with its development from an early mathematical use.
Taking the user’s role and features as milestones for an approach to scientific representation has become a growing trend. We shall investigate the implications that pragmatics brings to the relevant debate. Proponents of pragmatic approaches hold that questions such as ‘how does an object represent another’ or ‘which features of a certain object represent the target of the representation, and in what way’ can be answered only within the given context of the representation’s use. Thus, attention is drawn to the intentionality of the representation, in contrast to the semantic tradition, according to which the representational function is based on morphic relations between the representation and the represented object. Given that scientific representations stand in as surrogates for objects and phenomena in our studies, they should reproduce their aspects, relations and interactions, possessing the appropriate features. Therefore, we hold that the user’s intention is not enough to build the representational relation on. We claim that a) a sustainable and successful theory of scientific representations cannot be grounded on pragmatics, b) pragmatic approaches undermine the objectivity of the knowledge inferred from representations, and c) the important role of the cognizing subject in a theory of scientific representation can be rescued without the burden that comes with pragmatic approaches.
Four developmental tendencies in the relation between nature and technology affect industrial societies as a whole: 1. the increasing remoteness of technology from nature, 2. the increasing closeness of technology to nature, 3. the proliferation of hybrid states of nature and technology, and 4. the increasing depth to which technology penetrates nature. Against the background of these partly opposing tendencies, the limits of technicization in industrial societies cannot be discussed in general, but only with respect to particular contexts. Among these is the lifeworld, a non-professional and private sphere of experience that still permits a culturally effective distinction between nature and technology. Two examples are discussed: the perception of the living body, which proves to be the lifeworld's center of nature and reacts sensitively to technicization, and the limits of the technicization of reproduction. In conclusion, reasons are given why the lifeworld has so far remained at an astonishing distance from technicization, even though it is its preferred object.
Recent philosophical analyses of the epistemic dimension of images in the sciences show a certain trend in acknowledging potential roles of these images beyond their merely decorative or pedagogical functions. We argue, however, that this new debate has so far paid little attention to a special type of picture, which we call the ‘visual metaphor’, and to its versatile heuristic potential in organizing data, supporting communication, and guiding research, modeling, and theory formation. Based on a case study of Conrad Hal Waddington’s epigenetic landscape images in biology, we develop a descriptive framework applicable to heuristic roles of various visual metaphors in the sciences.
Case studies of science concerning the interpretation of specific theories and the nature of theory change over time are often presented as evidence for or against forms of selective realism: versions of scientific realism that advocate belief in connection with certain components of theories as opposed to their content as a whole. I consider the question of how probative case studies can be in this sphere, focusing on two prominent examples of selectivity: explanationist realism, which identifies realist commitment with components of theories that are putatively required to explain their empirical success; and entity realism, which identifies realist commitment with certain putatively causally efficacious entities. I argue that while case studies are essential to debates about these positions, they are not compelling in the way that their intended use suggests. Regarding explanationism, concerns about the “neutrality” of historical evidence are ultimately indefeasible. Regarding entity realism, arguments for and against naturally dissolve into disputes about the reference of theoretical terms which are insulated from the details of cases. I conclude by suggesting that the morals of this discussion extend to other forms of selective realism, namely structural realism and semirealism.
The epistemic status of Natural Selection has intrigued biologists and philosophers from the very beginning of the theory to the present day. One prominent contemporary example is Elliott Sober, who claims that Natural Selection, and some other theories in biology, and maybe in economics, are peculiar in including explanatory models/conditionals that are a priori in a sense in which explanatory models/conditionals in Classical Mechanics and most other standard theories are not. In this paper, by analyzing what we take to be the four possible interpretations of Sober’s claim, we argue that, terminological preferences aside, the possible senses in which explanatory models in Natural Selection can qualify, or include elements that can qualify, as a priori also apply to Classical Mechanics and other standard, highly unified theories.
This is the outline: 1. Introduction 2. Theoretical understanding – 2.1 Conceptual dynamism and the a priori 2.2 The conceptual horizon – 3. Understanding and singularity 4. The production of significance 5. The presence of mystery 6. The problem of substantiality: the one and the multiple – 6.1 The notion of an implicit order.
This paper presents an artifactual approach to models that also addresses their fictional features. It first discusses the imaginary accounts of models and fiction that set model descriptions apart from imagined objects, concentrating on the latter (Frigg in Synthese 172:251–268, 2010; Frigg and Nguyen in The Monist 99:225–242, 2016; Godfrey-Smith in Biol Philos 21:725–740, 2006; Godfrey-Smith in Philos Stud 143:101–116, 2009). While the imaginary approaches accommodate surrogative reasoning as an important characteristic of scientific modeling, they simultaneously raise difficult questions concerning how the imagined entities are related to actual representational tools, coordinated among different scientists, and coordinated with real-world phenomena. The artifactual account focuses, in contrast, on the culturally established external representational tools that enable, embody, and extend scientific imagination and reasoning. While there are commonalities between models and fictions, it is argued that the focus should be on the fictional uses of models rather than on considering models as fictions.
The aim of this article is to outline the theory of a historical process developed within the framework of the Poznań School of Methodology, mainly by Leszek Nowak and a team of his co-workers. In the first part of the paper, the meta-philosophical and meta-theoretical assumptions of the Poznań school are reconstructed and juxtaposed with the relevant assumptions of Western analytical Marxism. In the central part of the paper, the main ideas of the adaptive reconstruction of historical materialism and of non-Marxian historical materialism are presented. In its final part, some problems of the reception of Leszek Nowak’s theory of historical process in the Polish People’s Republic and the Third Republic are discussed.
In his famous article “The Unreasonable Effectiveness of Mathematics in the Natural Sciences” Eugene Wigner argues for a unique tie between mathematics and physics, invoking even religious language: “The miracle of the appropriateness of the language of mathematics for the formulation of the laws of physics is a wonderful gift which we neither understand nor deserve”. The possible existence of such a unique match between mathematics and physics has been extensively discussed by philosophers and historians of mathematics. Whatever the merits of this claim are, a further question can be posed with regard to mathematization in science more generally: What happens when we leave the area of theories and laws of physics and move over to the realm of mathematical modeling in interdisciplinary contexts? Namely, in modeling the phenomena specific to biology or economics, for instance, scientists often use methods that have their origin in physics. How is this kind of mathematical modeling justified?
This paper presents a theory of scientific study which is regarded as a social learning process of scientific knowledge creation, revision, application, monitoring and dissemination with the aim of securing good quality, general, objective, testable and complete scientific knowledge of the domain. The theory stipulates the aim of scientific study that forms the basis of its principles. It also makes seven assumptions about scientific study and defines the major participating entities. It extends a recent process model of scientific study into a detailed interaction model as this process model already addresses many issues of philosophy of science. The detailed interaction model of scientific study provides a common template of scientific activities for developing logical models in different scientific disciplines, or alternatively for developing ontologies of different scientific disciplines. Differences between research and scientific studies are discussed, and a possible way to develop a scientific theory of scientific study is described.
The central question of this thesis is how one can learn about particular targets by using models of those targets. A widespread assumption is that models have to be representative models in order to foster knowledge about targets. Thus the thesis begins by examining the concept of representation from an epistemic point of view and supports an account of representation that does not distinguish between representation simpliciter and adequate representation. Representation, understood in the sense of a representative model, is regarded as a success term. That is, a representative model is one relatum in a relation of adequate representation (Chapter 2). When a representative model represents a target, it allows users of this model to learn something about the target. It is argued that a representative model has this epistemic function because it shares relevant features with the target. This presupposes a similarity view of representation. Similarity views of representation face serious objections, which will be rebutted (Chapter 3). One way of spelling out a similarity view of representation is to defend an indirect view of representation. In this thesis, which does not argue for an indirect view, it is assumed that the indirect view is a good option, if not the best, for articulating the similarity view. It is demonstrated how such an indirect view can be expanded to account for cases of technological modeling. A case study in bioengineering is used to show that the indirect view of representation must acknowledge a distinction between two directions of fit in relations between vehicles and targets. In this context the notion of design is interpreted as a relation between a vehicle and a target, thereby connecting ideas from philosophy of science with ideas from philosophy of technology (Chapter 4). Fictionalist accounts of models are intended to tackle the issue of the ontology of models. In this thesis, however, two prominent fictionalist accounts are discussed from an epistemological point of view in light of the central question regarding how one can learn about targets by using models. This question is addressed from the standpoint of Waltonian fictionalism. The result of the discussion is that the two discussed Waltonian fictionalist accounts cannot sufficiently answer the question. These accounts are criticized for their inability to deliver a satisfactory epistemology of representation (Chapter 5). Although Waltonian fictionalism is criticized, the present thesis also shows that its foundational theory, the theory of make-believe, can nevertheless be used to account for the distinction between projections and predictions that is made by the Intergovernmental Panel on Climate Change (Chapter 6).
Regarding the dichotomy between applied science and pure science, there are two apparently paradoxical facts. First, they are distinguishable. Second, the outcomes of pure science (e.g. scientific theories and models) are applicable to producing the outcomes of applied science (e.g. technological artefacts) and vice versa. Addressing the functional roles of applied and pure science, namely to produce design representations and science representations, respectively, I propose a new characterisation of the dichotomy that explains these two facts.
There is nowadays consensus in the community of didactics of science regarding the need to include the philosophy of science in didactical research, science teacher education, curriculum design, and the practice of science education in all educational levels. Some authors have identified an ever-increasing use of the concept of ‘theoretical model’, stemming from the so-called semantic view of scientific theories. However, it can be recognised that, in didactics of science, there are over-simplified transpositions of the idea of model. In this sense, contemporary philosophy of science is often blurred or distorted in the science education literature. In this paper, we address the discussion around some meta-theoretical concepts that are introduced into didactics of science due to their perceived educational value. We argue for the existence of a ‘semantic family’, and we characterise four different versions of semantic views existing within the family. In particular, we seek to contribute to establishing a model-based didactics of science mainly supported in this semantic family.
The structuralist reconstruction of metabolic biochemistry presented here is a more complete and revised version than the one presented in Donolo, Federico & Lorenzano (2006). This version, like the previous one, continues the reconstructive task initiated by César Lorenzano (2002), but advances further on those elements which remained pending reconstruction: applications subsequent to the paradigmatic one, these being “too diversified and numerous” (p. 210). In line with what was said before, the objective of this new reconstruction is to widen the theoretical network of biochemistry, in order to be able to capture the many successful applications (paradigmatic examples or exemplars) which appear in modern university textbooks. In order to accomplish this, major conceptual refinements are introduced, which will result in a modified and more complex version of the fundamental law implicit in the textbooks, while still conserving the previous basic idea. Because of all this we can say that the present article goes further into the task of reconstructing the theory of metabolic biochemistry.