one takes to be the most salient, any pair could be judged more similar to each other than to the third. Goodman uses this second problem to show that there can be no context-free similarity metric, either in the trivial case or in a scientifically ...
Philosophers of science increasingly recognize the importance of idealization: the intentional introduction of distortion into scientific theories. Yet this recognition has not yielded consensus about the nature of idealization. The literature of the past thirty years contains disparate characterizations and justifications, but little evidence of convergence towards a common position.
Many standard philosophical accounts of scientific practice fail to distinguish between modeling and other types of theory construction. This failure is unfortunate because there are important contrasts among the goals, procedures, and representations employed by modelers and other kinds of theorists. We can see some of these differences intuitively when we reflect on the methods of theorists such as Vito Volterra and Linus Pauling on the one hand, and Charles Darwin and Dimitri Mendeleev on the other. Much of Volterra's and Pauling's work involved modeling; much of Darwin's and Mendeleev's did not. In order to capture this distinction, I consider two examples of theory construction in detail: Volterra's treatment of post-WWI fishery dynamics and Mendeleev's construction of the periodic system. I argue that modeling can be distinguished from other forms of theorizing by the procedures modelers use to represent and to study real-world phenomena: indirect representation and analysis. This differentiation between modelers and non-modelers is one component of the larger project of understanding the practice of modeling, its distinctive features, and the strategies of abstraction and idealization it employs.
Because of its complexity, contemporary scientific research is almost always tackled by groups of scientists, each of which works in a different part of a given research domain. We believe that understanding scientific progress thus requires understanding this division of cognitive labor. To this end, we present a novel agent-based model of scientific research in which scientists divide their labor to explore an unknown epistemic landscape. Scientists aim to climb uphill in this landscape, where elevation represents the significance of the results discovered by employing a research approach. We consider three different search strategies scientists can adopt for exploring the landscape. In the first, scientists work alone and do not let the discoveries of the community as a whole influence their actions. This is compared with two social research strategies, which we call the follower and maverick strategies. Followers are biased towards what others have already discovered, and we find that pure populations of these scientists do less well than scientists acting independently. However, pure populations of mavericks, who try to avoid research approaches that have already been taken, vastly outperform both of the other strategies. Finally, we show that in mixed populations, mavericks stimulate followers to greater levels of epistemic production, making polymorphic populations of mavericks and followers ideal in many research domains.
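To make the setup concrete, here is a minimal sketch of this kind of epistemic-landscape simulation. It is not the authors' implementation: the grid size, the hill-climbing rule, and names like `make_landscape` and `step` are all illustrative assumptions.

```python
import random

# Minimal sketch of an epistemic-landscape search. Not the authors'
# actual model; grid size, strategies, and update rule are assumptions.

SIZE = 100  # the landscape: a SIZE x SIZE grid of research approaches

def make_landscape():
    """Significance of each research approach (random, for illustration)."""
    return [[random.random() for _ in range(SIZE)] for _ in range(SIZE)]

def neighbors(x, y):
    """Adjacent research approaches, on a wrapping grid."""
    return [((x + dx) % SIZE, (y + dy) % SIZE)
            for dx in (-1, 0, 1) for dy in (-1, 0, 1) if (dx, dy) != (0, 0)]

def step(pos, landscape, visited, strategy):
    """Move one scientist one step according to its search strategy."""
    options = neighbors(*pos)
    if strategy == "follower":
        # biased toward approaches the community has already explored
        options = [p for p in options if p in visited] or options
    elif strategy == "maverick":
        # avoids approaches that have already been taken
        options = [p for p in options if p not in visited] or options
    # any other strategy ignores the community and simply climbs uphill
    best = max(options + [pos], key=lambda p: landscape[p[0]][p[1]])
    visited.add(best)
    return best

# One mixed population of mavericks and followers, run for 200 rounds:
landscape, visited = make_landscape(), set()
agents = [((random.randrange(SIZE), random.randrange(SIZE)), s)
          for s in ["maverick"] * 10 + ["follower"] * 10]
for _ in range(200):
    agents = [(step(p, landscape, visited, s), s) for p, s in agents]
```

Comparing the peak significance reached by pure and mixed populations under the three strategies would then be a one-line summary statistic over `agents`.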
Modelers often rely on robustness analysis, the search for predictions common to several independent models. Robustness analysis has been characterized and championed by Richard Levins and William Wimsatt, who see it as central to modern theoretical practice. The practice has also been severely criticized by Steven Orzack and Elliott Sober, who claim that it is a nonempirical form of confirmation, effective only under unusual circumstances. This paper addresses Orzack and Sober's criticisms by giving a new account of robustness analysis and showing how the practice can identify robust theorems. Once the structure of robust theorems is clearly articulated, it can be shown that such theorems have a degree of confirmation, despite the lack of direct empirical evidence for their truth.
Despite their best efforts, scientists may be unable to construct models that simultaneously exemplify every theoretical virtue. One explanation for this is the existence of tradeoffs: relationships of attenuation that constrain the extent to which models can have such desirable qualities. In this paper, we characterize three types of tradeoffs theorists may confront. These characterizations are then used to examine the relationships between parameter precision and two types of generality. We show that several of these relationships exhibit tradeoffs and discuss what consequences those tradeoffs have for theoretical practice.
Theorizing in ecology and evolution often proceeds via the construction of multiple idealized models. To determine whether a theoretical result actually depends on core features of the models and is not an artifact of simplifying assumptions, theorists have developed the technique of robustness analysis, the examination of multiple models looking for common predictions. A striking example of robustness analysis in ecology is the discovery of the Volterra Principle, which describes the effect of general biocides in predator-prey systems. This paper details the discovery of the Volterra Principle and the demonstration of its robustness. It considers the classical ecology literature on robustness and introduces two individual-based models of predation, which are used to further analyze the Volterra Principle. The paper also introduces a distinction between parameter robustness, structural robustness, and representational robustness, and demonstrates that the Volterra Principle exhibits all three kinds of robustness.
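For readers who want the mechanics, the principle falls out of the classical Lotka-Volterra equations. The following is the standard textbook derivation; the notation is mine, not necessarily Volterra's or the paper's.

```latex
% Classical Lotka-Volterra dynamics (V = prey, P = predators):
%   r = prey growth rate, a = predation rate,
%   b = conversion efficiency, m = predator mortality
\[
  \frac{dV}{dt} = rV - aVP, \qquad
  \frac{dP}{dt} = abVP - mP
\]
% Interior equilibrium (set both derivatives to zero):
\[
  V^{*} = \frac{m}{ab}, \qquad P^{*} = \frac{r}{a}
\]
% An indiscriminate biocide lowers the prey growth rate and raises
% predator mortality (r -> r - d_1, m -> m + d_2), so
\[
  V^{*} \;\to\; \frac{m + d_2}{ab} > \frac{m}{ab}, \qquad
  P^{*} \;\to\; \frac{r - d_1}{a} < \frac{r}{a}
\]
% Average prey abundance rises and average predator abundance falls:
% this is the Volterra Principle.
```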
This paper is an interpretation and defense of Richard Levins’ “The Strategy of Model Building in Population Biology,” which has been extremely influential among biologists since its publication 40 years ago. In this article, Levins confronted some of the deepest philosophical issues surrounding modeling and theory construction. By way of interpretation, I discuss each of Levins’ major philosophical themes: the problem of complexity, the brute-force approach, the existence and consequence of tradeoffs, and robustness analysis. I argue that Levins’ article is concerned, at its core, with justifying the use of multiple, idealized models in population biology.
Scientific research is almost always conducted by communities of scientists of varying size and complexity. Such communities are effective, in part, because they divide their cognitive labor: not every scientist works on the same project. Philip Kitcher and Michael Strevens have pioneered efforts to understand this division of cognitive labor by proposing models of how scientists make decisions about which project to work on. For such models to be useful, they must be simple enough for us to understand their dynamics, but faithful enough to reality that we can use them to analyze real scientific communities. To satisfy the first requirement, we must employ idealizations to simplify the model. The second requirement demands that these idealizations not be so extreme that we lose the ability to describe real-world phenomena. This paper investigates the status of the assumptions that Kitcher and Strevens make in their models, by first inquiring whether they are reasonable representations of reality, and then by checking the models' robustness against weakenings of these assumptions. To do this, we first argue against the reality of the assumptions, and then develop a series of agent-based simulations to systematically test their effects on model outcomes. We find that the models are not robust against weakenings of these idealizations. In fact, we find that under certain conditions, this can lead to the model predicting outcomes that are qualitatively opposite of the original model outcomes.
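For orientation, the decision rules at issue have roughly the following shape. This is a simplified reconstruction of a Kitcher-style reward scheme, not the exact model of either author.

```latex
% p_k(n) = probability that research program k succeeds with n workers.
% Under a uniform split of credit among a program's workers, a
% scientist weighing program k with n_k current workers expects roughly
\[
  u(k) \;=\; \frac{p_k(n_k + 1)}{\,n_k + 1\,}
\]
% and joins the program that maximizes u(k). The idealizations probed
% in the paper concern the assumptions behind such rules, for instance
% that every scientist knows each n_k and shares the same estimates
% of each p_k (my gloss on the targets of the robustness tests).
```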
This article reviews the recent literature on idealization, specifically idealization in the course of scientific modeling. We argue that idealization is not a unified concept and that there are three different types of idealization: Galilean, minimalist, and multiple models, each with its own justification. We explore the extent to which idealization is a permanent feature of scientific representation and discuss its implications for debates about scientific realism.
Although most philosophical accounts of model/world relations focus on structural mappings such as isomorphism, similarity has long been discussed as an alternative. Despite its attractions, proponents of the similarity view have not provided detailed accounts of what it means for a model to be similar to a real-world target system. This article gives the outlines of such an account, drawing on the work of Amos Tversky.
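For reference, the Tversky machinery such an account draws on is the contrast model, which scores similarity by weighted feature-matching. The application to models and targets sketched in the comments is my illustration, not a quotation from the article.

```latex
% Tversky's contrast model: A and B are the feature sets of the two
% items being compared, f is a salience measure (often just set
% cardinality), and theta, alpha, beta are nonnegative weights.
\[
  S(a,b) \;=\; \theta f(A \cap B) \;-\; \alpha f(A \setminus B)
               \;-\; \beta f(B \setminus A)
\]
% Read as a model-world relation: let A be the features of the model
% and B the features of the target. Similarity then increases with
% shared features and decreases with features either one has alone.
```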
Current scientific research almost always requires collaboration among several (if not several hundred) specialized researchers. When scientists co-author a journal article, who deserves credit for discoveries or blame for errors? How should scientific institutions promote fruitful collaborations among scientists? In this book, leading philosophers of science address these critical questions.
Why is evolutionary theory controversial among members of the American public? We propose a novel explanation: allegiance to different criteria for belief. In one interview study, two online surveys, and one nationally representative phone poll, we found that evolutionists and creationists take different justifications for belief as legitimate. Those who accept evolution emphasize empirical evidence and scientific consensus. Creationists emphasize not only the Bible and religious authority, but also knowledge of the heart. These criteria for belief remain predictive of views about evolution even when taking into account other related factors like religion, political affiliation, and education. Each view is supported by its own internally specified criteria for what constitutes a justified belief. Changing minds may thus require changing epistemic norms.
Roald Hoffmann and other theorists claim that we ought to use highly idealized chemical models (“qualitative models”) in order to increase our understanding of chemical phenomena, even though other models are available which make more highly accurate predictions. I assess this norm by examining one of the tradeoffs faced by model builders and model users—the tradeoff between precision and generality. After arguing that this tradeoff obtains in many cases, I discuss how the existence of this tradeoff can help us defend Hoffmann's norm for modelling.
In defending semantic externalism, philosophers of language have often assumed that there is a straightforward connection between scientific kinds and the natural kinds recognized by ordinary language users. For example, the claim that water is H2O assumes that the ordinary language kind water corresponds to a chemical kind, which contains all the molecules with molecular formula H2O as its members. This assumption about the coordination between ordinary language kinds and scientific kinds is important for the externalist program, because it is what allows us to discover empirically the extensions of ordinary language kind terms.
Chemistry is the study of the structure and transformation of matter. When Aristotle founded the field in the 4th century BCE, his conceptual grasp of the nature of matter was tailored to accommodate a relatively simple range of observable phenomena. In the 21st century, chemistry has become the largest scientific discipline, producing over half a million publications a year ranging from direct empirical investigations to substantial theoretical work. However, the specialized interest in the conceptual issues arising in chemistry, hereafter Philosophy of Chemistry, is a relatively recent addition to philosophy of science. Philosophy of chemistry has two major parts. In the first, conceptual issues arising within chemistry are carefully articulated and analyzed. Such questions, which are internal to chemistry, include the nature of substance, atomism, the chemical bond, and synthesis. In the second, traditional topics in philosophy of science such as realism, reduction, explanation, confirmation, and modeling are taken up within the context of chemistry.
The covalent bond, a difficult concept to define precisely, plays a central role in chemical predictions, interventions, and explanations. I investigate the structural conception of the covalent bond, which says that bonding is a directional, submolecular region of electron density, located between individual atomic centers and responsible for holding the atoms together. Several approaches to constructing molecular models are considered in order to determine which features of the structural conception of bonding, if any, are robust across these models. Key components of the structural conception are absent in all but the simplest quantum mechanical models of molecular structure, seriously challenging the conception’s viability.
Gravitational interactions allowed astronomers to conclude that dark matter rings all luminous galaxies in gigantic halos, but this only accounts for a fraction of the total mass of dark matter believed to exist. Where is the rest? We hypothesize that some of it resides in dark galaxies, pure dark matter halos that either never possessed or have totally lost their baryonic matter. This article explores methodological challenges that arise because of the nature of observation in astrophysics and examines how the blend of observation, simulation, and theory we call the Observing the Invisible approach might make detecting such dark objects possible.
Simulation and Similarity: Using Models to Understand the World is an account of modeling in contemporary science. Modeling is a form of surrogate reasoning where target systems in the natural world are studied using models, which are similar to these targets. My book develops an account of the nature of models, the practice of modeling, and the similarity relation that holds between models and their targets. I also analyze the conceptual tools that allow theorists to identify the trustworthy aspects of models. Taken as a whole, I try to account for the ways that modeling is actually practiced by theorists, while abstracting sufficiently to understand the similarities and differences among examples of concrete, mathematical, and computational modeling. I am grateful to Wendy Parker, Jay Odenbaugh, and Bill Wimsatt for their careful and interesting reading of my book, as well as their constructive criticisms. Although I naturally disagree with some of their critiques, I have learned much ...
Evolutionary biology – or, more precisely, two (purported) applications of Darwin's theory of evolution by natural selection, namely, evolutionary psychology and what has been called human behavioral biology – is on the cusp of becoming the new rage among legal scholars looking for interdisciplinary insights into the law. We argue that as the actual science stands today, evolutionary biology offers nothing to help with questions about legal regulation of behavior. Only systematic misrepresentations or lack of understanding of the relevant biology, together with far-reaching analytical and philosophical confusions, have led anyone to think otherwise. Evolutionary accounts are etiological accounts of how a trait evolved. We argue that an account of causal etiology could be relevant to law if (1) the account of causal etiology is scientifically well-confirmed, and (2) there is an explanation of how the well-confirmed etiology bears on questions of development (what we call the Environmental Gap Objection). We then show that the accounts of causal etiology that might be relevant are not remotely well-confirmed by scientific standards. We argue, in particular, that (a) evolutionary psychology is not entitled to assume selectionist accounts of human behaviors, (b) the assumptions necessary for the selectionist accounts to be true are not warranted by standard criteria for theory choice, and (c) only confusions about levels of explanation of human behavior create the appearance that understanding the biology of behavior is important. We also note that no response to the Environmental Gap Objection has been proffered. In the concluding section of the article, we turn directly to the work of Owen Jones, a leading proponent of the relevance of evolutionary biology to law, and show that he does not come to terms with any of the fundamental problems identified in this article.
Contemporary literature in philosophy of science has begun to emphasize the practice of modeling, which differs in important respects from other forms of representation and analysis central to standard philosophical accounts. This literature has stressed the constructed nature of models, their autonomy, and the utility of their high degrees of idealization. What this new literature about modeling lacks, however, is a comprehensive account of the models that figure into the practice of modeling. This paper offers a new account of both concrete and mathematical models, with special emphasis on the intentions of theorists, which are necessary for evaluating the model-world relationship during the practice of modeling. Although mathematical models form the basis of most contemporary modeling, my discussion begins with more traditional, concrete models such as the San Francisco Bay model.
Much of biological and economic theorizing takes place by modeling, the indirect study of real-world phenomena by the construction and examination of models. Books and articles about biological and economic theory are often books and articles about models, many of which are highly idealized and chosen for their explanatory power and analytical convenience rather than for their fit with known data sets. Philosophers of science have recognized these facts and have developed literatures about the nature of models, modeling, idealization, as well as testing of models and explanation by models, for both biology and economics. The impetus for this special issue came from our recognition that there is remarkably little overlap between the “modeling in biology” and “modeling in economics” literatures, despite many of the same themes appearing in these literatures.
Proponents of individual-based modeling in ecology claim that their models explain the emergence of population-level behavior. This article argues that individual-based models have not, as yet, provided such explanations. Instead, individual-based models can and do demonstrate and explain the emergence of population-level behaviors from individual behaviors and interactions.
Modeling herding behavior and its risks (2013). Journal of Economic Methodology 20 (Methodology, Systemic Risk, and the Economics Profession): 6–18. doi: 10.1080/1350178X.2013.774843.
Nobel laureate Roald Hoffmann's contributions to chemistry are well known. Less well known, however, is that over a career that spans nearly fifty years, Hoffmann has thought and written extensively about a wide variety of other topics, such as chemistry's relationship to philosophy, literature, and the arts, including the nature of chemical reasoning, the role of symbolism and writing in science, and the relationship between art and craft and science. In Roald Hoffmann on the Philosophy, Art, and Science of Chemistry, Jeffrey Kovac and Michael Weisberg bring together twenty-eight of Hoffmann's most important essays. Gathered here are Hoffmann's most philosophically significant and interesting essays and lectures, many of which are not widely accessible. In essays such as "Why Buy That Theory," "Nearly Circular Reasoning," "How Should Chemists Think," "The Metaphor, Unchained," "Art in Science," and "Molecular Beauty," we find the mature reflections of one of America's leading scientists. Organized under the general headings of Chemical Reasoning and Explanation, Writing and Communicating, Art and Science, Education, and Ethics, these stimulating essays provide invaluable insight into the teaching and practice of science.
This article examines a series of Schelling-like models of residential segregation, in which agents prefer to be in the minority. We demonstrate that as long as agents care about the characteristics of their wider community, they tend to end up in a segregated state. We then investigate the process that causes this and conclude that the result hinges on the similarity of informational states among agents of the same type. This is quite different from Schelling-like behavior and suggests that segregation is an instance of macrobehavior that can arise from a wide variety of micromotives.
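Below is a minimal sketch of the kind of model examined here, assuming a wrapping grid, a "wider community" radius, and random relocation of unhappy agents. Names and parameters are illustrative, not the authors' code.

```python
import random

# Schelling-style grid model in which agents prefer to be in the
# *minority* of their wider community. RADIUS and EMPTY are assumptions.

SIZE, EMPTY = 30, 0.1
RADIUS = 3  # agents evaluate a wider neighborhood, not just adjacent cells

def make_grid():
    """Random initial placement of two agent types; some cells left empty."""
    grid = {}
    for x in range(SIZE):
        for y in range(SIZE):
            if random.random() > EMPTY:
                grid[(x, y)] = random.choice((1, 2))
    return grid

def unhappy(grid, pos):
    """An agent is unhappy when its own type is the local majority."""
    x, y = pos
    same = other = 0
    for dx in range(-RADIUS, RADIUS + 1):
        for dy in range(-RADIUS, RADIUS + 1):
            t = grid.get(((x + dx) % SIZE, (y + dy) % SIZE))
            if t is None or (dx, dy) == (0, 0):
                continue
            if t == grid[pos]:
                same += 1
            else:
                other += 1
    return same > other  # prefers to be in the minority

def step(grid):
    """Relocate each unhappy agent to a randomly chosen empty cell."""
    movers = [p for p in grid if unhappy(grid, p)]
    empties = [(x, y) for x in range(SIZE) for y in range(SIZE)
               if (x, y) not in grid]
    random.shuffle(movers)
    for p in movers:
        if not empties:
            break
        dest = empties.pop(random.randrange(len(empties)))
        grid[dest] = grid.pop(p)
        empties.append(p)

grid = make_grid()
for _ in range(100):
    step(grid)
```

Tracking the fraction of same-type neighbors over the run is one simple way to watch segregation emerge despite every agent preferring the minority.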
Aristotle’s On generation and corruption raises a vital question: how is mixture, or what we would now call chemical combination, possible? It also offers an outline of a solution to the problem and a set of criteria that a successful solution must meet. Understanding Aristotle’s solution and developing a viable peripatetic theory of chemical combination has been a source of controversy over the last two millennia. We describe seven criteria a peripatetic theory of mixture must satisfy: uniformity, recoverability, potentiality, equilibrium, alteration, incompleteness, and the ability to distinguish mixture from generation, corruption, juxtaposition, augmentation, and alteration. After surveying the theories of Philoponus (d. 574), Avicenna (d. 1037), Averroes (d. 1198), and John M. Cooper (fl. circa 2000), we argue for the merits of Richard Rufus of Cornwall’s theory. Rufus (fl. 1231–1256) was a little-known scholastic philosopher who became a Franciscan theologian in 1238, after teaching Aristotelian natural philosophy as a secular master in Paris. Lecturing on Aristotle’s De generatione et corruptione, around the year 1235, he offered his students a solution to the problem of mixture that we believe satisfies Aristotle’s seven criteria.
Prior work has found that Americans’ views on evolution are significantly and positively related to their understanding of this theory. However, whether this relationship is cross-culturally robust is unknown. This article extends earlier work by measuring and comparing the acceptance and understanding of evolution among highly educated individuals in China and the United States. We find a significantly higher evolution acceptance level in the Chinese sample than in the US sample, but no significant difference in their average levels of evolution knowledge. Our analysis also shows that accepting evolutionary theory is related to understanding in both the US and the Chinese samples. These results provide evidence for the robustness of the relationship between understanding and acceptance of evolution across different cultural contexts. To our knowledge, this is the first attempt to comprehensively test understanding of evolutionary theory within a Chinese sample and to compare these results with the US sample.
A substantial proportion of Chinese nationals seem to accept evolution, and the country is sometimes held up to show that the sorry state of evolution acceptance in the United States is not inevitable. Attempts to improve evolution acceptance generally focus on improving communication, curricular reform, and even identifying cognitive mechanisms that bias people against evolution. What is it that the Chinese scientific community did so well, and can it be generalized? This paper argues that evolution acceptance in China has a very specific history, one that other countries are very unlikely to emulate. We show that the interactions among science, education, mass media, social and political movements, and ideological arguments about evolution greatly influenced the Chinese public's understanding and acceptance of evolution. We find that it was not just formal education, but many more ideologically motivated methods of evolution exposure that contributed to the high rate of acceptance. But since the purpose of evolution dissemination has moved beyond merely teaching biology, the Chinese public persists with substantial misunderstandings of the theory. Thus, bottom line percentage of acceptance figures can be misleading; the details and the history really matter.
Many people feel the pull of both creationism and evolution as explanations for the origin of species, despite the direct contradiction. Some respond by endorsing theistic evolution, integrating the scientific and religious explanations by positing that God initiated or guided the process of evolution. Others, however, simultaneously endorse both evolution and creationism despite the contradiction. Here, we illustrate this puzzling phenomenon with interviews with a diverse sample. This qualitative data reveals several approaches to coping with simultaneous inconsistent explanations. For example, some people seem to manage this contradiction by separating out ideological claims, which prioritize identity expression, from fact claims, which prioritize truth. Fitting with this interpretation, ambivalent individuals tended to call explanations “beliefs”, avoid mention of truth or falsity, and ground one or both beliefs in identity and personal history. We conclude with a brief discussion of the affordances of this distinction.
This article is an overview of some of the contemporary debates in philosophy of chemistry. We discuss the nature of chemical substances, the individuation of chemical kinds, the relationship between chemistry and physics, and the nature of the chemical bond.