Taking the 'Zak affair' as his point of departure, Mitchell notes the absence of a shared language or grammar in the music of the mid-twentieth century. He dwells on the particular problems that the exceptional history of the twentieth century has bequeathed to music.
The world is complex, but acknowledging its complexity requires an appreciation for the many roles context plays in shaping natural phenomena. In _Unsimple Truths_, Sandra Mitchell argues that the long-standing scientific and philosophical deference to reductive explanations founded on simple universal laws, linear causal models, and predict-and-act strategies fails to accommodate the kinds of knowledge that many contemporary sciences are providing about the world. She advocates, instead, for a new understanding that represents the rich, variegated, interdependent fabric of many levels and kinds of explanation that are integrated with one another to ground effective prediction and action. Mitchell draws from diverse fields including psychiatry, social insect biology, and studies of climate change to defend “integrative pluralism”—a theory of scientific practices that makes sense of how many natural and social sciences represent the multi-level, multi-component, dynamic structures they study. She explains how we must, in light of the now-acknowledged complexity and contingency of biological and social systems, revise how we conceptualize the world, how we investigate the world, and how we act in the world. Ultimately, _Unsimple Truths_ argues that the very idea of what should count as legitimate science itself should change.
This study investigated the performance of boys with psychopathic tendencies and comparison boys, aged 9 to 17 years, on two tasks believed to be sensitive to amygdala and orbitofrontal cortex functioning. Fifty-one boys were divided into two groups according to the Psychopathy Screening Device (PSD, P. J. Frick & R. D. Hare, in press) and presented with two tasks. The tasks were the gambling task (A. Bechara, A. R. Damasio, H. Damasio, & S. W. Anderson, 1994) and the Intradimensional/Extradimensional (ID/ED) shift task (R. Dias, T. W. Robbins, & A. C. Roberts, 1996). The boys with psychopathic tendencies showed impaired performance on the gambling task. However, there were no group differences on the ID/ED task either for response reversal or extradimensional set shifting. The implications of these results for models of psychopathy are discussed.
It may seem a bit perverse to argue that pluralism is a kind of dogmatism, since pluralists invariably define themselves as antidogmatists. Indeed, the world would seem to be so well supplied with overt dogmatists—religious fanatics, militant revolutionaries, political and domestic tyrants—that it will probably seem unfair to suggest that the proponents of liberal, tolerant, civilized open-mindedness are guilty of a covert dogmatism. My only excuse for engaging in this exercise is that it may help to shake up some rather firmly fixed ideas about dogmatism held by those who advocate some version of pluralism. Dogmatism, I want to argue, has had a very bad press, some of it deserved, some of it based in misunderstandings and ignorance. Much of that bad press stems, I will suggest, from the dominance of pluralism as an intellectual ideology since the Enlightenment. If “dogmatism” is a synonym for irrationality, inflexibility, and authoritarianism, the fault lies as much with pluralism as it does with any actual dogmatism. I’d like to begin, therefore, with a definition of dogmatism that comes, not from its pluralist foes, but from a historian of religion who treats it as a fairly neutral term, describing a complex and ancient feature of social institutions. This definition comes from E. Royston Pike’s Encyclopedia of Religion and Religions: DOGMA. A religious doctrine that is to be received on authority—whether of a Divine revelation, a Church Council, Holy Scripture, or a great and honoured religious teacher—and not, at least in the first instance, because it may be proved true in the light of reason. Almost always there is associated with dogma the element of Faith. The term comes from the Greek word for “to seem,” and it meant originally that which seems true to anyone, i.e. has been approved or decided beyond cavil.
In the New Testament it is applied to decisions of the Christian church in Jerusalem, enactments of the Jewish law, and imperial decrees, all of which were things to be accepted without argument. A little later it had come to mean simple statements of Christian belief and practice; and it was not until the 4th century, when the heretics were showing how far from simple the basic Christian beliefs really were, that it acquired the meaning of a theological interpretation of a religious fact. Then came the division of the Church into a Western and an Eastern branch, and never again was it possible to frame a dogma that might be universally held. The 39 Articles of the Church of England, the principles deduced from Calvin’s “Institutes” and John Wesley’s “Sermons,” and the items that compose the Mormon creed may all be classed as dogmas.1
W. J. T. Mitchell, editor of Critical Inquiry, is professor of English and a member of the Committee on Art and Design at the University of Chicago. His most recent book is Iconology: Image, Text, Ideology.
This fine collection of essays by a leading philosopher of science presents a defence of integrative pluralism as the best description for the complexity of scientific inquiry today. The tendency of some scientists to unify science by reducing all theories to a few fundamental laws of the most basic particles that populate our universe is ill-suited to the biological sciences, which study multi-component, multi-level, evolved complex systems. This integrative pluralism is the most efficient way to understand the different and complex processes - historical and interactive - that generate biological phenomena. This book will be of interest to students and professionals in the philosophy of science.
Biological knowledge does not fit the image of science that philosophers have developed. Many argue that biology has no laws. Here I criticize standard normative accounts of law and defend an alternative, pragmatic approach. I argue that a multidimensional conceptual framework should replace the standard dichotomous law/accident distinction in order to display important differences in the kinds of causal structure found in nature and the corresponding scientific representations of those structures. To this end I explore the dimensions of stability, strength, and degree of abstraction that characterize the variety of scientific knowledge claims found in biology and other sciences.
Psychopaths continue to be demonised by the media, and estimates suggest that a disturbing percentage of the population has psychopathic tendencies. This timely and controversial new book summarises what we already know about psychopathy and antisocial behavior and puts forward a new case for its cause - with far-reaching implications. Presents the scientific facts of psychopathy and antisocial behavior. Addresses key questions, such as: What is psychopathy? Are there psychopaths amongst us? What is wrong with psychopaths? Is psychopathy due to nature or nurture? And can we treat psychopaths? Reveals the authors' ground-breaking research into whether an underlying abnormality in brain development leaves psychopaths with an inability to feel emotion or fear. The resulting theory could lead to early diagnosis and revolutionize the way society, the media, and the state both view and contend with the psychopaths in our midst.
Philosophical accounts of emergence have been explicated in terms of logical relationships between statements (derivation) or static properties (function and realization). Jaegwon Kim is a modern proponent. A property is emergent if it is not explainable by (or reducible to) the properties of lower level components. This approach, I will argue, is unable to make sense of the kinds of emergence that are widespread in scientific explanations of complex systems. The standard philosophical notion of emergence posits the wrong dichotomies, confuses compositional physicalism with explanatory physicalism, and is unable to represent the type of dynamic processes (self-organizing feedback) that both generate emergent properties and express downward causation.
The controversy regarding the unit of selection is fundamentally a dispute about the correct causal structure of the process of evolution by natural selection and its ontological commitments. By characterizing the process as consisting of two essential steps--interaction and transmission--a singular answer to the unit question becomes ambiguous. With such an account on hand, two recent defenses of competing units of selection are considered. Richard Dawkins maintains that the gene is the appropriate unit of selection and Robert Brandon, in response, argues that the individual organism is better suited to the role. This paper argues that by making explicit the underlying questions that each of these views addresses, the apparent conflict can be resolved. Furthermore, such a resolution allows for a more complete and realistic understanding of the process of evolution by natural selection.
Beatty, Brandon, and Sober agree that biological generalizations, when contingent, do not qualify as laws. Their conclusion follows from a normative definition of law inherited from the Logical Empiricists. I suggest two additional approaches: paradigmatic and pragmatic. Only the pragmatic represents varying kinds and degrees of contingency and exposes the multiple relationships found among scientific generalizations. It emphasizes the function of laws in grounding expectation and promotes the evaluation of generalizations along continua of ontological and representational parameters. Stability of conditions and strength of determination in nature govern projectibility. Accuracy, ontological level, simplicity, and manageability provide additional measures of usefulness.
The 'fact' of pluralism in science is no surprise. Yet, if science is representing and explaining the structure of the one world, why is there such a diversity of representations and explanations in some domains? In this paper I consider several philosophical accounts of scientific pluralism that explain the persistence of both competitive and compatible alternatives. Paul Sherman's 'Levels of Analysis' account suggests that in biology competition between explanations can be partitioned by the type of question being investigated. I argue that this account does not locate competition and compatibility correctly. I then defend an integrative model for understanding pluralism. This view is based on taking seriously both the complexity and contingency of biological organization and the idealized character of biological models. On this view, explanation becomes, among other things, the location for the integration of diverse models. I explicate my argument by an analysis of explanations of division of labor in social insects.
Multilevel research strategies characterize contemporary molecular inquiry into biological systems. We outline conceptual, methodological, and explanatory dimensions of these multilevel strategies in microbial ecology, systems biology, protein research, and developmental biology. This review of emerging lines of inquiry in these fields suggests that multilevel research in molecular life sciences has significant implications for philosophical understandings of explanation, modeling, and representation.
In this article I consider the challenges for exporting causal knowledge raised by complex biological systems. In particular, James Woodward’s interventionist approach to causality identified three types of stability in causal explanation: invariance, modularity, and insensitivity. I consider an example of robust degeneracy in genetic regulatory networks and knockout experimental practice to pose methodological and conceptual questions for our understanding of causal explanation in biology.
It has been claimed that ceteris paribus laws, rather than strict laws, are the proper aim of the special sciences. This is so because the causal regularities found in these domains are exception-ridden, being contingent on the presence of the appropriate conditions and the absence of interfering factors. I argue that the ceteris paribus strategy obscures rather than illuminates the important similarities and differences between representations of causal regularities in the exact and inexact sciences. In particular, a detailed account of the types and degrees of contingency found in the domain of biology permits a more adequate understanding of the relations among the sciences.
The question, “Will science remain human?” expresses a worry that deep learning algorithms will replace scientists in making crucial judgments of classification and inference and that something crucial will be lost if that happens. Ever since the introduction of telescopes and microscopes humans have relied on technologies to “extend” beyond human sensory perception in acquiring scientific knowledge. In this paper I explore whether the ways in which new learning technologies “extend” beyond human cognitive aspects of science can be treated instrumentally. I will consider the norms for determining the reliability of a detection instrument, nuclear magnetic resonance spectroscopy, in predicting models of protein atomic structure. Can the same norms that apply in that case be used to judge the reliability of Artificial Intelligence deep learning algorithms?
It has long been held that the structure of a protein is determined solely by the interactions of the atoms in the sequence of amino acids of which it is composed, and thus the stable, biologically functional conformation should be predictable by ab initio or de novo methods. However, except for small proteins, ab initio predictions have not been successful. We explain why this is the case and argue that the relationship among the different methods, models, and representations of protein structure is one of integrative pluralism. Our defence appeals to specific features of the complexity of the functional protein structure and to the partial character of representation in general. We present examples of integrative strategies in protein science. 1. Introduction; 2. Partiality of Representation; 3. Protein Functional Complexity; 4. Modelling Protein Structure; 4.1 Integrating ab initio and experimental models; 4.2 Integrating multiple experimental models; 5. Conclusion.
This study investigates the ability of individuals with psychopathy to perform passive avoidance learning and whether this ability is modulated by level of reinforcement/punishment. Nineteen psychopathic and 21 comparison individuals, as defined by the Hare Psychopathy Checklist Revised (Hare, 1991), were given a passive avoidance task with a graded reinforcement schedule. Response to each rewarding number gained a point reward specific to that number (i.e., 1, 700, 1400 or 2000 points). Response to each punishing number lost a point punishment specific to that number (i.e., the loss of 1, 700, 1400 or 2000 points). In line with predictions, individuals with psychopathy made more passive avoidance errors than the comparison individuals. In addition, while the performance of both groups was modulated by level of reward, only the performance of the comparison population was modulated by level of punishment. The results are interpreted with reference to a computational account of the emotional learning impairment in individuals with psychopathy.
This study investigates the performance of psychopathic individuals on tasks believed to be sensitive to dorsolateral prefrontal and orbitofrontal cortex (OFC) functioning. Psychopathic and non-psychopathic individuals, as defined by the Hare psychopathy checklist revised (PCL-R) [Hare, The Hare psychopathy checklist revised, Toronto, Ontario: Multi-Health Systems, 1991] completed a gambling task [Cognition 50 (1994) 7] and the intradimensional/extradimensional (ID/ED) shift task [Nature 380 (1996) 69]. On the gambling task, psychopathic participants showed a global tendency to choose disadvantageously. Specifically, they showed an impaired ability to show learning over the course of the task. On the ID/ED task, the performance of psychopathic individuals was not significantly different from incarcerated controls on attentional set-shifting, but significant impairments were found on response reversal. These results are interpreted with reference to an OFC and amygdala dysfunction explanation of psychopathy.
In this paper I discuss recent debates concerning etiological theories of functions. I defend an etiological theory against two criticisms, namely the problem of accounting for malfunction and the problem of structural doubles. I then consider the arguments provided by Bigelow and Pargetter (1987) for a more forward-looking account of functions as propensities or dispositions. I argue that their approach fails to address the explanatory problematic for which etiological theories were developed.