The rosy dawn of my title refers to that optimistic time when the logical concept of a natural kind originated in Victorian England. The scholastic twilight refers to the present state of affairs. I devote more space to dawn than twilight, because one basic problem was there from the start, and by now those origins have been forgotten. Philosophers have learned many things about classification from the tradition of natural kinds. But now it is in disarray and is unlikely to be put back together again. My argument is founded less on objections to the numerous theories now in circulation than on the sheer proliferation of incompatible views. There no longer exists what Bertrand Russell called ‘the doctrine of natural kinds’—one doctrine. Instead we have a slew of distinct analyses directed at unrelated projects.
Claims about conceptual matters are often represented as, or inferred from, claims about the meaning, reference, or mastery of words. But sometimes this has led to treating conceptual analysis as though it were nothing but linguistic analysis. We canvass the most promising justifications for moving from linguistic premises to substantive conclusions. We show that these justifications fail and argue against current practice (in metaethics and elsewhere), which confuses an investigation of a word’s meaning, reference, or competence conditions with an analysis of some concept or property associated with that word.
How is a person's freedom related to his or her preferences? Liberal theorists of negative freedom have generally taken the view that the desire of a person to do or not do something is irrelevant to the question of whether he is free to do it. Supporters of the “pure negative” conception of freedom have advocated this view in its starkest form: they maintain that a person is unfree to Φ if and only if he is prevented from Φ-ing by the conduct or dispositions of some other person. This definition of freedom is value-neutral in the sense that no reference is made to preferences over options or indeed to any other indicators of the values of options, either in the characterization of “Φ-ing” itself or in the characterization of the way in which Φ-ing can be constrained.
This paper shows how critical realism can be used to integrate empirical data and philosophical analysis within ‘empirical bioethics’. The term empirical bioethics, whilst appearing oxymoronic, simply refers to an interdisciplinary approach to the resolution of practical ethical issues within the biological and life sciences, integrating social-scientific empirical data with philosophical analysis. It seeks to achieve a balanced form of ethical deliberation that is both logically rigorous and sensitive to context, to generate normative conclusions that are practically applicable to the problem, challenge, or dilemma at hand. Since it incorporates both philosophical and social scientific components, empirical bioethics is a field that is consistent with the use of critical realism as a research methodology. The integration of philosophical and social scientific approaches to ethics has been beset with difficulties, not least because of the irreducibly normative, rather than descriptive, nature of ethical analysis and the contested relation between fact and value. However, given that facts about states of affairs inform potential courses of action and their consequences, there is a need to overcome these difficulties and successfully integrate data with theory. Previous approaches have been formulated to overcome obstacles in combining philosophical and social scientific perspectives in bioethical analysis; however, each has shortcomings. As a mature interdisciplinary approach, critical realism is well suited to empirical bioethics, although it has hitherto not been widely used. Here I show how it can be applied to this kind of research and explain how it represents an improvement on previous approaches.
Many who doubt the analytic status of the claim that a spinster is a woman of marriageable age who has not yet married nonetheless agree with it. They are also likely to agree that this claim has the look of a definition. After all, it has the following four features: 1) Extensional adequacy: It cites a particular condition that is met by all and only things of the kind being defined (the spinsters, in this case).
Philosophers frequently defend definitions by appealing to intuitions and contemporary folk classificatory norms. I raise methodological concerns that undermine some of these defenses. Focusing on Andrew Kania's recent definition of music, I argue that the way in which it has been developed leads to problems, and I show that a number of other definitions of interest to philosophers of art run into similar problems.
Epidemiology is a core science of public health, focusing on research related to the distribution and determinants of both positive and adverse health states and events and on application of the knowledge gained to improve public health. The American College of Epidemiology (ACE) is a professional organization devoted to the professional practice of epidemiology. As part of that commitment, and in response to calls for more explicit attention to the core values and duties of epidemiologists in light of emerging issues and increased scrutiny of epidemiology, the College developed, adopted, and published a set of Ethics Guidelines. The ACE ethics guidelines are structured in four parts: (1) a brief statement of the core values and duties of epidemiologists, coupled with the virtues important to professional practice; (2) concise statements of key duties and obligations; (3) an exposition of the duties and obligations with further applications; and (4) a brief summary and conclusion. The Guidelines have been published on the ACE website and in the official College journal, Annals of Epidemiology. The guidelines contain (and maintain) core elements that define the discipline of epidemiology and its fundamental duties, but they are also intended to be dynamic and evolving, responsive to a changing professional and social environment.
Early intervention (EI) aims to identify children or families at risk of poor health and to take preventative measures at an early stage, when intervention is more likely to succeed. EI is concerned with the just distribution of “life chances,” so that all children are given a fair opportunity to realize their potential and lead a good life; EI policy design, therefore, invokes ethical questions about the balance of responsibilities between the state, society, and individuals in addressing inequalities. We analyze a corpus of EI policy guidance to investigate explicit and implicit ethical arguments about who should be held morally responsible for safeguarding child health and well-being. We examine the implications of these claims and explore what it would mean to put the proposed policies into practice. We conclude with some remarks about the useful role that philosophical analysis can play in EI policy development.
This article addresses gender differences in laughter and smiling from an evolutionary perspective. Laughter and smiling can be responses to successful display behavior or signals of affiliation amongst conversational partners—differing social and evolutionary agendas mean there are different motivations when interpreting these signals. Two experiments assess perceptions of genuine and simulated male and female laughter and amusement social signals. Results show that male simulation can always be distinguished. Female simulation is more complicated, as males seem to distinguish cues of simulation yet judge simulated signals to be genuine. Females judge other females’ genuine signals to have higher levels of simulation. Results highlight the importance of laughter and smiling in human interactions, the use of dynamic stimuli, and the value of using multiple methodologies to assess perception.
How is the burden of proof to be distributed among individuals who are involved in resolving a particular issue? Under what conditions should the burden of proof be distributed unevenly? We distinguish attitudinal from dialectical burdens and argue that these questions should be answered differently, depending on which is in play. One has an attitudinal burden with respect to some proposition when one is required to possess sufficient evidence for it. One has a dialectical burden with respect to some proposition when one is required to provide supporting arguments for it as part of a deliberative process. We show that the attitudinal burden with respect to certain propositions is unevenly distributed in some deliberative contexts, but in all of these contexts, establishing the degree of support for the proposition is merely a means to some other deliberative end, such as action guidance, or persuasion. By contrast, uneven distributions of the dialectical burden regularly further the aims of deliberation, even in contexts where the quest for truth is the sole deliberative aim, rather than merely a means to some different deliberative end. We argue that our distinction between these two burdens resolves puzzles about unevenness that have been raised in the literature.
This 1983 book is a lively and clearly written introduction to the philosophy of natural science, organized around the central theme of scientific realism. It has two parts. 'Representing' deals with the different philosophical accounts of scientific objectivity and the reality of scientific entities. The views of Kuhn, Feyerabend, Lakatos, Putnam, van Fraassen, and others are all considered. 'Intervening' presents the first sustained treatment of experimental science for many years and uses it to give a new direction to debates about realism. Hacking illustrates how experimentation often has a life independent of theory. He argues that although the philosophical problems of scientific realism cannot be resolved when put in terms of theory alone, a sound philosophy of experiment provides compelling grounds for a realistic attitude. A great many scientific examples are described in both parts of the book, which also includes lucid expositions of recent high-energy physics and a remarkable chapter on the microscope in cell biology.
In this important new study Ian Hacking continues the enquiry into the origins and development of certain characteristic modes of contemporary thought undertaken in such previous works as his best-selling The Emergence of Probability. Professor Hacking shows how by the late nineteenth century it became possible to think of statistical patterns as explanatory in themselves, and to regard the world as not necessarily deterministic in character. Combining detailed scientific historical research with characteristic philosophic breadth and verve, The Taming of Chance brings out the relations among philosophy, the physical sciences, mathematics and the development of social institutions, and provides a unique and authoritative analysis of the "probabilization" of the Western world.
One of Ian Hacking's earliest publications, this book showcases his early ideas on the central concepts and questions surrounding statistical reasoning. He explores the basic principles of statistical reasoning and tests them, both at a philosophical level and in terms of their practical consequences for statisticians. Presented in a fresh twenty-first-century series livery, and including a specially commissioned preface written by Jan-Willem Romeijn, illuminating its enduring importance and relevance to philosophical enquiry, Hacking's influential and original work has been revived for a new generation of readers.
Classical logic has been attacked by adherents of rival, anti-realist logical systems: Ian Rumfitt comes to its defence. He considers the nature of logic, and how to arbitrate between different logics. He argues that classical logic may dispense with the principle of bivalence, and may thus be liberated from the dead hand of classical semantics.
Historical records show that there was no real concept of probability in Europe before the mid-seventeenth century, although the use of dice and other randomizing objects was commonplace. Ian Hacking presents a philosophical critique of early ideas about probability, induction, and statistical inference and the growth of this new family of ideas in the fifteenth, sixteenth, and seventeenth centuries. Hacking invokes a wide intellectual framework involving the growth of science, economics, and the theology of the period. He argues that the transformations that made it possible for probability concepts to emerge have constrained all subsequent development of probability theory and determine the space within which philosophical debate on the subject is still conducted. First published in 1975, this edition includes an introduction that contextualizes his book in light of developing philosophical trends. Ian Hacking is the winner of the Holberg International Memorial Prize 2009.
From the time of its clearest origins with Pascal, the theory of probabilities seemed to offer means by which the study of human affairs might be reduced to the same kind of mathematical discipline that was already being achieved in the study of nature. Condorcet is to a great extent merely representative of the philosophers of the seventeenth and eighteenth centuries who were led on by the prospect of developing moral and political sciences on the pattern of the natural sciences, specifically physics. The development of economics and the social sciences, from the eighteenth century onwards, may be said in part to have fulfilled and in a manner to have perpetuated these ambitions. In so far as the new sciences have been susceptible of mathematical treatment, this has not been confined to the calculus of probabilities. But there is a temptation at every stage to ascribe fundamental significance and universal applicability to each latest mathematical device that is strikingly useful or illuminating on its first introduction. It is the theory of games that enjoys this position at present, and shapes the common contemporary conception of the very same problems that preoccupied Condorcet.
The analytical notion of ‘scientific style of reasoning’, introduced by Ian Hacking in the mid-1980s, has become widespread in the literature of the history and philosophy of science. However, scholars have rarely made explicit the philosophical assumptions and the research objectives underlying the notion of style: what are its philosophical roots? How does the notion of style fit into the area of research of historical epistemology? What does a comparison between Hacking’s project on styles of thinking and other similar projects suggest? My aim in this paper is to answer these questions. Hacking has denied that his project of styles of thinking falls into the field of historical epistemology. I shall challenge his remark by tracing out the connections of the notion of style with historical epistemology and, more generally, with a tradition of thought born in France at the beginning of the twentieth century.
In this paper we argue that ill persons are particularly vulnerable to epistemic injustice in the sense articulated by Fricker. Ill persons are vulnerable to testimonial injustice through the presumptive attribution of characteristics like cognitive unreliability and emotional instability that downgrade the credibility of their testimonies. Ill persons are also vulnerable to hermeneutical injustice because many aspects of the experience of illness are difficult to understand and communicate and this often owes to gaps in collective hermeneutical resources. We then argue that epistemic injustice arises in part owing to the epistemic privilege enjoyed by the practitioners and institutions of contemporary healthcare services—the former owing to their training, expertise, and third-person psychology, and the latter owing to their implicit privileging of certain styles of articulating and evidencing testimonies in ways that marginalise ill persons. We suggest that a phenomenological toolkit may be part of an effort to ameliorate epistemic injustice.
Luck permeates our lives, and this raises a number of pressing questions: What is luck? When we attribute luck to people, circumstances, or events, what are we attributing? Do we have any obligations to mitigate the harms done to people who are less fortunate? And to what extent is deserving praise or blame affected by good or bad luck? Although acquiring a true belief by an uneducated guess involves a kind of luck that precludes knowledge, does all luck undermine knowledge? And how accurate are our luck attributions anyway? The academic literature has seen growing, interdisciplinary interest in luck, and this volume brings together and explains the most important areas of this research. It consists of 39 newly commissioned chapters, written by an internationally acclaimed team of philosophers and psychologists, for a readership of students and researchers. Its coverage is divided into six sections: (i) The History of Luck, (ii) The Nature of Luck, (iii) Moral Luck, (iv) Epistemic Luck, (v) The Psychology of Luck, and (vi) Future Research. The chapters in these sections cover a wide range of topics, from the problem of moral luck, to anti-luck epistemology, to the relationship between luck attributions and cognitive biases, to meta-questions regarding the nature of luck itself, to a range of other theoretical and empirical questions currently being investigated by ethicists, epistemologists, and psychologists. By bringing this research together, the Handbook serves as both a touchstone for understanding the relevant issues and a first port of call for future philosophical and psychological research on luck.
The 'Why ain'cha rich?' argument for one-boxing in Newcomb's problem allegedly vindicates evidential decision theory and undermines causal decision theory. But there is a good response to the argument on behalf of causal decision theory. I develop this response. Then I pose a new problem and use it to give a new 'Why ain'cha rich?' argument. Unlike the old argument, the new argument targets evidential decision theory. And unlike the old argument, the new argument is sound.
In the era of information and communication, issues of misinformation and miscommunication are more pressing than ever. _Epistemic injustice_ - one of the most important and ground-breaking subjects to have emerged in philosophy in recent years - refers to those forms of unfair treatment that relate to issues of knowledge, understanding, and participation in communicative practices. The Routledge Handbook of Epistemic Injustice is an outstanding reference source to the key topics, problems, and debates in this exciting subject. The first collection of its kind, it comprises over thirty chapters by a team of international contributors, divided into five parts: Core Concepts; Liberatory Epistemologies and Axes of Oppression; Schools of Thought and Subfields within Epistemology; Socio-political, Ethical, and Psychological Dimensions of Knowing; and Case Studies of Epistemic Injustice. As well as fundamental topics such as testimonial and hermeneutic injustice and epistemic trust, the Handbook includes chapters on important issues such as social and virtue epistemology, objectivity and objectification, implicit bias, and gender and race. Also included are chapters on areas in applied ethics and philosophy, such as law, education, and healthcare. The Routledge Handbook of Epistemic Injustice is essential reading for students and researchers in ethics, epistemology, political philosophy, feminist theory, and philosophy of race. It will also be very useful for those in related fields, such as cultural studies, sociology, education, and law.
The _Mozi_ is a key philosophical work written by a major social and political thinker of the fifth century B.C.E. It is one of the few texts to survive the Warring States period and is crucial to understanding the origins of Chinese philosophy and two other foundational works, the _Mengzi_ and the _Xunzi_. Ian Johnston provides an English translation of the entire _Mozi_, as well as the first bilingual edition in any European language to be published in the West. His careful translation reasserts the significance of the text's central doctrines, and his annotations and contextual explanations add vivid historical and interpretive dimensions. Part 1 of the _Mozi_ is called the "Epitomes" and contains seven short essays on the elements of Mohist doctrine. Part 2, the "Core Doctrines," establishes the ten central tenets of Mo Zi's ethical, social, and political philosophy, while articulating his opposition to Confucianism. Part 3, the "Canons and Explanations," comprises observations on logic, language, disputation, ethics, science, and other matters, written particularly in defense of Mohism. Part 4, the "Dialogues," presents lively conversations between Master Mo and various disciples, philosophical opponents, and potential patrons. Part 5, the "Defense Chapters," details the principles and practices of defensive warfare, a subject on which Master Mo was an acknowledged authority. Now available to English-speaking readers of all backgrounds, the _Mozi_ is a rich and varied text, and this bilingual edition provides an excellent tool for learning classical Chinese.
Humanity has sat at the center of philosophical thinking for too long. The recent advent of environmental philosophy and posthuman studies has widened our scope of inquiry to include ecosystems, animals, and artificial intelligence. Yet the vast majority of the stuff in our universe, and even in our lives, remains beyond serious philosophical concern. In _Alien Phenomenology, or What It’s Like to Be a Thing_, Ian Bogost develops an object-oriented ontology that puts things at the center of being—a philosophy in which nothing exists any more or less than anything else, in which humans are elements but not the sole or even primary elements of philosophical interest. And unlike experimental phenomenology or the philosophy of technology, Bogost’s alien phenomenology takes for granted that _all_ beings interact with and perceive one another. This experience, however, withdraws from human comprehension and becomes accessible only through a speculative philosophy based on metaphor. Providing a new approach for understanding the experience of things _as_ things, Bogost also calls on philosophers to rethink their craft. Drawing on his own background as a videogame designer, Bogost encourages professional thinkers to become makers as well, engineers who construct things as much as they think and write about them.
A growing number of authors defend putative examples of knowledge from falsehood (KFF), inferential knowledge based in a critical or essential way on false premises, and they argue that KFF has important implications for many areas of epistemology (whether evidence can be false, the Gettier debate, defeasibility theories of knowledge, etc.). I argue, however, that there is no KFF, because in any supposed example either the falsehood does not contribute to the knowledge or the subject lacks knowledge. In particular, I show that if the subject actually has knowledge in putative KFF cases, then there is always a veridical evidential path meeting the basing conditions that accounts for her knowledge; if there is no such path, then the subject is in a type of Gettier case. All the recent arguments that rely on KFF are therefore based on a mistake.
Many people find themselves dissatisfied with recent linguistic philosophy, and yet know that language has always mattered deeply to philosophy and must in some sense continue to do so. Ian Hacking considers here some dozen case studies in the history of philosophy to show the different ways in which language has been important, and the consequences for the development of the subject. There are chapters on, among others, Hobbes, Berkeley, Russell, Ayer, Wittgenstein, Chomsky, Feyerabend and Davidson. Dr Hacking ends by speculating about the directions in which philosophy and the study of language seem likely to go. The book will provide students with a stimulating, broad survey of problems in the theory of meaning and the development of philosophy, particularly in this century. The topics treated in the philosophy of language are among the central, current concerns of philosophers, and the historical framework makes it possible to introduce concretely and intelligibly all the main theoretical issues.
Some evil actions are public. Maybe genocide is the most awful. Other evil actions are private, a matter of one person harming another or of self-inflicted injury. Child abuse, in our current reckoning, is the worst of private evils. We want to put a stop to it. We know we can’t do that, not entirely. Human wickedness won’t go away. But we must protect as many children as we can. We want also to discover and help those who have already been hurt. Anyone who feels differently is already something of a monster. We are so sure of these moral truths that we seldom pause to wonder what child abuse is. We know we don’t understand it. We have little idea of what prompts people to harm children. But we do have the sense that what we mean by child abuse is something perfectly definite. So it comes as a surprise that the very idea of child abuse has been in constant flux over the past thirty years. Previously, our present conception of abusing a child did not even exist. People do many of the same vile things to children, for sure, that they did a century ago. But we’ve been almost unwittingly changing the very definitions of abuse and revising our values and our moral codes accordingly. Ian Hacking, a philosopher, teaches at the Institute for the History and Philosophy of Science and Technology in the University of Toronto. His latest book is entitled The Taming of Chance.
This article reports the findings from a study of discursive representations of the future role of technology in the work of the US National Intelligence Council (NIC). Specifically, it investigates the interplay of ‘techno-optimism’ and propositional certainty in the NIC’s ‘Future Global Trends Reports’. In doing so, it answers the following questions: To what extent was techno-optimism present in the discourse? What level of propositional certainty was expressed in the discourse? How did the discourse deal with the inherent uncertainty of the future? Overall, the discourse was pronouncedly techno-optimist in its stance towards the future role of technology: high-technological solutions were portrayed as solving a host of problems, despite the readily available presence of low-technology or no-technology solutions. In all, 75.1% of the representations were presented as future categorical certainties, meaning the future was predominantly presented as a known and closed inevitability. The discourse dealt with the inherent uncertainty of the subject matter, that is, the future, by projecting the past and present into the future. This was particularly the case in relation to the idea of technological military dominance as a guarantee of global peace, and the role of technology as an inevitable force free from societal censorship.
This paper explores ethical concerns arising in healthcare integration. We argue that, while integration is a necessary imperative for meeting contemporary and future healthcare challenges, a far stronger evidence base for the conditions of its effectiveness is required. In particular, given the increasing emphasis at the policy level for the entire healthcare infrastructure to become better integrated, our analysis of the ethical challenges that follow from the logic of integration itself is timely and important and has hitherto received insufficient attention. We evaluated an educational intervention which aims to improve child health outcomes by making transitions between primary and secondary care more efficient, ensuring children and parents are better supported throughout. The programme provided skills for trainee paediatricians and general practitioners in co-designing integrated clinical services. The key ethical challenges of integrated care that arose from a clinical perspective are: professional identity and autonomy in an integrated working environment; the concomitant extent of professional responsibility in such an environment; and the urgent need for more evidence to be produced on which strategies for integrating at scale can be based. From our analysis we suggest a tentative way forward, viewed from a normative position broadly situated at the intersection of deontology and care ethics. We adopt this position because the primary clinical ethical issues in the context of integrated care concern: how to ensure that all duties of care to individual patients are met in a newly orientated working environment where clinical responsibility may be ambiguous; and the need to orientate care around the patient by foregrounding their autonomous preferences and ensuring good patient-clinician relationships in clinical decision-making.