There is a growing demand to incorporate social, economic and ethical considerations into biotechnology governance. However, there is currently little guidance available for understanding what this means or how it should be done. A framework of care-based ethics and politics can capture many of the concerns maintaining a persistent socio-political conflict over biotechnologies and provide a novel way to incorporate such considerations into regulatory assessments. A care-based approach to ethics and politics has six key defining features. These include: 1) a relational worldview, 2) an emphasis on the importance of context, 3) a recognition of the significance of dependence, 4) an analysis of power, including a particular concern for those most vulnerable, 5) a granting of weight to the significance of affect, and 6) an acknowledgment of an important role for narrative. This policy brief provides an overview of these defining features, illustrates how they can appear in a real-world example and provides a list of guiding questions for assessing these features and advancing a politics of care in the governance of biotechnology.
In this paper we examine a puzzle recently posed by Aaron Preston for the traditional realist assay of property (quality) instances. Consider Socrates (a red round spot) and red1—Socrates' redness. For the traditional realist, both of these entities are concrete particulars. Further, both involve redness being 'tied to' the same bare individuator. But then it appears that red1 is duplicated in its 'thicker' particular (Socrates), so that it can't be predicated of Socrates without redundancy. According to Preston, this suggests that a concrete particular and its property instances aren't genuinely related. We argue that Preston's proffered solution here—to treat property instances as "mental constructs"—is fraught with difficulty. We then go on to show how, by fine-tuning the nature of bare particulars, treating them as abstract modes of things rather than concrete particulars, the traditional realist can neatly evade Preston's puzzle.
This book focuses on material culture as a subject of philosophical inquiry and promotes the philosophical study of material culture by articulating some of the central and difficult issues raised by this topic and providing innovative solutions to them, most notably an account of improvised action and a non-intentionalist account of function in material culture. Preston argues that material culture essentially involves activities of production and use; she therefore adopts an action-theoretic foundation for a philosophy of material culture. Part 1 illustrates this foundation through a critique, revision, and extension of existing philosophical theories of action. Part 2 investigates a salient feature of material culture itself—its functionality. A basic account of function in material culture is constructed by revising and extending existing theories of biological function to fit the cultural case. Here the adjustments are for the most part necessitated by special features of function in material culture. These two parts of the project are held together by a trio of overarching themes: the relationship between individual and society, the problem of centralized control, and creativity.
Brian Skyrms offers a fascinating demonstration of how fundamental signals are to our world. He uses various scientific tools to investigate how meaning and communication develop. Signals operate in networks of senders and receivers at all levels of life, transmitting and processing information. That is how humans and animals think and interact.
Listen to the interview with Brian Kemple... and learn to appreciate the diachronic trajectory of semiotics. *** Live interview with Brian Kemple, Executive Director of the Lyceum Institute, to discuss the legacy and influence of John Deely, the thinker most responsible for developing semiotics into the 21st century. This interview, conducted by William Passarini and Tim Troutman, is part of the preliminary activities of the 2022 International Open Seminar on Semiotics: a Tribute to John Deely on the Fifth Anniversary of His Passing, cooperatively organized by the Institute for Philosophical Studies of the Faculty of Arts and Humanities of the University of Coimbra, the Lyceum Institute, the Deely Project, Saint Vincent College, the Iranian Society for Phenomenology at the Iranian Political Science Association, the International Association for Semiotics of Space and Time, the Institute for Scientific Information on Social Sciences of the Russian Academy of Sciences, the Semiotic Society of America, the American Maritain Association, the International Association for Semiotic Studies, the International Society for Biosemiotic Studies, the International Center for Semiotics and Intercultural Dialogue, Moscow State Academic University for the Humanities and the Mansarda Acesa with the support of the FCT - Foundation for Science and Technology, I.P., of the Ministry of Science, Technology and Higher Education of the Government of Portugal under the UID/FIL/00010/2020 project. Brian Kemple holds a PhD in Philosophy from the University of St. Thomas, in Houston TX, where he wrote his dissertation under the inimitable John Deely. He is the Founder and Executive Director of the Lyceum Institute.
Philosophical interests and areas of study include: Thomas Aquinas, John Poinsot, Charles Peirce, Martin Heidegger, the history and importance of semiotics, scholasticism, phenomenology; as well as ancillary interests in the liberal arts, technology, and education as a moral habit. He has published two scholarly books— 'Ens Primum Cognitum in Thomas Aquinas and the Tradition' and 'The Intersections of Semiotics and Phenomenology: Peirce and Heidegger in Dialogue', as well as a number of scholarly articles, popular articles, and his own 'Introduction to Philosophical Principles: Logic, Physics, and the Human Person' and the forthcoming 'Linguistic Signification: A Classical Course in Grammar and Composition'. In addition to being the Executive Director of the Lyceum Institute, he is the Executive Editor of 'Reality: a Journal for Philosophical Discourse'. *** Technical support was assured by Robert Junqueira and the cover image for the video was designed by Zahra Soltani.
On the Origin of Objects is the culmination of Brian Cantwell Smith's decade-long investigation into the philosophical and metaphysical foundations of computation, artificial intelligence, and cognitive science. Based on a sustained critique of the formal tradition that underlies the reigning views, he presents an argument for an embedded, participatory, "irreductionist," metaphysical alternative. Smith seeks nothing less than to revise our understanding not only of the machines we build but also of the world with which they interact. Smith's ambitious project begins as a search for a comprehensive theory of computation, able to do empirical justice to practice and conceptual justice to the computational theory of mind. A rigorous commitment to these two criteria ultimately leads him to recommend a radical overhaul of our traditional conception of metaphysics. Everything that exists -- objects, properties, life, practice -- lies, Smith claims, in the "middle distance," an intermediate realm of partial engagement with, and partial separation from, the enveloping world. Patterns of separation and engagement are taken to underlie a single notion unifying representation and ontology: that of subjects' "registration" of the world around them. Along the way, Smith offers many fascinating ideas: the distinction between particularity and individuality, the methodological notion of an "inscription error," an argument that there are no individuals within physics, various deconstructions of the type-instance distinction, an analysis of formality as overly disconnected ("discreteness run amok"), a conception of the boundaries of objects as properties of unruly interactions between objects and subjects, an argument for the theoretical centrality of reference preservation, and a theatrical, acrobatic metaphor for the contortions involved in the preservation of reference and resultant stabilization of objects.
Sidebars and diagrams throughout the book help clarify and guide Smith's highly original and compelling argument. A Bradford Book.
Espen Hammer's exceptionally fine book explores modern temporality, its problems and prospects. Hammer claims that how people experience time is a cultural/historical phenomenon, and that there is a peculiarly modern way of experiencing time as a series of present moments, each indefinitely leading to the next in an ordered way. Time as measured by the clock is the paradigmatic instance of this sense of time. In this perspective time is quantifiable and forward-looking, and the present is dominated by the future. Hammer argues that this manner of experiencing time provides a way of living that brings with it not only the basis for great successes in technology, but also great costs—specifically, what he calls the problems of transience and of meaning. Hammer goes about his task by considering the ways some of the great modern philosophers have characterized present-day temporality and have responded to the problems he has identified. Specifically, he considers what Kant, Hegel, Schopenhauer, Nietzsche, Heidegger, Habermas, Bloch, and Adorno provide in response to our peculiarly modern predicaments. The book is remarkable for its clarity and perceptiveness, but in crucial places it simplifies the matters at hand or fails to push its insights as far as it ought, and in the end it promises more than it can deliver. In this it betrays a rationalist confidence in the power of reason that founders on what in many ways remains a mystery.
In this pithy and highly readable book, Brian Skyrms, a recognised authority on game and decision theory, investigates traditional problems of the social contract in terms of evolutionary dynamics. Game theory is skilfully employed to offer new interpretations of a wide variety of social phenomena, including justice, mutual aid, commitment, convention and meaning. The author eschews any grand, unified theory. Rather, he presents the reader with tools drawn from evolutionary game theory for the purpose of analysing and coming to understand the social contract. The book is not technical and requires no special background knowledge. As such, it could be enjoyed by students and professionals in a wide range of disciplines: political science, philosophy, decision theory, economics and biology.
In 'The ethics of belief and Christian faith as commitment to assumptions', Rik Peels attacks the views that I advanced in 'Christianity and the ethics of belief'. Here, I rebut his criticisms of the claim that it is wrong to believe without sufficient evidence, of the contention that Christians are committed to that claim, and of the notion that faith is not belief but commitment to assumptions in the hope of salvation. My original conclusions still stand.
"Brian Orend's The Morality of War promises to become the single most comprehensive and important book on just war for this generation. It moves far beyond the review of the standard just war categories to deal comprehensively with the new challenges of the conflict with terrorism. It thoughtfully reviews every major military conflict of the past few decades, mining them for implications of the evolving tradition of just war thinking. It concludes with a critical engagement with the major alternatives to just war thinking: pacifism and 'realism.' It is, in short, the most comprehensive and thoughtful assessment of all aspects of just war since Michael Walzer's classic Just and Unjust Wars." - Martin L. Cook, United States Air Force Academy.
Modal basics -- Some solutions -- Theist solutions -- The ontology of possibility -- Modal truthmakers -- Modality and the divine nature -- Deity as essential -- Against deity theories -- The role of deity -- The biggest bang -- Divine concepts -- Concepts, syntax, and actualism -- Modality: basic notions -- The genesis of secular modality -- Modal reality -- Essences -- Non-secular modalities -- Theism and modal semantics -- Freedom, preference, and cost -- Explaining modal status -- Explaining the necessary -- Against theistic platonism -- Worlds and the existence of God.
This stimulating collection is devoted to the life and work of the most flamboyant of twentieth-century philosophers, Paul Feyerabend. Feyerabend's radical epistemological claims, and his stunning argument that there is no such thing as scientific method, were highly influential during his life and have only gained attention since his death in 1994. The essays that make up this volume, written by some of today's most respected philosophers of science, many of whom knew Feyerabend as students and colleagues, cover the diverse themes in his extensive body of work and present a personal account of this fascinating thinker.
In the preface to his book God the Problem, Gordon Kaufman writes: 'Although the notion of God as agent seems presupposed by most contemporary theologians … Austin Farrer has been almost alone in trying to specify carefully and consistently just what this might be understood to mean.'
It has widely been assumed by philosophers that our first-person preferences regarding pleasurable and painful experiences exhibit a bias toward the future (positive and negative hedonic future-bias), and that our preferences regarding non-hedonic events (both positive and negative) exhibit no such bias (non-hedonic time-neutrality). Further, it has been assumed that our third-person preferences are always time-neutral. Some have attempted to use these (presumed) differential patterns of future-bias—different across kinds of events and perspectives—to argue for the irrationality of hedonic future-bias. This paper experimentally tests these descriptive hypotheses. While as predicted we found first-person hedonic future-bias, we did not find that participants were time-neutral in all other conditions. Hence, the presumed asymmetry of hedonic/non-hedonic and first/third-person preferences cannot be used to argue for the irrationality of future-bias, since no such asymmetries exist. Instead, we develop a more fine-grained approach, according to which three factors—positive/negative valence, first/third-person, and hedonic/non-hedonic—each independently influence, but do not determine, whether an event is treated in a future-biased or time-neutral way. We discuss the upshots of these results for the debate over the rationality of future-bias.
A User's Guide to Capitalism and Schizophrenia is a playful and emphatically practical elaboration of the major collaborative work of the French philosophers Gilles Deleuze and Felix Guattari. When read along with its rigorous textual notes, the book also becomes the richest scholarly treatment of Deleuze's entire philosophical oeuvre available in any language. Finally, the dozens of explicit examples that Brian Massumi furnishes from contemporary artistic, scientific, and popular urban culture make the book an important, perhaps even central text within current debates on postmodern culture and politics. Capitalism and Schizophrenia is the general title for two books published a decade apart. The first, Anti-Oedipus, was a reaction to the events of May/June 1968; it is a critique of "state-happy" Marxism and "school-building" strains of psychoanalysis. The second, A Thousand Plateaus, is an attempt at a positive statement of the sort of nomad philosophy Deleuze and Guattari propose as an alternative to state philosophy. Brian Massumi is Professor of Comparative Literature at McGill University.
Many writers have held that in his later work, David Lewis adopted a theory of predicate meaning such that the meaning of a predicate is the most natural property that is (mostly) consistent with the way the predicate is used. That orthodox interpretation is shared by both supporters and critics of Lewis's theory of meaning, but it has recently been strongly criticised by Wolfgang Schwarz. In this paper, I accept many of Schwarz's criticisms of the orthodox interpretation, and add some more. But I also argue that the orthodox interpretation has a grain of truth in it, and seeing that helps us appreciate the strength of Lewis's late theory of meaning.
Both an introduction to Nietzsche's moral philosophy, and a sustained commentary on his most famous work, On the Genealogy of Morality, this book has become the most widely used and debated secondary source on these topics over the past dozen years. Many of Nietzsche's most famous ideas - the "slave revolt" in morals, the attack on free will, perspectivism, "will to power" and the "ascetic ideal" - are clearly analyzed and explained. The first edition established the centrality of naturalism to Nietzsche's philosophy, generating a substantial scholarly literature to which Leiter responds in an important new Postscript. In addition, Leiter has revised and refreshed the book throughout, taking into account new scholarly literature, and revising or clarifying his treatment of such topics as the objectivity of value, epiphenomenalism and consciousness, and the possibility of "autonomous" agency.
As the end of the Millennium approaches, conspiracy theories are increasing in number and popularity. In this short essay, I offer an analysis of conspiracy theories inspired by Hume's discussion of miracles. My first conclusion is that whereas Hume can argue that miracles are, by definition, explanations we are not warranted in believing, there is nothing analytic that will allow us to distinguish good from bad conspiracy theories. There is no a priori method for distinguishing warranted conspiracy theories (say, those explaining Watergate) from those which are unwarranted (say, theories about extraterrestrials abducting humans). Nonetheless, there is a cluster of characteristics often shared by unwarranted conspiracy theories. An analysis of the alleged explanatory virtues of unwarranted conspiracies suggests some reasons for their current popularity, while at the same time providing grounds for their rejection. Finally, I discuss how conspiracy theories embody an anachronistic world-view that places the contemporary zeitgeist in a clearer light.
Most of us display a bias toward the near: we prefer pleasurable experiences to be in our near future and painful experiences to be in our distant future. We also display a bias toward the future: we prefer pleasurable experiences to be in our future and painful experiences to be in our past. While philosophers have tended to think that near bias is a rational defect, almost no one finds future bias objectionable. In this essay, we argue that this hybrid position is untenable. We conclude that those who reject near bias should instead endorse complete temporal neutrality.
The cognitive experience view of thought holds that the content of thought is determined by its cognitive-phenomenal character. Adam Pautz argues that the cognitive experience view is extensionally inadequate: it entails the possibility of mix-and-match cases, where the cognitive-phenomenal properties that determine thought content are combined with different sensory-phenomenal and functional properties. Because mix-and-match cases are metaphysically impossible, Pautz argues, the cognitive experience view should be rejected. This paper defends the cognitive experience view from Pautz's argument. I build on resources in the philosophy of mind literature to show that cognitive-phenomenal properties are modally independent from sensory-phenomenal and functional properties. The result is that mix-and-match cases, though modally remote, are metaphysically possible. The possibility of mix-and-match cases allows us to move from a defensive posture to a critical one: it poses problems for any theory of content that imposes rationality constraints, including Pautz's positive view, phenomenal functionalism.
William Hasker replies to my arguments against Social Trinitarianism, offers some criticism of my own view, and begins a sketch of another account of the Trinity. I reply with some defence of my own theory and some questions about his.
In teaching jurisprudence, I typically distinguish between two different families of theories of adjudication—theories of how judges do or should decide cases. "Formalist" theories claim that the law is "rationally" determinate, that is, the class of legitimate legal reasons available for a judge to offer in support of his or her decision justifies one and only one outcome, either in all cases or in some significant and contested range of cases; and adjudication is thus "autonomous" from other kinds of reasoning, that is, the judge can reach the required decision without recourse to nonlegal normative considerations of morality or political philosophy. I also note that "formalism" is sometimes associated with the idea that judicial decision-making involves nothing more than mechanical deduction on the model of the syllogism—Beccaria, for example, expresses such a view. I call the latter "Vulgar Formalism" to emphasize that it is not a view to which anyone today cares to subscribe.
Philosophers working on time-biases assume that people are hedonically biased toward the future. A hedonically future-biased agent prefers pleasurable experiences to be future instead of past, and painful experiences to be past instead of future. Philosophers further predict that this bias is strong enough to apply to unequal payoffs: people often prefer less pleasurable future experiences to more pleasurable past ones, and more painful past experiences to less painful future ones. In addition, philosophers have predicted that future-bias is restricted to first-person preferences, and that people's third-person preferences are time-neutral. Philosophers disagree vigorously about the normative status of these preferences—i.e., they disagree about whether first-person future-bias is rationally permissible. Time-neutralists, for example, have appealed to the predicted asymmetry between first- and third-person preferences to argue for the rational impermissibility of future-bias. We empirically tested these predictions, and found that while people do prefer more past pain to less future pain, they do not prefer less future pleasure to more past pleasure. This was so in both first- and third-person conditions. This suggests that future-bias is typically non-absolute, and is more easily outweighed in the case of positive events. We connect this result to the normative debate over future-bias.
When someone is prepunished, they are punished for a predicted crime they will or would commit. I argue that cases of prepunishment universally assumed to be merely hypothetical—including those in Philip K. Dick's "The Minority Report"—are equivalent to some instances of the real-life punishment of attempt offenses. This conclusion puts pressure in two directions. If prepunishment is morally impermissible, as philosophers argue, then this calls for amendments to criminal justice theory and practice. At the same time, if prepunishment is not imaginary, then the philosophers who reject it cannot claim that their view is supported by common sense.
The view that phenomenally conscious robots are on the horizon often rests on a certain philosophical view about consciousness, one we call "nomological behaviorism." The view entails that, as a matter of nomological necessity, if a robot had exactly the same patterns of dispositions to peripheral behavior as a phenomenally conscious being, then the robot would be phenomenally conscious; indeed it would have all and only the states of phenomenal consciousness that the phenomenally conscious being in question has. We experimentally investigate whether the folk think that certain (hypothetical) robots made of silicon and steel would have the same conscious states as certain familiar biological beings with the same patterns of dispositions to peripheral behavior as the robots. Our findings provide evidence that the folk largely reject the view that silicon-based robots would have the sensations that they, the folk, attribute to the biological beings in question.
According to the Rationality Constraint, our concept of belief imposes limits on how much irrationality is compatible with having beliefs at all. We argue that empirical evidence of human irrationality from the psychology of reasoning and the psychopathology of delusion undermines only the most demanding versions of the Rationality Constraint, which require perfect rationality as a condition for having beliefs. The empirical evidence poses no threat to more relaxed versions of the Rationality Constraint, which require only minimal rationality. Nevertheless, we raise problems for all versions of the Rationality Constraint by appealing to more extreme forms of irrationality that are continuous with actual cases of human irrationality. In particular, we argue that there are conceivable cases of "mad belief" in which populations of Lewisian madmen have beliefs that are not even minimally rational. This undermines Lewis's claim that our ordinary concept of belief is a theoretical concept that is implicitly defined by its role in folk psychology. We argue that introspection gives us a phenomenal concept of belief that cannot be analyzed by applying Lewis's semantics for theoretical terms.
The standard formulation of Newcomb's problem compares evidential and causal conceptions of expected utility, with those maximizing evidential expected utility tending to end up far richer. Thus, in a world in which agents face Newcomb problems, the evidential decision theorist might ask the causal decision theorist: "if you're so smart, why ain'cha rich?" Ultimately, however, the expected riches of evidential decision theorists in Newcomb problems do not vindicate their theory, because their success does not generalize. Consider a theory that allows the agents who employ it to end up rich in worlds containing Newcomb problems and continues to outperform in other cases. This type of theory, which I call a "success-first" decision theory, is motivated by the desire to draw a tighter connection between rationality and success, rather than to support any particular account of expected utility. The primary aim of this paper is to provide a comprehensive justification of success-first decision theories as accounts of rational decision. I locate this justification in an experimental approach to decision theory supported by the aims of methodological naturalism.