A dogma of contemporary ethical theory maintains that the nature of normative support for affective attitudes is the very same as the nature of normative support for actions. The prevailing view is that normative reasons provide the support across the board. I argue that the nature of normative support for affective attitudes is importantly different from the nature of normative support for actions. Actions are indeed supported by reasons. Reasons are gradable and contributory. The support relations for affective attitudes are neither. So-called reasons of the right kind for affective attitudes are facts that make those very attitudes fitting. Unlike reasons, fit-making facts for affective attitudes do not conflict with each other or combine in the explanation of further normative facts. More fit-making facts just make a more complex set of reactions fitting. This result undermines various analyses and unity theses in the philosophy of normativity.
It is plausible that there are epistemic reasons bearing on a distinctively epistemic standard of correctness for belief. It is also plausible that there are a range of practical reasons bearing on what to believe. These theses are often thought to be in tension with each other. Most significantly for our purposes, it is obscure how epistemic reasons and practical reasons might interact in the explanation of what one ought to believe. We draw an analogy with a similar distinction between types of reasons for actions in the context of activities. The analogy motivates a two-level account of the structure of normativity that explains the interaction of correctness-based and other reasons. This account relies upon a distinction between normative reasons and authoritatively normative reasons. Only the latter play the reasons role in explaining what state one ought to be in. All and only practical reasons are authoritative reasons. Hence, in one important sense, all reasons for belief are practical reasons. But this account also preserves the autonomy and importance of epistemic reasons. Given the importance of having true beliefs about the world, our epistemic standard typically plays a key role in explaining what we ought to believe. In addition to reconciling (versions of) evidentialism and pragmatism, this two-level account has implications for a range of important debates in normative theory, including the interaction of right and wrong reasons for actions and other attitudes, the significance of reasons in understanding normativity and authoritative normativity, the distinction between ‘formal’ and ‘substantive’ normativity, and whether there is a unified source of authoritative normativity.
This paper develops the Value-Based Theory of Reasons in some detail. The central part of the paper introduces a number of theoretically puzzling features of normative reasons. These include weight, transmission, overlap, and the promiscuity of reasons. It is argued that the Value-Based Theory of Reasons elegantly accounts for these features. This paper is programmatic. Its goal is to put the promising but surprisingly overlooked Value-Based Theory of Reasons on the table in discussions of normative reasons, and to draw attention to a number of areas for fruitful further research.
There are several powerful motivations for neutral value-based deontic theories such as Act Consequentialism. Traditionally, such theories have had great difficulty accounting for partiality towards one's personal relationships and projects. This paper presents a neutral value-based theory that preserves the motivations for Act Consequentialism while vindicating some crucial intuitions about reasons to be partial. There are two central ideas. The first is that when it comes to working out what you ought to do, your friends’ interests, the needs of your family, the significance of your own projects and ideals, etc. have more weight than the interests and needs of strangers. Your friends’ interests are not more neutrally valuable than the interests of others. So there is a difference between the value of an outcome and its deontic significance. The second familiar idea is that reasons are modifiable. Reasons of partiality are reasons the weights of which are a function of the value of the relevant outcome modified by facts about the value of caring about the outcome in question. The resulting principle has various further explanatory advantages; in particular, it accounts for project- and relationship-specific permissions and requirements, both at a time and across time.
Although arguments for and against competing theories of vagueness often appeal to claims about the use of vague predicates by ordinary speakers, such claims are rarely tested. An exception is Bonini et al. (1999), who report empirical results on the use of vague predicates by Italian speakers, and take the results to count in favor of epistemicism. Yet several methodological difficulties mar their experiments; we outline these problems and devise revised experiments that do not show the same results. We then describe three additional empirical studies that investigate further claims in the literature on vagueness: the hypothesis that speakers confuse ‘P’ with ‘definitely P’, the relative persuasiveness of different formulations of the inductive premise of the Sorites, and the interaction of vague predicates with three different forms of negation.
There are various ways of characterising Hume’s dictum that ‘you can’t get an ought from an is.’ Contributors to the literature directly addressing this question focus on logical characterisations of autonomy theses. Such theses maintain that certain logical relations do not obtain between ethical and non-ethical sentences, for instance that no non-ethical sentences logically entail an ethical sentence. I argue that this focus on logical autonomy is a mistake. The thesis so important to our metaethicists is not a logical thesis but a metaphysical one. The relevant metaphysical autonomy thesis maintains that ethical facts are not fully grounded just in non-ethical facts. I defend this characterisation. I also defend the converse thesis that all facts partly grounded in ethical facts are ethical facts. I then argue that this pair of theses can help with debates about the plausibility of nihilism and the classification of revisionary metaethical theses.
This book, published in 2000, is a clear account of causation based firmly in contemporary science. Dowe discusses, in a systematic way, a positive account of causation: the conserved quantities account of causal processes which he has been developing over the last ten years. The book describes causal processes and interactions in terms of conserved quantities: a causal process is the worldline of an object which possesses a conserved quantity, and a causal interaction involves the exchange of conserved quantities. Further, things that are properly called cause and effect are appropriately connected by a set of causal processes and interactions. The distinction between cause and effect is explained in terms of a version of the fork theory: the direction of a certain kind of ordered pattern of events in the world. This particular version has the virtue that it allows for the possibility of backwards causation, and therefore time travel.
With no statutory definition of death, the accepted medical definition relies on brain stem death criteria as a definitive measure of diagnosing death. However, the use of brain stem death criteria in this way is precarious and causes widespread confusion amongst both medical and lay communities. Through critical analysis, this paper considers the insufficiencies of brain stem death. It concludes that brain stem death cannot be successfully equated with either biological death or the loss of integrated bodily function. The overemphasis on the brain stem and its consequences leaves the criteria open to significant philosophical critique. Further, in some circumstances, the use of brain stem death criteria causes substantial emotional conflict for families and relatives. Accordingly, a more holistic and comprehensive definition of death is required.
This article examines and synthesizes two different approaches to determining the content of business ethics courses and the manner in which they ought to be taught. The first approach, from a political perspective, argues that the institutional framework within which business operates ought to be tested by theories of distributive justice. The second approach, from the perspective of virtue theory, argues that we ought to examine the character of individual employees and the responsibilities associated with the roles which these individuals play within organizations. I argue that Gadamer's interpretation of Aristotle's notion of phronesis shows an inseparable, bidirectional, conceptual link between the approaches of politics and virtue as well as providing insight into how business ethics might best be taught.
I challenge the appropriateness of the discourse of managerial control of employees in four ways. First, I question arguments which suggest that employees are always subject to organizational control. Second, I contrast workplace conditions which support employee self-determination and autonomy with conditions which permit control of employees. Third, I provide an ethical assessment of the normative use of control talk. Fourth, I suggest an alternative discourse, a discourse of accountability which appropriately highlights the reciprocity necessary to build ethical organizations.
This paper presents a puzzle for Act Consequentialists who do not want to shoot Pelé. The puzzle arises from cases involving the promotion of virtue, and motivates a systematic restriction on the separability of reasons.
This paper examines Wesley Salmon's "process" theory of causality, arguing in particular that there are four areas of inadequacy. These are that the theory is circular, that it is too vague at a crucial point, that statistical forks do not serve their intended purpose, and that Salmon has not adequately demonstrated that the theory avoids Hume's strictures about "hidden powers". A new theory is suggested, based on "conserved quantities", which fulfills Salmon's broad objectives, and which avoids the problems discussed.
It is regrettably common for theorists to attempt to characterize the Humean dictum that one can’t get an ‘ought’ from an ‘is’ just in broadly logical terms. We here address an important new class of such approaches which appeal to model-theoretic machinery. Our complaint about these recent attempts is that they interfere with substantive debates about the nature of the ethical. This problem, developed in detail for Daniel Singer’s and Gillian Russell and Greg Restall’s accounts of Hume’s dictum, is of a general type arising for the use of model-theoretic structures in cashing out substantive philosophical claims: the question of whether an abstract model-theoretic structure successfully interprets something often involves taking a stand on non-trivial issues surrounding the thing. In the particular case of Hume’s dictum, given reasonable conceptual or metaphysical claims about the ethical, Singer’s and Russell and Restall’s accounts treat obviously ethical claims as descriptive and vice versa. Consequently, their model-theoretic characterizations of Hume’s dictum are not metaethically neutral. This encourages skepticism about whether model-theoretic machinery suffices to provide an illuminating distinction between the ethical and the descriptive.
David Lewis claims that his theory of modality successfully reduces modal items to nonmodal items. This essay will clarify this claim and argue that it is true. This is largely an exercise within ‘Ludovician Polycosmology’: I hope to show that a certain intuitive resistance to the reduction and a set of related objections misunderstand the nature of the Ludovician project. But these results are of broad interest since they show that would-be reductionists have more formidable argumentative resources than is often thought. Lewis’s reduction depends on a set of methodological commitments each of which is fairly plausible or at least currently popular, and none of which is particular to modality. The choice of which of these commitments to reject I leave to the discerning antireductionist. The essay proceeds as follows: §1 discusses reduction generally and one or two relevant puzzles; §2 discusses Lewis’s reduction in particular; the longest section, §3, replies to four objections.
This chapter argues that dual-use emerging technologies are distributing unprecedented offensive capabilities to nonstate actors. To counteract this trend, some scholars have proposed that states become a little “less liberal” by implementing large-scale surveillance policies to monitor the actions of citizens. This is problematic, though, because the distribution of offensive capabilities is also undermining states’ capacity to enforce the rule of law. I will suggest that the only plausible escape from this conundrum, at least from our present vantage point, is the creation of a “supersingleton” run by a friendly superintelligence, founded upon a “post-singularity social contract.” In making this argument, the present chapter offers a novel reason for prioritizing the “control problem,” i.e., the problem of ensuring that a greater-than-human-level AI will positively enhance human well-being.
The integration of nanotechnology’s ‘social and ethical issues’ (SEI) at the research and development stage is one of the defining features of nanotechnology governance in the United States. Mandated by law, integration extends the field of nanotechnology to include a role for the “social”, the “public” and the social sciences and humanities in research and development (R&D) practices and agendas. Drawing from interviews with scientists, engineers and policymakers who took part in an oral history of the “Future of Nanotechnology” symposium at the Cornell NanoScale Facility, this article examines how nanotechnology’s ‘social and ethical issues’ are brought to life by these practitioners. From our analysis, three modes of enactment emerge: enacting SEI as obligations and problems-to-be-solved, enacting SEI by ‘not doing it’ in the laboratory, and enacting SEI as part of scientific practice. Together they paint a complex picture where SEI are variously defined, made visible or invisible, included and excluded, with participants showing their skill at both boundary-work (Gieryn Am Sociol Rev 48:781–795, 1983, 1999) and at integration. We conclude by reflecting on what this may mean for the design and implementation of SEI integration policies, suggesting that we need to transform SEI from obligations into ‘matters of care’ (Puig de la Bellacasa Soc Stud Sci 41(1):85–106, 2011) that tend to existing relationalities between science and society and implicate practitioners themselves.
The greatest existential threats to humanity stem from increasingly powerful advanced technologies. Yet the “risk potential” of such tools can only be realized when coupled with a suitable agent who, through error or terror, could use the tool to bring about an existential catastrophe. While the existential risk literature has provided many accounts of how advanced technologies might be misused and abused to cause unprecedented harm, no scholar has yet explored the other half of the agent-tool coupling, namely the agent. This paper aims to correct this failure by offering a comprehensive overview of what we could call “agential riskology.” Only by studying the unique properties of different agential risk types can one acquire an accurate picture of the existential danger before us.
“Scaling-up” is the next hurdle facing the local food movement. In order to effect broader systemic impacts, local food systems (LFS) will have to grow, and engage either more or larger consumers and producers. Encouraging the involvement of mid-sized farms looks to be an elegant solution, by broadening the accessibility of local food while providing alternative revenue streams for troubled family farms. Logistical, structural and regulatory barriers to increased scale in LFS are well known. Less is understood about the way in which scale developments affect the perception and legitimacy of LFS. This value-added opportunity begs the question: Is the value that adheres to local food scalable? Many familiar with local food discourse might suggest that important pieces of added value within LFS are generated by the reconnection of producer and consumer, the direct exchange through which this occurs, and the shared goals and values that provide the basis for reconnection. However, these assertions are based on tenuous assumptions about how interactions within the direct exchange produce value, and how LFS are governed. Examination shows that existing assumptions do not properly acknowledge the hybridity, diversity, and flexibility inherent in LFS. A clear analysis of the potential of scale in LFS will depend on understanding both how value is determined within LFS, and the processes through which these systems are governed. Such an analysis shows that, while scaled-up LFS will be challenged to maintain legitimacy and an identity as “alternative”, the establishment of an open governance process—based on a “negotiation of accommodations”—is likely to enhance their viability.
Aristotle holds that individual substances are ontologically independent from non-substances and universal substances but that non-substances and universal substances are ontologically dependent on substances. There is then an asymmetry between individual substances and other kinds of beings with respect to ontological dependence. Under what could plausibly be called the standard interpretation, the ontological independence ascribed to individual substances and denied of non-substances and universal substances is a capacity for independent existence. There is, however, a tension between this interpretation and the asymmetry between individual substances and the other kinds of entities with respect to ontological independence. I will propose an alternative interpretation: to weaken the relevant notion of ontological independence from a capacity for independent existence to the independent possession of a certain ontological status.
In this article we argue that nanotechnology represents an extraordinary opportunity to build in a robust role for the social sciences in a technology that remains at an early, and hence undetermined, stage of development. We examine policy dynamics in both the United States and United Kingdom aimed at both opening up, and closing down, the role of the social sciences in nanotechnologies. We then set out a prospective agenda for the social sciences and its potential in the future shaping of nanotechnology research and innovation processes. The emergent, undetermined nature of nanotechnologies calls for an open, experimental, and interdisciplinary model of social science research.
In Unfit for the Future, Ingmar Persson and Julian Savulescu argue that our collective existential predicament is unprecedentedly dangerous due to climate change and terrorism. Given these global risks to human prosperity and survival, Persson and Savulescu argue that we should explore the radical possibility of moral bioenhancement in addition to cognitive enhancement. In this article, I argue that moral bioenhancements could nontrivially exacerbate the threat posed by certain kinds of malicious agents, while reducing the threat of other kinds. This introduces a previously undiscussed complication to Persson and Savulescu's proposal. In the final section, I present a novel argument for why moral bioenhancement should either be compulsory or not be made available to the public at all.
In a recent paper (1994) Wesley Salmon has replied to criticisms (e.g., Dowe 1992c, Kitcher 1989) of his (1984) theory of causality, and has offered a revised theory which, he argues, is not open to those criticisms. The key change concerns the characterization of causal processes, where Salmon has traded "the capacity for mark transmission" for "the transmission of an invariant quantity." Salmon argues against the view presented in Dowe (1992c), namely that the concept of "possession of a conserved quantity" is sufficient to account for the difference between causal and pseudo processes. Here that view is defended, and important questions are raised about the notion of transmission and about gerrymandered aggregates.
Inspired by the vision of care in Vincent van Gogh's depiction of the parable of the Good Samaritan, this article offers a paradigm for inhabiting compassion. Compassion is understood in this article as a moral emotion that is also a pathocentric virtue. This definition creates a dynamic view of compassion as a desire to alleviate the suffering of others, the capacity to act on behalf of others and a commitment to sustain engagement with the suffering other. To weave this vision of compassion as a habitus rather than a theoretical construct, the article develops three phases of compassion: seeing, companioning and sighing. This framework deepens and augments a pastoral theological paradigm of compassion with the aim of inculcating an inhabited compassion in caregivers and the communities in which they participate.
Jeffrey Stout claims that John Rawls's idea of public reason (IPR) has contributed to a Christian backlash against liberalism. This essay argues that those whom Stout calls “antiliberal traditionalists” have misunderstood Rawls in important ways, and goes on to consider Stout's own critiques of the IPR. While Rawls's idea is often interpreted as a blanket prohibition on religious reasoning outside church and home, the essay will show that the very viability of the IPR depends upon a rich culture of deliberation in which all forms of reasoning can be put forth for consideration. This clarification addresses the perception that the IPR imposes an “asymmetrical burden” upon believers. In fact, the essay suggests that there are good reasons why believers, qua believers, might endorse the IPR.
Philosophers have long been fascinated by the connection between cause and effect: are 'causes' things we can experience, or are they concepts provided by our minds? The study of causation goes back to Aristotle, but resurged with David Hume and Immanuel Kant, and is now one of the most important topics in metaphysics. Most of the recent work done in this area has attempted to place causation in a deterministic, scientific, worldview. But what about the unpredictable and chancy world we actually live in: can one theory of causation cover all instances of cause and effect? _Cause and Chance: Causation in an Indeterministic World_ is a collection of specially written papers by world-class metaphysicians. Its focus is the problem facing the 'reductionist' approach to causation: the attempt to cover all types of causation, deterministic and indeterministic, with one basic theory. Contributors: Stephen Barker, Helen Beebee, Phil Dowe, Dorothy Edgington, Doug Ehring, Chris Hitchcock, Igal Kwart, Paul Noordhof, Murali Ramachandran and Michael Tooley.
Experimental methods and conceptual confusion : philosophy, science, and what emotions really are -- To 'make our voices resonate' or 'to be silent'? : shame as fundamental ontology -- Emotion, cognition, and world -- Shame and world.
The relation of ontological dependence or grounding, expressed by the terminology of separation and priority in substance, plays a central role in Aristotle’s Categories, Metaphysics, De Anima and elsewhere. The article discusses three current interpretations of this terminology. These are drawn along the lines of, respectively, modal-existential ontological dependence, essential ontological dependence, and grounding or metaphysical explanation. I provide an opinionated introduction to the topic, raising the main interpretative questions, laying out a few of the exegetical and philosophical options that influence one’s reading, and locating questions of Aristotle scholarship within the discussion of ontological dependence and grounding in contemporary metaphysics.
Normative reasons have become a popular theoretical tool in recent decades. One helpful feature of normative reasons is their weight. The fourteen new essays in this book theorize about many different aspects of weight. Topics range from foundational issues to applications of weight in debates across philosophy.
Individual substances are the ground of Aristotle’s ontology. Taking a liberal approach to existence, Aristotle accepts among existents entities in such categories other than substance as quality, quantity and relation; and, within each category, individuals and universals. As I will argue, individual substances are ontologically independent from all these other entities, while all other entities are ontologically dependent on individual substances. The association of substance with independence has a long history and several contemporary metaphysicians have pursued the connection. In this chapter, I will discuss the intersection of these notions of substance and ontological dependence in Aristotle. I will canvass a few contemporary formulations of ontological dependence and discuss some of the interpretative difficulties in ascribing any of these formulations to Aristotle’s characterization of individual substances as ontologically independent. My aim is not to resolve fully these difficulties but to locate the topics of substance and independence relative to certain other controversies in Aristotle studies. However, I will sketch a position. In particular, elsewhere I have speculated that Aristotle is both a primitivist and a pluralist with respect to ontological dependence, and I will develop this line of interpretation a bit further later in the chapter.
Both literalism, the view that mathematical objects simply exist in the empirical world, and fictionalism, the view that mathematical objects do not exist but are rather harmless fictions, have both been ascribed to Aristotle. The ascription of literalism to Aristotle, however, commits Aristotle to the unattractive view that mathematics studies but a small fragment of the physical world; and there is evidence that Aristotle would deny the literalist position that mathematical objects are perceivable. The ascription of fictionalism also faces a difficult challenge: there is evidence that Aristotle would deny the fictionalist position that mathematics is false. I argue that, in Aristotle's view, the fiction of mathematics is not to treat what does not exist as if existing but to treat mathematical objects as having an ontological status they lack. This form of fictionalism is consistent with holding that mathematics is true.
Gordon Baker in his last decade published a series of papers (now collected in Baker 2004), which are revolutionary in their proposals for the understanding of later Wittgenstein. Taking our lead from the first of those papers, on "perspicuous presentations," we offer new criticisms of 'elucidatory' readers of later Wittgenstein, such as Peter Hacker: we argue that their readings fail to connect with the radically therapeutic intent of the 'perspicuous presentation' concept, as an achievement-term, rather than a kind of 'objective' mapping of a 'conceptual landscape.' Baker's Wittgenstein, far from being a 'language policeman' of the kind that often fails to influence mainstream philosophy, offers an alternative to the latent scientism of Wittgenstein's influential 'elucidatory' readers.
A predicate logic typically has a heterogeneous semantic theory. Subjects and predicates have distinct semantic roles: subjects refer; predicates characterize. A sentence expresses a truth if the object to which the subject refers is correctly characterized by the predicate. Traditional term logic, by contrast, has a homogeneous theory: both subjects and predicates refer; and a sentence is true if the subject and predicate name one and the same thing. In this paper, I will examine evidence for ascribing to Aristotle the view that subjects and predicates refer. If this is correct, then it seems that Aristotle, like the traditional term logician, problematically conflates predication and identity claims. I will argue that we can ascribe to Aristotle the view that both subjects and predicates refer, while holding that he would deny that a sentence is true just in case the subject and predicate name one and the same thing. In particular, I will argue that Aristotle's core semantic notion is not identity but the weaker relation of constitution. For example, the predication ‘All men are mortal’ expresses a true thought, in Aristotle's view, just in case the mereological sum of humans is a part of the mereological sum of mortals.