Long claimed to be the dominant conception of practical reason, the Humean theory that reasons for action are instrumental, or explained by desires, is the basis for a range of worries about the objective prescriptivity of morality. As a result, it has come under intense attack in recent decades. A wide variety of arguments have been advanced which purport to show that it is false, or, surprisingly, even that it is incoherent. Slaves of the Passions aims to set the record straight, by advancing a version of the Humean theory of reasons which withstands this sophisticated array of objections. Schroeder defends a radical new view which, if correct, means that the commitments of the Humean theory have been widely misunderstood. Along the way, he raises and addresses questions about the fundamental structure of reasons, the nature of normative explanations, the aims of and challenges facing reductive views in metaethics, the weight of reasons, the nature of desire, moral epistemology, and most importantly, the relationship between agent-relational and agent-neutral reasons for action.
It doesn’t seem possible to be a realist about the traditional Christian God while claiming to be able to reduce God talk in naturalistically acceptable terms. Reduction, in this case, seems obviously eliminativist. Many philosophers seem to think that the same is true of the normative—that reductive “realists” about the normative are not really realists about the normative at all, or at least, only in some attenuated sense. This paper takes on the challenge of articulating what it is that makes reductive theological realism look hopeless, with the aim of explaining why we should think that the normative is relevantly different. Although it follows from my diagnosis that reductivists have their work cut out for them, I find nothing which suggests that the prospects for a successful reductive realism about the normative are in any way diminished—particularly for reductive views about reasons. Even reductivists, I argue, can at least aspire to a realism that is robust.
A collection, edited by David Bain, Michael Brady, and Jennifer Corns, originating in our Value of Suffering Project. Table of Contents: Michael Wheeler - ‘How should affective phenomena be studied?’; Julien Deonna & Fabrice Teroni – ‘Pleasures, unpleasures, and emotions’; Hilla Jacobson – ‘The attitudinal representational theory of painfulness fleshed out’; Tim Schroeder – ‘What we represent when we represent the badness of getting hurt’; Hagit Benbaji – ‘A defence of the inner view of pain’; Olivier Massin – ‘Suffering pain’; Frederique de Vignemont – ‘The value of threat’; Colin Leach – ‘Bad feelings can be good and good feelings can be bad’; Tasia Scrutton – ‘Mental suffering and the experience of beauty’; Brock Bastian – ‘From suffering to satisfaction: why we need pain to feel pleasure’; Marilyn McCord Adams – ‘Pain and moral agency’; Jennifer Corns – ‘Hedonic rationality’; Jonathan Cohen & Matthew Fulkerson – ‘Suffering and rationality’; Tom McClelland – ‘Suffering invites understanding’; Michael Brady – ‘Suffering as a virtue’; Glen Pettigrove TBA. Further authors TBA.
Several authors have recently endorsed the thesis that there is what has been called pragmatic encroachment on knowledge—in other words, that two people who are in the same situation with respect to truth-related factors may differ in whether they know something, due to a difference in their practical circumstances. This paper aims not to defend this thesis, but to explore how it could be true. What I aim to do is to show how practical factors could play a role in defeating knowledge by defeating epistemic rationality—the very kind of rationality that is entailed by knowledge, and in which Pascalian considerations do not play any role—even though epistemic rationality consists in having adequate evidence.
Viewing Foucault in the light of work by Continental and American philosophers, most notably Nietzsche, Habermas, Deleuze, Richard Rorty, Bernard Williams, and Ian Hacking, Genealogy as Critique shows that philosophical genealogy involves not only the critique of modernity but also its transformation. Colin Koopman engages genealogy as a philosophical tradition and a method for understanding the complex histories of our present social and cultural conditions. He explains how our understanding of Foucault can benefit from productive dialogue with philosophical allies to push Foucaultian genealogy a step further and elaborate a means of addressing our most intractable contemporary problems.
The science-fiction film The Matrix generated a great deal of philosophical interest. There are already three collections of philosophical papers either published or in the pipeline devoted to the film. Here, Colin McGinn takes a closer look at the film and comes up with some rather surprising conclusions.
According to noncognitivists, when we say that stealing is wrong, what we are doing is more like venting our feelings about stealing or encouraging one another not to steal, than like stating facts about morality. These ideas challenge the core not only of much thinking about morality and metaethics, but also of much philosophical thought about language and meaning. _Noncognitivism in Ethics_ is an outstanding introduction to these theories, ranging from their early history through the latest contemporary developments. Beginning with a general introduction to metaethics, Mark Schroeder introduces and assesses three principal kinds of noncognitivist theory: the speech-act theories of Ayer, Stevenson, and Hare, the expressivist theories of Blackburn and Gibbard, and hybrid theories. He pays particular attention both to the philosophical problems about what moral facts could be about or how they could matter which noncognitivism seeks to solve, and to the deep problems that it faces, including the task of explaining both the nature of moral thought and the complexity of moral attitudes, and the ‘Frege-Geach’ problem. Schroeder makes even the most difficult material accessible by offering crucial background along the way. Also included are exercises at the end of each chapter, chapter summaries, and a glossary of technical terms - making _Noncognitivism in Ethics_ essential reading for all students of ethics and metaethics.
Call the idea that states of perceptual awareness have intentional content, and in virtue of that aim at or represent ways the world might be, the ‘Content View.’ I argue that though Kant is widely interpreted as endorsing the Content View there are significant problems for any such interpretation. I further argue that given the problems associated with attributing the Content View to Kant, interpreters should instead consider him as endorsing a form of acquaintance theory. Though perceptual acquaintance is controversial in itself and in attribution to Kant, it promises to make sense of central claims within his critical philosophy.
To desire something is a condition familiar to everyone. It is uncontroversial that desiring has something to do with motivation, something to do with pleasure, and something to do with reward. Call these "the three faces of desire." The standard philosophical theory at present holds that the motivational face of desire presents its unique essence--to desire a state of affairs is to be disposed to act so as to bring it about. A familiar but less standard account holds the hedonic face of desire to reveal the true nature of desire. On this view, to desire something is to tend to feel pleasure if it seems that the desired state of affairs has been achieved, or displeasure if it seems otherwise, thus tying desire to feelings instead of actions. In Three Faces of Desire, Schroeder goes beyond actions and feelings to advance a novel and controversial theory of desire that puts the focus on desire's neglected face, reward. Informed by contemporary science as much as by the philosophical tradition, Three Faces of Desire discusses recent scientific discoveries that tell us much about the way that actions and feelings are produced in the brain. In particular, recent experiments reveal that a distinctive system is responsible for promoting action, on the one hand, and causing feelings of pleasure and displeasure, on the other. This system, the brain's reward system, is the causal origin of both action and feeling, and is the key to understanding the nature of desire.
I argue that Kant’s distinction between the cognitive roles of sensibility and understanding raises a question concerning the conditions necessary for objective representation. I distinguish two opposing interpretive positions—viz. Intellectualism and Sensibilism. According to Intellectualism all objective representation depends, at least in part, on the unifying synthetic activity of the mind. In contrast, Sensibilism argues that at least some forms of objective representation, specifically intuitions, do not require synthesis. I argue that there are deep reasons for thinking that Intellectualism is incompatible with Kant's view as expressed in the Transcendental Aesthetic. We can better see how Kant’s arguments in the first Critique may be integrated, I suggest, by examining his notion of the 'unity' [Einheit] of a representation. I articulate two distinct ways in which a representation may possess unity and claim that we can use these notions to integrate Kant’s arguments in the Aesthetic and the Transcendental Deduction without compromising the core claims of either Sensibilism or Intellectualism—that intuition is a form of objective representation independent of synthesis, and that the kind of objective representations that ground scientific knowledge of the world require synthesis by the categories.
Fitting Attitudes accounts of value analogize or equate being good with being desirable, on the premise that ‘desirable’ means not, ‘able to be desired’, as Mill has been accused of mistakenly assuming, but ‘ought to be desired’, or something similar. The appeal of this idea is visible in the critical reaction to Mill, which generally goes along with his equation of ‘good’ with ‘desirable’ and only balks at the second step, and it crosses broad boundaries in terms of philosophers’ other commitments. For example, Fitting Attitudes accounts play a central role both in T.M. Scanlon’s case against teleology, and in Michael Smith’s [unpublished] and Doug Portmore’s cases for it. And of course they have a long and distinguished history.
Symposium contribution on Mark Schroeder's Slaves of the Passions. Argues that Schroeder's account of agent-neutral reasons cannot be made to work, that the limited scope of his distinctive proposal in the epistemology of reasons undermines its plausibility, and that Schroeder faces an uncomfortable tension between the initial motivation for his view and the details of the view he develops.
We are now acutely aware, as if all of the sudden, that data matters enormously to how we live. How did information come to be so integral to what we can do? How did we become people who effortlessly present our lives in social media profiles and who are meticulously recorded in state surveillance dossiers and online marketing databases? What is the story behind data coming to matter so much to who we are?

In How We Became Our Data, Colin Koopman excavates early moments of our rapidly accelerating data-tracking technologies and their consequences for how we think of and express our selfhood today. Koopman explores the emergence of mass-scale record keeping systems like birth certificates and social security numbers, as well as new data techniques for categorizing personality traits, measuring intelligence, and even racializing subjects. This all culminates in what Koopman calls the “informational person” and the “informational power” we are now subject to. The recent explosion of digital technologies that are turning us into a series of algorithmic data points is shown to have a deeper and more turbulent past than we commonly think. Blending philosophy, history, political theory, and media theory in conversation with thinkers like Michel Foucault, Jürgen Habermas, and Friedrich Kittler, Koopman presents an illuminating perspective on how we have come to think of our personhood—and how we can resist its erosion.
Douglas Portmore has recently argued in this journal for a "promising result" – that combining teleological ethics with "evaluator relativism" about the good allows an ethical theory to account for deontological intuitions while "accommodat[ing] the compelling idea that it is always permissible to bring about the best available state of affairs." I show that this result is false. It follows from the indexical semantics of evaluator relativism that Portmore's compelling idea is false. I also try to explain what might have led to this misunderstanding.
What is it to have a reason? According to one common idea, the "Factoring Account", you have a reason to do A when there is a reason for you to do A which you have--which is somehow in your possession or grasp. In this paper, I argue that this common idea is false. But though my arguments are based on the practical case, the implications of this are likely to be greatest in epistemology: for the pitfalls we fall into when trying to defend the Factoring Account reflect very well the major developments in empiricist epistemology during the 20th century. I conjecture that this is because epistemologists have been--wrongly--wedded to the Factoring Account about evidence, which I conjecture is a certain kind of reason to believe.
The IHME Covid-19 prediction model has been one of the most influential Covid models in the United States. Early on, it received heavy criticism for understating the extent of the epidemic. I argue that this criticism was based on a misunderstanding of the model. The model was best interpreted not as attempting to forecast the actual course of the epidemic. Rather, it was attempting to make a conditional projection: telling us how the epidemic would unfold, given certain assumptions. This misunderstanding of the IHME’s model prevented the public from seeing how dire the model’s projections actually were.
The basic idea of expressivism is that for some sentences ‘P’, believing that P is not just a matter of having an ordinary descriptive belief. This is a way of capturing the idea that the meaning of some sentences either exceeds their factual/descriptive content or doesn’t consist in any particular factual/descriptive content at all, even in context. The paradigmatic application for expressivism is within metaethics, and holds that believing that stealing is wrong involves having some kind of desire-like attitude, with world-to-mind direction of fit, either in place of, or in addition to, being in a representational state of mind with mind-to-world direction of fit. Because expressivists refer to the state of believing that P as the state of mind ‘expressed’ by ‘P’, this view can also be described as the view that ‘stealing is wrong’ expresses a state of mind that involves a desire-like attitude instead of, or in addition to, a representational state of mind. According to some expressivists - unrestrained expressivists, as I’ll call them - there need be no special relationship among the different kinds of state of mind that can be expressed by sentences. Pick your favorite state of mind, the unrestrained expressivist allows, and there could, at least in principle, be a sentence that expressed it. Expressivists who seem to have been unrestrained plausibly include Ayer in Language, Truth, and Logic, and Simon Blackburn in many of his writings.
One of the central debates in contemporary Kant scholarship concerns whether Kant endorses a “conceptualist” account of the nature of sensory experience. Understanding the debate is crucial for getting a full grasp of Kant's theory of mind, cognition, perception, and epistemology. This paper situates the debate in the context of Kant's broader theory of cognition and surveys some of the major arguments for conceptualist and non-conceptualist interpretations of his critical philosophy.
Expressivism - the sophisticated contemporary incarnation of the noncognitivist research program of Ayer, Stevenson, and Hare - is no longer the province of metaethicists alone. Its comprehensive view about the nature of both normative language and normative thought has also recently been applied to many topics elsewhere in philosophy - including logic, probability, mental and linguistic content, knowledge, epistemic modals, belief, the a priori, and even quantifiers. Yet the semantic commitments of expressivism are still poorly understood and have not been very far developed. As argued within, expressivists have not yet even managed to solve the "negation problem" - to explain why atomic normative sentences are inconsistent with their negations. As a result, it is far from clear that expressivism even could be true, let alone whether it is. Being For seeks to evaluate the semantic commitments of expressivism, by showing how an expressivist semantics would work, what it can do, and what kind of assumptions would be required, in order for it to do it. Building on a highly general understanding of the basic ideas of expressivism, it argues that expressivists can solve the negation problem - but only in one kind of way. It shows how this insight paves the way for an explanatorily powerful, constructive expressivist semantics, which solves many of what have been taken to be the deepest problems for expressivism. But it also argues that no account with these advantages can be generalized to deal with constructions like tense, modals, or binary quantifiers. Expressivism, the book argues, is coherent and interesting, but false.
In the Book of Common Prayer’s Rite II version of the Eucharist, the congregation confesses, “we have sinned against you in thought, word, and deed”. According to this confession we wrong God not just by what we do and what we say, but also by what we think. The idea that we can wrong someone not just by what we do, but by what we think or what we believe, is a natural one. It is the kind of wrong we feel when those we love believe the worst about us. And it is one of the salient wrongs of racism and sexism. Yet it is puzzling to many philosophers how we could wrong one another by virtue of what we believe about them. This paper defends the idea that we can morally wrong one another by what we believe about them from two such puzzles. The first puzzle concerns whether we have the right sort of control over our beliefs for them to be subject to moral evaluation. And the second concerns whether moral wrongs would come into conflict with the distinctively epistemic standards that govern belief. Our answer to both puzzles is that the distinctively epistemic standards governing belief are not independent of moral considerations. This account of moral encroachment explains how epistemic norms governing belief are sensitive to the moral requirements governing belief.
Kant holds that the applicability of the moral ‘ought’ depends on a kind of agent-causal freedom that is incompatible with the deterministic structure of phenomenal nature. I argue that Kant understands this determinism to threaten not just morality but the very possibility of our status as rational beings. Rational beings exemplify “cognitive control” in all of their actions, including not just rational willing and the formation of doxastic attitudes, but also more basic cognitive acts such as judging, conceptualizing, and synthesizing.
Recently, a number of philosophers have argued that we can and should “consequentialize” non-consequentialist moral theories, putting them into a consequentialist framework. I argue that these philosophers, usually treated as a group, in fact offer three separate arguments, two of which are incompatible. I show that none represent significant threats to a committed non-consequentialist, and that the literature has suffered due to a failure to distinguish these arguments. I conclude by showing that the failure of the consequentializers’ arguments has implications for disciplines, such as economics, logic, decision theory, and linguistics, which sometimes use a consequentialist structure to represent non-consequentialist ethical theories.
Multiply realizable properties are those whose realizers are physically diverse. It is often argued that theories which contain them are ipso facto irreducible. These arguments assume that physical explanations are restricted to the most specific descriptions possible of physical entities. This assumption is descriptively false, and philosophically unmotivated. I argue that it is a holdover from the late positivist axiomatic view of theories. A semantic view of theories, by contrast, correctly allows scientific explanations to be couched in the most perspicuous, powerful language available. On a semantic view, traditional notions of multiple realizability are thus very hard to motivate. At best, one must abandon either the idea that multiple realizability is an interesting scientific notion, or else admit that multiply realizable properties do not automatically block scientific reductions.
It is widely thought that Bayesian confirmation theory has provided a solution to Hempel's Paradox (the Ravens Paradox). I discuss one well-known example of this approach, by John Mackie, and argue that it is unconvincing. I then suggest an alternative solution, which shows that the Bayesian approach is altogether mistaken. Nicod's Condition should be rejected because a generalisation is not confirmed by any of its instances if it is not law-like. And even law-like non-basic empirical generalisations, which are expressions of assumed underlying causal regularities, are not so confirmed if they are absurd in the light of our causal background knowledge or if their instances are not also possible instances of the relevant causal claim.
In the mid-eighteenth century David Hume argued that successful prediction tells us nothing about the truth of the predicting theory. But physical theory routinely predicts the values of observable magnitudes within very small ranges of error. The chance of this sort of predictive success without a true theory suggests that Hume's argument is flawed. However, Colin Howson argues that there is no flaw and examines the implications of this disturbing conclusion; he also offers a solution to one of the central problems of Western philosophy, the problem of induction.
Particularists in ethics emphasize that the normative is holistic, and invite us to infer with them that it therefore defies generalization. This has been supposed to present an obstacle to traditional moral theorizing, to have striking implications for moral epistemology and moral deliberation, and to rule out reductive theories of the normative, making it a bold and important thesis across the areas of normative theory, moral epistemology, moral psychology, and normative metaphysics. Though particularists emphasize the importance of the holism of the normative, however, it is not something that they have been able to explain. In this paper I’ll show how to use a small number of simple and, I’ll argue, independently compelling assumptions in order to both predict and explain the holistic features of the normative with respect to the non-normative. The basic idea of the paper is simple. It is that normative claims are holistic because they are general, rather than because they defy generalization.
Daniel Whiting has argued, in this journal, that Mark Schroeder’s analysis of knowledge in terms of subjectively and objectively sufficient reasons for belief makes wrong predictions in fake barn cases. Schroeder has replied that this problem may be avoided if one adopts a suitable account of perceptual reasons. I argue that Schroeder’s reply fails to deal with the general worry underlying Whiting’s purported counterexample, because one can construct analogous potential counterexamples that do not involve perceptual reasons at all. Nevertheless, I claim that it is possible to overcome Whiting’s objection, by showing that it rests on an inadequate characterization of how defeat works in the examples in question.
Many economic measures are structured to reflect ethical values. I describe three attitudes towards this: maximalism, according to which we should aim to build all relevant values into measures; minimalism, according to which we should aim to keep values out of measures; and an intermediate view. I argue the intermediate view is likely correct, but existing versions are inadequate. In particular, economists have strong reason to structure measures to reflect fixed, as opposed to user-assessable, values. This implies that, despite disagreement about precisely how to do so, economists should standardly adjust QALYs and DALYs to reflect egalitarian values.
Expressivists have a problem with negation. The problem is that they have not, to date, been able to explain why ‘murdering is wrong’ and ‘murdering is not wrong’ are inconsistent sentences. In this paper, I explain the nature of the problem, and why the best efforts of Gibbard, Dreier, and Horgan and Timmons don’t solve it. Then I show how to diagnose where the problem comes from, and consequently how it is possible for expressivists to solve it. Expressivists should accept this solution, I argue, because it is demonstrably the only way of avoiding the problem, and because it generalizes. Once we see how to solve the negation problem, I show, it becomes easy to state a constructive, compositional expressivist semantics for a purely normative language with the expressive power of propositional logic, in which we can for the first time give explanatory, formally adequate expressivist accounts of logical inconsistency, logical entailment, and logical validity. As a corollary, I give what I take to be the first real expressivist explanation of why Geach’s original moral modus ponens argument is genuinely logically valid. This proves that the problem with expressivism cannot be that it can’t account for the logical properties of complex normative sentences. But it does not show that the same solution can work for a language with both normative and descriptive predicates, let alone that expressivists are able to deal with more complex linguistic constructions like tense, modals, or even quantifiers. In the final section, I show what kind of constraints the solution offered here would place expressivists under, in answering these further questions.
In a well-known paper, Timothy Williamson claimed to prove with a coin-flipping example that infinitesimal-valued probabilities cannot save the principle of Regularity, because on pain of inconsistency the event ‘all tosses land heads’ must be assigned probability 0, whether the probability function is hyperreal-valued or not. A premise of Williamson’s argument is that two infinitary events in that example must be assigned the same probability because they are isomorphic. It was argued by Howson that the claim of isomorphism fails, but a more radical objection to Williamson’s argument is that it had been, in effect, refuted long before it was published.
The identity theory of truth takes on different forms depending on whether it is combined with a dual relation or a multiple relation theory of judgment. This paper argues that there are two significant problems for the dual relation identity theorist regarding thought’s answerability to reality, neither of which takes a grip on the multiple relation identity theory.
Imagine you are walking down a city street. It is windy and raining. Amidst the bustle you see a young woman. She sits under a railway bridge, hardly protected from the rain and holds a woolen hat containing a small number of coins. You can see that she trembles from the cold. Or imagine seeing an old woman walking in the street at dusk, clutching her bag with one hand and a walking stick with the other. A group of male youths walk behind her without overtaking, drunk and in the mood for mischief. It doesn't need an academic to say what vulnerability is. We can all see it, much more often than we care to.
Our new journal Philosophies is devoted to the search for a synthesis of philosophical and scientific inquiry. It promotes philosophical work derived from the experience of diverse scientific disciplines.
This book offers a lucid and highly readable account of Wittgenstein's philosophy, framed against the background of his extraordinary life and character. Woven together with a biographical narrative, the chapters explain the key ideas of Wittgenstein's work, from his first book, the Tractatus Logico-Philosophicus, to his mature masterpiece, the Philosophical Investigations. Severin Schroeder shows that at the core of Wittgenstein's later work lies a startlingly original and subversive conception of the nature of philosophy. In accordance with this conception, Wittgenstein offers no new philosophical doctrines to replace his earlier ones, but seeks to demonstrate how all philosophical theorizing is the result of conceptual misunderstanding. He first diagnoses such misunderstanding at the core of his own earlier philosophy of language and then subjects philosophical views and problems about various mental phenomena (understanding, sensations, the will) to a similar therapeutic analysis. Schroeder provides a clear and careful account of the main arguments offered by Wittgenstein. He concludes by considering some critical responses to Wittgenstein's work, assessing its legacy for contemporary philosophy.

Wittgenstein is ideal for students seeking a clear and concise introduction to the work of this seminal twentieth-century philosopher.
There is a growing consensus among philosophers of science that core parts of the scientific process involve non-epistemic values. This undermines the traditional foundation for public trust in science. In this article I consider two proposals for justifying public trust in value-laden science. According to the first, scientists can promote trust by being transparent about their value choices. On the second, trust requires that the values of a scientist align with the values of an individual member of the public. I argue that neither of these proposals works and suggest an alternative that does better. When scientists must appeal to values in the course of their research, they should appeal to democratic values: the values of the public or its representatives.