This innovative book is the first to couch the debate about animals in the language of justice, and the first to develop both ideal and nonideal theories of justice for animals. It rejects the abolitionist animal rights position in favor of a revised version of animal rights centering on sentience.
In this article, a new, idealizing-hermeneutic methodological approach to developing a theory of philosophical arguments is presented and carried out. The basis for this is a theory of ideal philosophical theory types developed from the analysis of historical examples. According to this theory, the following ideal types of theory exist in philosophy: 1. descriptive-nomological, 2. idealizing-hermeneutic, 3. technical-constructive, 4. ontic-practical. These types of theories are characterized in particular by what their basic types of theses are. The main task of this article is then to determine the types of arguments that are suitable for justifying these types of theses. Surprisingly, practical arguments play a key role here.
It has recently been suggested that a distinctive metaphysical relation—'Grounding'—is ultimately at issue in contexts in which some goings-on are said to hold 'in virtue of', be 'metaphysically dependent on', or be 'nothing over and above' some others. Grounding is supposed to do good work in illuminating metaphysical dependence. I argue that Grounding is unsuited to do this work. To start, Grounding alone cannot do this work, for bare claims of Grounding leave open such basic questions as whether Grounded goings-on exist, whether they are reducible to or rather distinct from Grounding goings-on, whether they are efficacious, and so on; but in the absence of answers to such basic questions, we are not in a position to assess the associated claims or theses concerning metaphysical dependence. There is no avoiding appeal to the specific metaphysical relations typically at issue in investigations into dependence—for example, type or token identity, functional realization, classical mereological parthood, the set membership relation, the proper subset relation, the determinable/determinate relation, and so on—which are capable of answering these questions. But, I argue, once the specific relations are on the scene, there is no need for Grounding.
Collective emotions are at the heart of any society and become evident in gatherings, crowds, or responses to widely salient events. However, they remain poorly understood and conceptualized in scientific terms. Here, we provide first steps towards a theory of collective emotions. We first review accounts of the social and cultural embeddedness of emotion that contribute to understanding collective emotions from three broad perspectives: face-to-face encounters, culture and shared knowledge, and identification with a social collective. In discussing their strengths and shortcomings and highlighting areas of conceptual overlap, we translate these views into a number of bottom-up mechanisms that explain collective emotion elicitation on the levels of social cognition, expressive behavior, and social practices.
According to one tradition, uttering an indicative conditional involves performing a special sort of speech act: a conditional assertion. We introduce a formal framework that models this speech act. Using this framework, we show that any theory of conditional assertion validates several inferences in the logic of conditionals, including the False Antecedent inference. Next, we determine the space of truth-conditional semantics for conditionals consistent with conditional assertion. The truth value of any such conditional is settled whenever the antecedent is false, and whenever the antecedent is true and the consequent is false. Then, we consider the space of dynamic meanings consistent with the theory of conditional assertion. We develop a new family of dynamic conditional-assertion operators that combine a traditional test operator with an update operation.
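The truth-conditional constraint the abstract describes can be made concrete with a small sketch. This is my own illustration, not the authors' formalism: I gloss "settled when the antecedent is false" as true in that case (which is what validates the False Antecedent inference), and false when the antecedent is true and the consequent false. The enumeration below shows that these two constraints pin down all but one cell of a two-valued truth table.

```python
from itertools import product

# A two-valued semantics for "if A then C" is a map from (A, C) pairs
# to truth values. Constraints, on my gloss of the abstract:
#   1. If the antecedent is false, the conditional is true
#      (this validates the False Antecedent inference).
#   2. If the antecedent is true and the consequent false, it is false.
# Only the cell (A=True, C=True) is left open by these constraints.

def satisfies_constraints(table):
    """table maps (A, C) -> bool."""
    return (all(table[(False, c)] for c in (False, True))  # constraint 1
            and table[(True, False)] is False)             # constraint 2

cells = list(product([False, True], repeat=2))
candidates = []
for values in product([False, True], repeat=len(cells)):
    table = dict(zip(cells, values))
    if satisfies_constraints(table):
        candidates.append(table)

print(len(candidates))  # 2: only the (True, True) cell is free

# The candidate assigning True to (True, True) is the material conditional.
material = {(a, c): (not a) or c for a, c in cells}
print(material in candidates)  # True
```

On this toy encoding, the material conditional is one of exactly two survivors, which matches the abstract's claim that the constraints leave the semantics only partially determined.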
How do we define religious experiences? And how do they relate to spiritual experiences? The author claims that the cognitive turn in science gives us new opportunities to map the territory of religion and spirituality. In line with other authors, he proposes a building block approach of cognitive mechanisms that can deal with questions regarding the specificity, origin, and complexity of religious experiences. Two concepts are presented that bridge the great divide presumed to exist between the sciences that study the brain and the humanities, namely the encultured brain and predictive minds. In section three, six building blocks of the structure of religious experience are formulated. New in his approach is 'the unexpected possible' as the distinctive ground between normal experiences and what we consider spiritual and religious experiences. Finally, the author presents a critical reflection on his proposal and on challenges for the road ahead.
Preface and Acknowledgments
Introduction
PART I: Intentionality
Chapter 1: Fodor's Guide to Mental Representation: The Intelligent Auntie's Vade-Mecum
Chapter 2: Semantics, Wisconsin Style
Chapter 3: A Theory of Content, I: The Problem
Chapter 4: A Theory of Content, II: The Theory
Chapter 5: Making Mind Matter More
Chapter 6: Substitution Arguments and the Individuation of Beliefs
Chapter 7: Stephen Schiffer's Dark Night of the Soul: A Review of Remnants of Meaning
PART II: Modularity
Chapter 8: Précis of The Modularity of Mind
Chapter 9: Why Should the Mind Be Modular?
Chapter 10: Observation Reconsidered
Appendix: A Reply to Churchland's 'Perceptual Plasticity and Theoretical Neutrality'
References
Index of Names
This book presents a comprehensive overview of what the criminal law would look like if organised around the principle that those who deserve punishment should receive punishment commensurate with, but no greater than, that which they deserve. Larry Alexander and Kimberly Kessler Ferzan argue that desert is a function of the actor's culpability, and that culpability is a function of the risks of harm to protected interests that the actor believes he is imposing and his reasons for acting in the face of those risks. The authors deny that resultant harms, as well as unperceived risks, affect the actor's desert. They thus reject punishment for inadvertent negligence as well as for intentions or preparatory acts that are not risky. Alexander and Ferzan discuss the reasons for imposing risks that negate or mitigate culpability, the individuation of crimes, and omissions.
Should you care less about your distant future? What about events in your life that have already happened? How should the passage of time affect your planning and assessment of your life? Most of us think it is irrational to ignore the future but harmless to dismiss the past. But this book argues that rationality requires temporal neutrality.
I argue that the theory of chance proposed by David Lewis has three problems: (i) it is time asymmetric in a manner incompatible with some of the chance theories of physics, (ii) it is incompatible with statistical mechanical chances, and (iii) the content of Lewis's Principal Principle depends on how admissibility is cashed out, but there is no agreement as to what admissible evidence should be. I propose two modifications of Lewis's theory which resolve these difficulties. I conclude by tentatively proposing a third modification of Lewis's theory, one which explains many of the common features shared by the chance theories of physics.
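For readers unfamiliar with it, the Principal Principle mentioned in (iii) is standardly stated as follows, where $C$ is a reasonable initial credence function, $A$ any proposition, $X$ the proposition that the chance of $A$ at time $t$ equals $x$, and $E$ any evidence admissible at $t$:

```latex
C(A \mid X \wedge E) = x
```

The dispute the author flags concerns precisely how the admissibility condition on $E$ should be understood.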
Margaret Gilbert offers an incisive new approach to a classic problem of political philosophy: when and why should I do what the law tells me to do? Do I have special obligations to conform to the laws of my own country and, if so, why? In what sense, if any, must I fight in wars in which my country is engaged, if ordered to do so, or suffer the penalty the law imposes for law-breaking - including the death penalty? Gilbert's accessible book offers a provocative and compelling case in favour of citizens' obligations to the state, while examining how these can be squared with self-interest and other competing considerations.
This paper represents a preliminary investigation relating Bernard Lonergan’s thought to health science and the healing arts. First, I provide background for basic elements of Lonergan’s theoretical terminology that I employ. As inquiry is the engine of Lonergan’s method, next I specify two questions that underlie medical insights and define several terms, including health, disease, and illness, in relation to these questions. Then I expand the frame of reference to include all disciplines involved in the cycle of clinical interaction under the heading health science and the healing arts. Finally, I analyze the cycle of clinical interaction in terms of Lonergan’s cognitive theory. I compare and contrast my analysis, based on Lonergan, with that of Pellegrino, Thomasma and Sulmasy as I proceed. In closing, I comment briefly on the next stage of this project regarding Lonergan’s theory of the human good in relation to the practice of the healing arts.
I develop a basic theory of content within the framework of truthmaker semantics and, in the second part, consider some of the applications to subject matter, common content, logical subtraction and ground.
The distinguished philosopher Robert M. Adams presents a major work on virtue, which is once again a central topic in ethical thought. A Theory of Virtue is a systematic, comprehensive framework for thinking about the moral evaluation of character, proposing that virtue is chiefly a matter of being for what is good, and that virtues must be intrinsically excellent and not just beneficial or useful.
Amartya Sen argues that for the advancement of justice identification of ‘perfect’ justice is neither necessary nor sufficient. He replaces ‘perfect’ justice with comparative justice. Comparative justice limits itself to comparing social states with respect to degrees of justice. Sen’s central thesis is that identifying ‘perfect’ justice and comparing imperfect social states are ‘analytically disjoined’. This essay refutes Sen’s thesis by demonstrating that to be able to make adequate comparisons we need to identify and integrate criteria of comparison. This is precisely the aim of a theory of justice (such as John Rawls’s theory): identifying, integrating and ordering relevant principles of justice. The same integrated criteria that determine ‘perfect’ justice are needed to be able to adequately compare imperfect social states. Sen’s alternative approach, which is based on social choice theory, is incapable of avoiding contrary, indeterminate or incoherent directives where plural principles of justice conflict.
This work offers a new theory of what it means to be a legal person and suggests that it is best understood as a cluster property. The book explores the origins of legal personhood, the issues afflicting a traditional understanding of the concept, and the numerous debates surrounding the topic.
Despite its central role in our cognitive lives, rational belief revision has received relatively little attention from epistemologists. This dissertation begins to fill that gap. In particular, we explore the phenomenon of defeasible epistemic justification, i.e., justification that can be lost as well as gained by epistemic agents. We begin by considering extant theories of defeat, according to which defeaters are whatever causes a loss of justification or things that somehow neutralize one's reasons for belief. Both of these theories are extensionally and explanatorily inadequate and so are rejected. We proceed to develop a novel theory of defeat according to which defeaters are reasons against belief. According to this theory, the dynamics of justification are explained by the competition between reasons for and reasons against belief. We find that this theory not only handles the counter-examples that felled the previous theories but also does a fair job in explaining the various aspects of the phenomenon of defeat. Furthermore, this theory accomplishes this without positing any novel entities or mechanisms; according to this theory, defeaters are epistemic reasons against belief, the mirror image of our epistemic reasons for belief, rather than sui generis entities.
This is the first full-length presentation of a republican alternative to the liberal and communitarian theories that have dominated political philosophy in recent years. The latest addition to the acclaimed Oxford Political Theory series, Pettit's eloquent and compelling account opens with an examination of the traditional republican conception of freedom as non-domination, contrasting this with established negative and positive views of liberty. The first part of the book traces the rise and decline of this conception, displays its many attractions, and makes a case for why it should still be regarded as a central political ideal. The second part of the book looks at what the implementation of the ideal would imply for substantive policy-making, constitutional and democratic design, regulatory control and the relation between state and civil society. Prominent in this account is a novel concept of democracy, under which government is exposed to systematic contestation, and a vision of relations between state and society founded upon civility and trust. Pettit's powerful and insightful new work offers not only a unified, theoretical overview of the many strands of republican ideas, but also a new and sophisticated perspective on studies in related fields including the history of ideas, jurisprudence, and criminology.
The paper has two parts: First, I describe a relatively popular thesis in the philosophy of propositional attitudes, worthy of the name ‘taking tense seriously’; and I distinguish it from a family of views in the metaphysics of time, namely, the A-theories. Once the distinction is in focus, a skeptical worry arises. Some A-theorists maintain that the difference between past, present, and future is to be drawn in terms of what exists: growing-block theorists eschew ontological commitment to future entities; presentists, to future and past entities. Others think of themselves as A-theorists but exclude no past or future things from their ontology. The metaphysical skeptic suspects that their attempt to articulate an ‘eternalist’ version of the A-theory collapses into merely ‘taking tense seriously’, a thesis that does not imply the A-theory. The second half of the paper is the search for a stable eternalist A-theory. It includes discussion of temporary intrinsics, temporal parts, and truth.
This paper presents a blueprint for an ecological cognitive architecture. Ecological psychology, I contend, must be complemented with a story about the role of the CNS in perception, action, and cognition. To arrive at such a story while staying true to the tenets of ecological psychology, it will be necessary to flesh out the central metaphor according to which the animal perceives its environment by ‘resonating’ to information in energy patterns: what is needed is a theory of resonance. I offer here the two main elements of such a theory: a framework and a methodology.
The theory of truthmaking has long aroused skepticism from philosophers who believe it to be tangled up in contentious ontological commitments and unnecessary theoretical baggage. In this book, Jamin Asay shows why that suspicion is unfounded. Challenging the current orthodoxy that truthmaking's fundamental purpose is to be a tool for explaining why truths are true, Asay revives the conception of truthmaking as fundamentally an exercise in ontology: a means for coordinating one's beliefs about what is true and one's ontological commitments. He goes on to show how truthmaking connects to analyticity, truth, and realism, and how it contributes to debates over nominalism, presentism, mathematical objects, and fictional characters. His book is the most comprehensive exploration to date into what truthmaking is and how it contributes to metaphysical debates across philosophy, and will interest a wide range of readers in metaphysics and beyond.
What system of morals should rational people select as the best for society? Using a contemporary psychological theory of action and of motivation, Richard Brandt's Oxford lectures argue that the purpose of living should be to strive for the greatest good for the largest number of people. Brandt's discussions range from the concept of welfare to conflict between utilitarian moral codes and the dictates of self-interest.
Though there is a wide and varied literature on ethical supererogation, there has been almost nothing written about its epistemic counterpart, despite an intuitive analogy between the two fields. This paper seeks to change this state of affairs. I will begin by showing that there are examples which intuitively feature epistemically supererogatory doxastic states. Next, I will present a positive theory of epistemic supererogation that can vindicate our intuitions in these examples, in an explanation that parallels a popular theory of ethical supererogation. Roughly, I will argue that a specific type of epistemic virtue—the ability to creatively think up plausible hypotheses given a body of evidence—is not required of epistemic agents. Thus, certain exercises of this virtue can result in supererogatory doxastic states. In presenting this theory, I will also show how thinking about epistemic supererogation can provide us a new way forward in the debate about the uniqueness thesis for epistemic rationality.
An outline of a theory of values-based labeling as a social movement argues that it is motivated by the need to re-embed the agro-food economy in the larger social economy. A review of some basic premises of embeddedness theories derived from the work of Karl Polanyi reveals their connection to particular values-based labeling efforts. From this perspective, values-based labeling presents itself as primarily an ethical and moral effort to counter unsustainable trends within presently existing capitalism. These labels distinguish themselves from ordinary commercial labels by a focus on process and on quality. Evaluating the transformative potential and progressive nature of values-based labeling poses a key challenge. One avenue for gauging this potential is conventions theory. This approach can be adapted to consider the decision making processes that go on within values-based labeling groups, as well as consumer decision making based on such labels, as instances of what is termed a politics of ethical judgment. The conclusion emphasizes the need for more research in this area and suggests how it could be furthered.
The authors outline a theory of conditionals of the form If A then C and If A then possibly C. The 2 sorts of conditional have separate core meanings that refer to sets of possibilities. Knowledge, pragmatics, and semantics can modulate these meanings. Modulation can add information about temporal and other relations between antecedent and consequent. It can also prevent the construction of possibilities to yield 10 distinct sets of possibilities to which conditionals can refer. The mental representation of a conditional normally makes explicit only the possibilities in which its antecedent is true, yielding other possibilities implicitly. Reasoners tend to focus on the explicit possibilities. The theory predicts the major phenomena of understanding and reasoning with conditionals.
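The core idea summarized above, a conditional's meaning as a set of possibilities of which the mental representation makes explicit only those where the antecedent is true, can be sketched as follows. The encoding is my own toy illustration, not the authors' model theory:

```python
# Possibilities are encoded as pairs of truth values (A, C).
# The core meaning of "If A then C" is the set of possibilities
# consistent with the conditional; the mental representation makes
# explicit only those in which A holds, leaving the rest implicit.

CORE_MEANING = {
    (True, True),    # A and C
    (False, True),   # not-A and C
    (False, False),  # not-A and not-C
}

def explicit_models(meaning):
    """Possibilities reasoners represent explicitly: antecedent true."""
    return {p for p in meaning if p[0]}

def implicit_models(meaning):
    """Possibilities left merely implicit: antecedent false."""
    return {p for p in meaning if not p[0]}

print(explicit_models(CORE_MEANING))       # {(True, True)}
print(len(implicit_models(CORE_MEANING)))  # 2
```

On this encoding, a reasoner who focuses on the explicit set, as the theory predicts, considers only the A-and-C possibility unless prompted to flesh out the implicit ones.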
This innovative approach to freedom starts from an account of what we mean by describing someone, in a psychological vein, as a free subject. Pettit develops an argument as to what it is that makes someone free in that basic sense, and then goes on to derive the implications of the approach for issues of freedom in political theory. Freedom in the subject is equated with the person's being fit to be held responsible and to be authorized as a partner in interaction. This book is unique among contemporary approaches - although it is true to the spirit of classical writers like Hobbes and Kant - in seeking a theory that applies to psychological issues of free agency and free will as well as to political issues in the theory of the free state and the free constitution. The driving thesis is that it is only by connecting up the different issues of freedom, psychological and political, that we can fully appreciate the nature of the questions involved and the requirements for their resolution. The book does not seek a comprehensive reach just for its own sake, but rather for the sake of the illumination it provides. A Theory of Freedom is a ground-breaking volume which will be of wide interest to scholars and students in political philosophy and political science.
This book develops a new theory of determinism that offers fresh insights into questions of how intentions and other mental events relate to neural events, how both come about, and how both result in actions. Honderich tests his theory against neuroscience, quantum theory, and possible philosophical refutations, and discusses the consequences of determinism and near-determinism for life-hopes, knowledge, and personal feelings.
Children must be taught morality. They must be taught to recognise the authority of moral standards and to understand what makes them authoritative. But there's a problem: the content and justification of morality are matters of reasonable disagreement among reasonable people. This makes it hard to see how educators can secure children's commitment to moral standards without indoctrinating them. In A Theory of Moral Education, Michael Hand tackles this problem head on. He sets out to show that moral education can and should be fully rational. It is true that many moral standards and justificatory theories are controversial, and educators have an obligation to teach these nondirectively, with the aim of enabling children to form their own considered views. But reasonable moral disagreement does not go all the way down: some basic moral standards are robustly justified, and these should be taught directively, with the aim of bringing children to recognise and understand their authority. This is an original and important contribution to the philosophy of moral education, which lays a new theoretical foundation for the urgent practical task of teaching right from wrong.
A familiar feature of our moral responsibility practices is the offering of pleas: considerations, such as “That was an accident” or “I didn’t know what else to do”, that attempt to get agents accused of wrongdoing off the hook. But why do these pleas have the normative force they do in fact have? Why does physical constraint excuse one from responsibility, while forgetfulness or laziness does not? I begin by laying out R. Jay Wallace’s (Responsibility and the Moral Sentiments, 1994) theory of the normative force of excuses and exemptions. For each category of plea, Wallace offers a single governing moral principle that explains their normative force. The principle he identifies as governing excuses is the Principle of No Blameworthiness without Fault: an agent is blameworthy only if he has done something wrong. The principle he identifies as governing exemptions is the Principle of Reasonableness: an agent is morally accountable only if he is normatively competent. I argue that Wallace’s theory of exemptions is sound, but that his account of the normative force of excuses is problematic, in that it fails to explain the full range of excuses we offer in our practices, especially the excuses of addiction and extreme stress. I then develop a novel account of the normative force of excuses, which employs what I call the “Principle of Reasonable Opportunity,” that can explain the full range of excuses we offer and that is deeply unified with Wallace’s theory of the normative force of exemptions. An important implication of the theory I develop is that moral responsibility requires free will.
The topic of a priori knowledge is approached through the theory of evidence. A shortcoming in traditional formulations of moderate rationalism and moderate empiricism is that they fail to explain why rational intuition and phenomenal experience count as basic sources of evidence. This explanatory gap is filled by modal reliabilism -- the theory that there is a qualified modal tie between basic sources of evidence and the truth. This tie to the truth is then explained by the theory of concept possession: this tie is a consequence of what, by definition, it is to possess (i.e., to understand) one’s concepts. A corollary of the overall account is that the a priori disciplines (logic, mathematics, philosophy) can be largely autonomous from the empirical sciences.