A central component of Bernard Williams' political realism is the articulation of a standard of legitimacy from within politics itself: LEG. This standard is presented as basic, inherent in all political orders and the best way to underwrite fundamental liberal principles particular to the modern state, including basic human rights. It does not require, according to Williams, a wider set of liberal values. In the following, I show that where Williams restricts LEG to generating only minimal political protections, seeking to isolate his account of political legitimacy from a range of liberal principles, this is neither internal to, nor necessarily demanded by, the specifically political account of LEG. Instead, the limitation depends upon his wider ethical thought.
The paper questions the extent to which MacIntyre’s current ethical and political outlook should be traced to a project begun in After Virtue. It is argued that, instead, a critical break comes in 1985 with his adoption of a ‘Thomistic Aristotelian’ standpoint. After Virtue’s ‘positive thesis’, by contrast, is a distinct position in MacIntyre’s intellectual journey, and the standpoint of After Virtue embodies substantial commitments not only in conflict with, but antithetical to, MacIntyre’s later worldview—most clearly illustrated in the contrasting positions on moral conflict and tragedy.
From Flesh Gordon to Alex in Wonderland, title parodies have been a stock-in-trade of low comedy. We may not anticipate a tactical similarity between the mayhem of Mad magazine's movie reviews and the titles of major scientific works, yet two important nineteenth-century critiques of Darwin parodied his most famous phrases in their headings.
The question of whether faith in God is reasonable is of renewed interest in today’s academy. In light of this interest, as well as the rise of militant religion and terrorism and the emergent reaction by neo-atheism, this volume considers this important question from the views of contemporary scientists, philosophers, and, in a more novel fashion, rhetoricians. It comprises a public debate between William Lane Craig, supporting the position that faith in God is reasonable, and Alex Rosenberg, arguing against that position. Scholars in the aforementioned fields then respond to the debate, representing both theistic and atheistic positions. The book concludes with rejoinders from Craig and Rosenberg.
Alex Oliver and Timothy Smiley provide a new account of plural logic. They argue that there is such a thing as genuinely plural denotation in logic, and expound a framework of ideas that includes the distinction between distributive and collective predicates, the theory of plural descriptions, multivalued functions, and lists.
Origins: Alex Atala, Fernando and Humberto Campana -- Present: Fernando and Humberto Campana and Jum Nakao -- Intermezzo: conviviality: Jum Nakao and collaborators -- Destinations: Alex Atala and Jum Nakao -- Interviews -- A bit of history.
Philosophy of Science is a mid-level text for students with some grounding in philosophy. It introduces the questions that drive enquiry in the philosophy of science, and aims to educate readers in the main positions, problems and arguments in the field today. Alex Rosenberg is certainly well qualified to write such an introduction. His works cover a large area of the philosophy of natural and social sciences. In addition, the author of the argument that the ‘queen of the social sciences’, economics, is not a science at all, can be counted on to show how the philosophy of science can be relevant to the understanding of the status of scientific knowledge and can provide a critical assessment of practitioners’ view of their field.
Lewis et al. (2011) attempted to restore the reputation of Samuel George Morton, a 19th century physician who reported on the skull sizes of different folk-races. Whereas Gould (1978) claimed that Morton’s conclusions were invalid because they reflected unconscious bias, Lewis et al. alleged that Morton’s findings were, in fact, supported, and Gould’s analysis biased. We take strong exception to Lewis et al.’s thesis that Morton was “right.” We maintain that Gould was right to reject Morton’s analysis as inappropriate and misleading, but wrong to believe that a more appropriate analysis was available. Lewis et al. fail to recognize that there is, given the dataset available, no appropriate way to answer any of the plausibly interesting questions about the “populations” in question (which in many cases are not populations in any biologically meaningful sense). We challenge the premise shared by both Gould and Lewis et al. that Morton’s confused data can be used to draw any meaningful conclusions. This, we argue, reveals the importance of properly focusing on the questions asked, rather than more narrowly on the data gathered.
Using an economic bargaining game, we tested for the existence of two phenomena related to social norms, namely norm manipulation – the selection of an interpretation of the norm that best suits an individual – and norm evasion – the deliberate, private violation of a social norm. We found that the manipulation of a norm of fairness was characterized by a self-serving bias in beliefs about what constituted normatively acceptable behaviour, so that an individual who made an uneven bargaining offer not only genuinely believed it was fair, but also believed that recipients found it fair, even though recipients of the offer considered it to be unfair. In contrast, norm evasion operated as a highly explicit process. When they could do so without the recipient's knowledge, individuals made uneven offers despite knowing that their behaviour was unfair.
The present volume, comprising nineteen articles by renowned scholars, is divided into three sections, namely, Buddhist, Jaina, and Hindu Philosophical Researches.
This paper develops a critical response to John Beatty’s recent (2006) engagement with Stephen Jay Gould’s claim that evolutionary history is contingent. Beatty identifies two senses of contingency in Gould’s work: an unpredictability sense and a causal dependence sense. He denies that Gould associates contingency with stochastic phenomena, such as drift. In reply to Beatty, this paper develops two main claims. The first is an interpretive claim: Gould really thinks of contingency as having to do with stochastic effects at the level of macroevolution, and in particular with unbiased species sorting. This notion of contingency as macro-level stochasticity incorporates both the causal dependence and the unpredictability senses of contingency. The second claim is more substantive: recent attempts by other scientists to put Gould’s claim to the test fail to engage with the hypothesis that species sorting sometimes resembles a lottery. Gould’s claim that random sorting is a significant macroevolutionary phenomenon remains an intriguing and largely untested live hypothesis about evolution.
Much of Stephen Jay Gould’s legacy is dominated by his views on the contingency of evolutionary history expressed in his classic Wonderful Life. However, Gould also campaigned relentlessly for a “nomothetic” paleontology. How do these commitments hang together? I argue that Gould’s conception of science and natural law combined with his commitment to contingency to produce an evolutionary science centered around the formulation of higher-level evolutionary laws.
For many epistemologists, and for many philosophers more broadly, it is axiomatic that rationality requires you to take the doxastic attitudes that your evidence supports. Yet there is also another current in our talk about rationality. On this usage, rationality is a matter of the right kind of coherence between one's mental attitudes. Surprisingly little work in epistemology is explicitly devoted to answering the question of how these two currents of talk are related. But many implicitly assume that evidence-responsiveness guarantees coherence, so that the rational impermissibility of incoherence will just fall out of the putative requirement to take the attitudes that one's evidence supports, and so that coherence requirements do not need to be theorized in their own right, apart from evidential reasons. In this paper, I argue that this is a mistake, since coherence and evidence-responsiveness can in fact come into conflict. More specifically, I argue that in cases of misleading higher-order evidence, there can be a conflict between believing what one's evidence supports and satisfying a requirement that I call “inter-level coherence”. This illustrates why coherence requirements and evidential reasons must be separated and theorized separately.
Traditionally, perceptual experiences—for example, the experience of seeing a cat—were thought to have two quite distinct components. When one sees a cat, one’s experience is “about” the cat: this is the representational or intentional component of the experience. One’s experience also has phenomenal character: this is the sensational component of the experience. Although the intentional and sensational components at least typically go together, in principle they might come apart: the intentional component could be present without the sensational component or vice versa.
In this paper, we consider how certain longstanding philosophical questions about mental representation may be answered on the assumption that cognitive and perceptual systems implement hierarchical generative models, such as those discussed within the prediction error minimization (PEM) framework. We build on existing treatments of representation via structural resemblance, such as those in Gładziejewski (2016, 559–582) and Gładziejewski and Miłkowski, to argue for a representationalist interpretation of the PEM framework. We further motivate the proposed approach to content by arguing that it is consistent with approaches implicit in theories of unsupervised learning in neural networks. In the course of this discussion, we argue that the structural representation proposal, properly understood, has more in common with functional-role than with causal/informational or teleosemantic theories. In the remainder of the paper, we describe the PEM framework for approximate Bayesian inference in some detail, and discuss how structural representations might arise within the proposed Bayesian hierarchies. After explicating the notion of variational inference, we define a subjectively accessible measure of misrepresentation for hierarchical Bayesian networks by appeal to the Kullback–Leibler divergence between posterior generative and approximate recognition densities, and discuss a related measure of objective misrepresentation in terms of correspondence with the facts.
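To make the appeal to the Kullback–Leibler divergence concrete, the standard definition of the divergence between an approximate recognition density and the generative posterior it is meant to track is given below; the symbols q, p, z and x are illustrative notation rather than the authors' own, and identifying misrepresentation with this quantity is only a sketch of the idea the abstract describes.

\[
D_{\mathrm{KL}}\bigl(q(z)\,\|\,p(z \mid x)\bigr) \;=\; \int q(z)\,\log \frac{q(z)}{p(z \mid x)}\,dz \;\geq\; 0,
\]

with equality just in case the recognition density q matches the generative posterior exactly; on this sketched reading, a larger divergence corresponds to a greater degree of (subjectively accessible) misrepresentation.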
This book investigates context-sensitivity in natural language by examining the meaning and use of a target class of theoretically recalcitrant expressions. These expressions, including epistemic vocabulary, normative and evaluative vocabulary, and vague language, exhibit systematic differences from paradigm context-sensitive expressions in their discourse dynamics and embedding properties. Many researchers have responded by rethinking the nature of linguistic meaning and communication. Drawing on general insights about the role of context in interpretation and collaborative action, Silk develops an improved contextualist theory of CR-expressions within the classical truth-conditional paradigm: Discourse Contextualism. The aim of Discourse Contextualism is to derive the distinctive linguistic behavior of a CR-expression from a particular contextualist interpretation of an independently motivated formal semantics, along with general principles of interpretation and conversation. It is shown how in using CR-expressions, speakers can exploit their mutual grammatical and world knowledge, and general pragmatic reasoning skills, to coordinate their attitudes and negotiate about how the context should evolve. The book focuses primarily on developing a Discourse Contextualist semantics and pragmatics for epistemic modals. The Discourse Contextualist framework is also applied to other categories of epistemic vocabulary, normative and evaluative vocabulary, and vague adjectives. The similarities/differences among these expressions, and among context-sensitive expressions more generally, have been underexplored. The development of Discourse Contextualism in this book sheds light on general features of meaning and communication, and the variety of ways in which context affects and is affected by uses of language. Discourse Contextualism provides a fruitful framework for theorizing about various broader issues in philosophy, linguistics, and cognitive science.
In discussions of whether and how pragmatic considerations can make a difference to what one ought to believe, two sets of cases feature. The first set, which dominates the debate about pragmatic reasons for belief, is exemplified by cases of being financially bribed to believe (or withhold from believing) something. The second set, which dominates the debate about pragmatic encroachment on epistemic justification, is exemplified by cases where acting on a belief rashly risks some disastrous outcome if the belief turns out to be false. Call those who think that pragmatic considerations make a difference to what one ought to believe in the second kind of case, but not in the first, ‘moderate pragmatists’. Many philosophers – in particular, most advocates of pragmatic and moral encroachment – are moderate pragmatists. But moderate pragmatists owe us an explanation of exactly why the second kind of pragmatic consideration makes a difference, but the first kind doesn’t. I argue that the most promising of these explanations all fail: they are either theoretically undermotivated, or get key cases wrong, or both. Moderate pragmatism may be an unstable stopping point between a more extreme pragmatism, on the one hand, and an uncompromising anti-pragmatism, on the other.
When is conceptual change so significant that we should talk about a new theory, not a new version of the same theory? We address this problem here, starting from Gould’s discussion of the individuation of the Darwinian theory. He locates his position between two extremes: ‘minimalist’—a theory should be individuated merely by its insertion in a historical lineage—and ‘maximalist’—exhaustive lists of necessary and sufficient conditions are required for individuation. He imputes the minimalist position to Hull and attempts a reductio: this position leads us to give the same ‘name’ to contradictory theories. Gould’s ‘structuralist’ position requires both ‘conceptual continuity’ and descent for individuation. Hull’s attempt to assimilate into his general selectionist framework Kuhn’s notion of ‘exemplar’ and the ‘semantic’ view of the structure of scientific theories can be used to counter Gould’s reductio, and also to integrate structuralist and population thinking about conceptual change.
Some combinations of attitudes--of beliefs, credences, intentions, preferences, hopes, fears, and so on--do not fit together right: they are incoherent. A natural idea is that there are requirements of "structural rationality" that forbid us from being in these incoherent states. Yet a number of surprisingly difficult challenges arise for this idea. These challenges have recently led many philosophers to attempt to minimize or eliminate structural rationality, arguing that it is just a "shadow" of "substantive rationality"--that is, correctly responding to one's reasons.

In *Fitting Things Together*, Alex Worsnip pushes back against this trend--defending the view that structural rationality is a genuine kind of rationality, distinct from and irreducible to substantive rationality, and tackling the most important challenges for this view. In so doing, he gives an original positive theory of the nature of coherence and structural rationality that explains how the diverse range of instances of incoherence can be unified under a general account, and how facts about coherence are normatively significant. He also shows how a failure to focus on coherence requirements as a distinctive phenomenon and distinguish them adequately from requirements of substantive rationality has led to confusion and mistakes in several substantive debates in epistemology and ethics.

Taken as a whole, *Fitting Things Together* provides the first sustained defense of the view that structural rationality is a genuine, autonomous, unified, and normatively significant phenomenon.
Cultural evolution studies are characterized by the notion that culture evolves according to broadly Darwinian principles. Yet how far the analogy between cultural and genetic evolution should be pushed is open to debate. Here, we examine a recent disagreement that concerns the extent to which cultural transmission should be considered a preservative mechanism allowing selection among different variants, or a transformative process in which individuals recreate variants each time they are transmitted. The latter is associated with the notion of “cultural attraction”. This issue has generated much misunderstanding and confusion. We first clarify the respective positions, noting that there is in fact no substantive incompatibility between cultural attraction and standard cultural evolution approaches, beyond a difference in focus. Whether cultural transmission should be considered a preservative or reconstructive process is ultimately an empirical question, and we examine how both preservative and reconstructive cultural transmission have been studied in recent experimental research in cultural evolution. Finally, we discuss how the relative importance of preservative and reconstructive processes may depend on the granularity of analysis and the domain being studied.
Priority monism is the view that the cosmos is the only independent concrete object. The paper argues that, pace its proponents, priority monism is in conflict with the dependence of any whole on any of its parts: if the cosmos does not depend on its parts, neither does any smaller composite.
A revised and updated edition of a title exploring the battle between evolutionary theory's biggest names. In one of the fiercest battles in science, Dawkins and Gould and their supporters have argued over evolution for more than twenty years, and the debate continues despite Gould's death. Kim Sterelny exposes the real differences between the conceptions of evolution of these two leading scientists. He shows that the conflict extends beyond evolution to their very beliefs in science itself.
Mark Johnston has recently argued that four-dimensionalist theories of persistence are incompatible with some of our most basic ethical and prudential principles. I argue that although Johnston’s arguments succeed on a worm-theoretic account of persistence, they fail on a stage-theoretic account. So much the worse, I conclude, for the worm theory.
Recent work on rationality has been increasingly attentive to “coherence requirements”, with heated debates about both the content of such requirements and their normative status (e.g., whether there is necessarily reason to comply with them). Yet there is little to no work on the metanormative status of coherence requirements. Metaphysically: what is it for two or more mental states to be jointly incoherent, such that they are banned by a coherence requirement? In virtue of what are some putative requirements genuine and others not? Epistemologically: how are we to know which of the requirements are genuine and which aren’t? This paper tries to offer an account that answers these questions. On my account, the incoherence of a set of attitudinal mental states is a matter of its being (partially) constitutive of the mental states in question that, for any agent that holds these attitudes jointly, the agent is disposed, when conditions of full transparency are met, to give up at least one of the attitudes.
Is life a purely physical process? What is human nature? Which of our traits is essential to us? In this volume, Daniel McShea and Alex Rosenberg – a biologist and a philosopher, respectively – join forces to create a new gateway to the philosophy of biology, making the major issues accessible and relevant to biologists and philosophers alike. Exploring concepts such as supervenience, the controversies about genocentrism and genetic determinism, and the debate about major transitions central to contemporary thinking about macroevolution, the authors lay out the broad terms in which we should assess the impact of biology on human capacities, social institutions and ethical values.
Outside the philosophy classroom, global skeptics – skeptics about all (purported) knowledge of the external world – are rare. But there are people who describe themselves as “skeptics” about various more specific domains, including self-professed “skeptics” about the reality of anthropogenic climate change. There is little to no philosophical literature that juxtaposes the climate change skeptic with the external world skeptic. While many “traditional” epistemologists assume that the external world skeptic poses a serious philosophical challenge in a way that the climate change skeptic doesn’t, many “applied” or “social” epistemologists assume that there isn’t much to be learned from debates about the external world skeptic, finding her challenge to be distant from both common sense and real-world concerns. I try to show that both of these views are mistaken. The external world skeptic raises deep questions that are important for our everyday deliberation about what to believe, and there are significant structural parallels between the arguments for external world skepticism and those for at least a form of climate change skepticism that is idealized – but not too idealized! – from the views of flesh-and-blood climate change skeptics. As such, we have strong reasons to think in parallel about how to reply to both skeptics’ challenges. I thus finish by (briefly) considering how different widespread responses to the external world skeptic might or might not generalize happily to the climate change skeptic’s challenge.
Background: how mind functions is subject to continuing scientific discussion. A simplistic approach says that, since no convincing way has been found to model subjective experience, mind cannot exist. A second holds that, since mind cannot be described by classical physics, it must be described by quantum physics. Another perspective concerns mind's hypothesized ability to interact with the world of quanta: it should be responsible for reduction of quantum wave packets; physics producing 'Objective Reduction' is postulated to form the basis for mind-matter interactions. This presentation describes results derived from a new approach to these problems. It is based on well-established biology involving physics not previously applied to the fields of mind or consciousness studies: that of critical feedback instability.
Methods: 'self-organized criticality' in complexity biology places system loci of control at critical instabilities, whose physical properties, including information properties, are presented. Their elucidation shows that they can model hitherto unexplained properties of experience.
Results: all results depend on physical properties of critical instabilities. First, at least one feed-back or feed-forward loop must have feedback gain g = 1: information flows round the loop impress perfect images of system states back on themselves; they represent processes of perfect self-observation. This annihilates system quanta: system excitations are instability fluctuations, which cannot be quantized. Major results follow:
1. Information vectors representing criticality states must include at least one attached information loop denoting self-observation.
2. Such loop structures are attributed a function, 'registering the state's own existence', explaining:
a. subjective 'awareness of one's own presence';
b. how content-free states of awareness can be remembered (Jon Shear);
c. subjective experience of time duration (Immanuel Kant);
d. the 'witness' property of experience, often mentioned by athletes 'in the zone';
e. the natural association between consciousness and intelligence.
This novel, physically and biologically sound approach seems to satisfactorily model subjectivity. Further significant results follow:
1. Registration of external information in excited states of systems at criticality reduces external wave-packets: the new model exhibits 'Objective Reduction' of wave packets.
2. High internal coherence (postulated by Domash and Penrose), leading to: a. non-separable information vector bundles; b. non-reductive states (Chalmers's criterion for experience).
3. Information that is: a. encoded in coherence negentropy; b. non-digitizable; and therefore c. computationally without digital equivalent (posited by Penrose).
Discussion and Conclusions: instability physics implies anharmonic motion, preventing excitation quantization, and is totally different from the quantum physics of simple harmonic motion at stability. Instability excitations are different from anything hitherto conceived in information science. They can model aspects of mind never previously treated, including genuine subjectivity, objective reduction of wave-packets, and inter alia all the properties given above.
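As an informal illustration of why a feedback gain of g = 1 marks a critical instability, here is a minimal simulation of a noisy linear feedback loop; this is my own sketch of the generic gain-1 criticality idea, not part of the presentation itself, and the loop model and parameter values are assumptions made purely for illustration.

```python
# Minimal sketch, assuming a linear feedback loop x_{t+1} = g*x_t + noise.
# Fluctuations decay toward a bounded level for g < 1, are marginally
# sustained at the critical gain g = 1 (a random walk), and diverge for g > 1.
import random

def final_amplitude(g, steps=200, noise=0.01, seed=0):
    """Run the loop for `steps` iterations and return the final |x|."""
    random.seed(seed)
    x = 0.0
    for _ in range(steps):
        x = g * x + random.gauss(0.0, noise)
    return abs(x)

for g in (0.9, 1.0, 1.1):
    print(f"gain g = {g}: |x| after 200 steps ~ {final_amplitude(g):.3g}")
```

Running the sketch shows the qualitative contrast: sub-unit gain keeps fluctuations small, gain above one blows them up, and g = 1 sits exactly on the boundary between the two regimes.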
This much-anticipated book is a detailed elaboration and defense of Levine’s influential claim that there is an “explanatory gap” between the mental and the physical.
Many discussions of the ‘preface paradox’ assume that it is more troubling for deductive closure constraints on rational belief if outright belief is reducible to credence. I show that this is an error: we can generate the problem without assuming such reducibility. All that we need are some very weak normative assumptions about rational relationships between belief and credence. The only view that escapes my way of formulating the problem for the deductive closure constraint is in fact itself a reductive view: namely, the view that outright belief is credence 1. However, I argue that this view is unsustainable. Moreover, my version of the problem turns on no particular theory of evidence or evidential probability, and so cannot be avoided by adopting some revisionary such theory. In sum, deductive closure is in more serious, and more general, trouble than some have thought.
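For readers unfamiliar with the preface paradox the abstract builds on, a standard illustration (the numbers, the independence assumption, and the credence-threshold framing are mine, not the paper's own formulation) runs as follows: an author rationally has credence 0.99 in each of the 100 independent claims in her book, yet probabilistic coherence then fixes her credence in their conjunction at

\[
\Pr\Bigl(\bigwedge_{i=1}^{100} p_i\Bigr) \;=\; \prod_{i=1}^{100} \Pr(p_i) \;=\; 0.99^{100} \;\approx\; 0.37,
\]

so if she believes each claim, deductive closure pushes her toward believing a conjunction in which her credence is low; the paper's point is that pressure of this kind can be generated even without reducing outright belief to credence.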
The difference between the unity of the individual and the separateness of persons requires that there be a shift in the moral weight that we accord to changes in utility when we move from making intrapersonal tradeoffs to making interpersonal tradeoffs. We examine which forms of egalitarianism can, and which cannot, account for this shift. We argue that a form of egalitarianism which is concerned only with the extent of outcome inequality cannot account for this shift. We also argue that a view which is concerned with both outcome inequality and with the unfairness of inequality in individuals’ expected utilities can account for this shift. Finally, we limn an alternative view, on...
Are there ‘degrees of causation’? Yes and no: causation is not a scalar relation, but different causes can contribute to a causing of an effect to different extents. In this paper, I motivate a probabilistic analysis of an event’s degree of contribution to a causing of an effect and explore some of its consequences.
Are there laws in evolutionary biology? Stephen J. Gould has argued that there are factors unique to biological theorizing which prevent the formulation of laws in biology, in contradistinction to the case in physics and chemistry. Gould offers the problem of complexity as just such a fundamental barrier to biological laws in general, and to Dollo’s Law in particular. But I argue that Gould fails to demonstrate: (1) that Dollo’s Law is not law-like, (2) that the alleged failure of Dollo’s Law demonstrates why there cannot be laws in biological science, and (3) that complexity is a fundamental barrier to nomologicality.
In this paper, I argue that theories of perception that appeal to Helmholtz’s idea of unconscious inference (“Helmholtzian” theories) should be taken literally, i.e. that the inferences appealed to in such theories are inferences in the full sense of the term, as employed elsewhere in philosophy and in ordinary discourse. In the course of the argument, I consider constraints on inference based on the idea that inference is a deliberate action, and on the idea that inferences depend on the syntactic structure of representations. I argue that inference is a personal-level but sometimes unconscious process that cannot in general be distinguished from association on the basis of the structures of the representations over which it’s defined. I also critique arguments against representationalist interpretations of Helmholtzian theories, and argue against the view that perceptual inference is encapsulated in a module.
This paper demarcates a theoretically interesting class of "evaluational adjectives." This class includes predicates expressing various kinds of normative and epistemic evaluation, such as predicates of personal taste, aesthetic adjectives, moral adjectives, and epistemic adjectives, among others. Evaluational adjectives are distinguished, empirically, in exhibiting phenomena such as discourse-oriented use, felicitous embedding under the attitude verb `find', and sorites-susceptibility in the comparative form. A unified degree-based semantics is developed: What distinguishes evaluational adjectives, semantically, is that they denote context-dependent measure functions ("evaluational perspectives")—context-dependent mappings to degrees of taste, beauty, probability, etc., depending on the adjective. This perspective-sensitivity characterizing the class of evaluational adjectives cannot be assimilated to vagueness, sensitivity to an experiencer argument, or multidimensionality; and it cannot be demarcated in terms of pretheoretic notions of subjectivity, common in the literature. I propose that certain diagnostics for "subjective" expressions be analyzed instead in terms of a precisely specified kind of discourse-oriented use of context-sensitive language. I close by applying the account to `find x PRED' ascriptions.
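As a schematic illustration of what a context-dependent measure function ("evaluational perspective") might look like in a degree-based semantics of the kind described, one could write the following; the particular entry for `tasty' and the μ and θ notation are illustrative assumptions, not necessarily the paper's own formalism.

\[
[\![\textit{tasty}]\!]^{c} \;=\; \lambda x.\ \mu^{c}_{\mathrm{taste}}(x),
\qquad
[\![\textit{x is tasty}]\!]^{c} = 1 \;\iff\; \mu^{c}_{\mathrm{taste}}(x) \;\geq\; \theta_{c},
\]

where \(\mu^{c}_{\mathrm{taste}}\) maps individuals to degrees of tastiness relative to the evaluational perspective supplied by context \(c\), and \(\theta_{c}\) is a contextual threshold for the positive form.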
This user-friendly text covers key issues in the philosophy of science in an accessible and philosophically serious way. It will prove valuable to students studying philosophy of science as well as science students. Prize-winning author Alex Rosenberg explores the philosophical problems that science raises by its very nature and method. He skilfully demonstrates that scientific explanation, laws, causation, theory, models, evidence, reductionism, probability, teleology, realism and instrumentalism actually pose the same questions that Plato, Aristotle, Descartes, Hume, Kant and their successors have grappled with for centuries.
Expressivism promises an illuminating account of the nature of normative judgment. But worries about the details of expressivist semantics have led many to doubt whether expressivism's putative advantages can be secured. Drawing on insights from linguistic semantics and decision theory, I develop a novel framework for implementing an expressivist semantics that I call ordering expressivism. I argue that by systematically interpreting the orderings that figure in analyses of normative terms in terms of the basic practical attitude of conditional weak preference, the expressivist can explain the semantic properties of normative sentences in terms of the logical properties of that attitude. Expressivism's problems with capturing the logical relations among normative sentences can be reduced to the familiar, more tractable problem of explaining certain coherence constraints on preferences. Particular attention is given to the interpretation of wide-scope negation. The proposed solution is also extended to other types of embedded contexts—most notably, disjunctions.
It is often natural to compare two events by describing one as ‘more of a cause’ of some effect than the other. But what do such comparisons amount to, exactly? This paper aims to provide a guided tour of the recent literature on ‘degrees of causation’. Section 2 looks at what I call ‘dependence measures’, which arise from thinking of causes as difference-makers. Section 3 looks at what I call ‘production measures’, which arise from thinking of causes as jointly sufficient for their effects. Finally, section 4 examines the important question of whether there is any sense in which an agent is more responsible for an outcome in virtue of her action being more of a cause of it. I describe a puzzle that emerges from this question, and explore various strategies for resolving it.
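As one simple example of the kind of 'dependence measure' gestured at here (an illustrative probability-raising measure of my own choosing, not necessarily one the paper surveys or endorses), a cause C's degree of contribution to an effect E might be graded by how much C raises E's probability:

\[
\mathrm{deg}(C, E) \;=\; \Pr(E \mid C) \;-\; \Pr(E \mid \neg C),
\]

so that a cause whose presence makes a larger difference to the probability of the effect counts, on this measure, as 'more of a cause' of it.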
In this paper it is argued that we should amend the traditional understanding of the view known as the guise of the good. The guise of the good is traditionally understood as the view that we only want to act in ways that we believe to be good in some way. But it is argued that a more plausible view is that we only want to act in ways that we believe we have normative reason to act in. This change – from formulating the view in terms of goodness to formulating it in terms of reasons – is significant because the revised view avoids various old and new counterexamples to the traditional view, because the revised view is better motivated than the traditional view, and because the revised view is better placed to explain certain features of desire than the traditional view. The paper finishes by showing that the conclusions reached are compatible with theories such as the buck passing account of value.
Book Review of Being for Beauty: Aesthetic Agency and Value, by Dominic McIver Lopes. This review summarizes the book's main thread of argument and Lopes' positive view, which he dubs the "network theory". It ends by reflecting on whether Lopes' account of aesthetic normativity is ultimately satisfactory.
Can we ever truly answer the question, “Who am I?” Moderated by Alex Voorhoeve (London School of Economics), neuro-philosopher Elie During (University of Paris, Ouest Nanterre), cognitive scientist David Jopling (York University, Canada), social psychologist Timothy Wilson (University of Virginia), and ethicist Frances Kamm (Harvard University) examine the difficulty of achieving genuine self-knowledge and how the pursuit of self-knowledge plays a role in shaping the self.
This is the first book to systematically examine the underlying theory of evidence in Anglo-American legal systems. Stein develops a detailed and innovative theory which sets aside the traditional vision of evidence law as facilitating the discovery of the truth. Combining probability theory, epistemology, economic analysis, and moral philosophy, he argues instead that the fundamental purpose of evidence law is to apportion the risk of error in conditions of uncertainty.