I argue for an epistemic conception of voting, a conception on which the purpose of the ballot is at least in some cases to identify which of several policy proposals will best promote the public good. To support this view I first briefly investigate several notions of the kind of public good that public policy should promote. Then I examine the probability logic of voting as embodied in two very robust versions of the Condorcet Jury Theorem and some related results. These theorems show that if the number of voters or legislators is sufficiently large and the average of their individual propensities to select the better of two policy proposals is a little above random chance, and if each person votes his or her own best judgment (rather than in alliance with a bloc or faction), then the majority is extremely likely to select the better alternative. Here ‘better alternative’ means the policy or law that will best promote the public good. I also explicate a Convincing Majorities Theorem, which shows the extent to which the majority vote should provide evidence that the better policy has been selected. Finally, I show how to extend all of these results to judgments among multiple alternatives through the kind of sequential balloting typical of the legislative amendment process.
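The jury-theorem effect described above is easy to see in a short Monte Carlo simulation. This is our own illustrative sketch, not code from the paper; the competence value 0.55 and the electorate sizes are arbitrary choices.

```python
import random

def majority_correct(n_voters, competence, trials=2000):
    """Estimate the probability that a simple majority of independent
    voters, each picking the better of two options with probability
    `competence`, collectively picks the better option."""
    wins = 0
    for _ in range(trials):
        correct_votes = sum(random.random() < competence for _ in range(n_voters))
        if correct_votes > n_voters / 2:
            wins += 1
    return wins / trials

random.seed(0)
# Even a modest individual competence of 0.55 makes a correct majority
# verdict nearly certain once the electorate is large.
for n in (11, 101, 1001):
    print(n, round(majority_correct(n, 0.55), 3))
```

With 1001 voters of competence 0.55 the majority is right well over 99% of the time, which is the force of the theorem: aggregation amplifies slightly-better-than-chance individual judgment.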
Think of confirmation in the context of the Ravens Paradox this way. The likelihood ratio measure of incremental confirmation gives us, for an observed Black Raven and for an observed non-Black non-Raven, respectively, the following “full” likelihood ratios.
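For orientation (our gloss, not text from the article): where $H$ is the hypothesis that all ravens are black, the likelihood-ratio measure assesses evidence $E$ by

```latex
\mathrm{LR}(H, E) \;=\; \frac{P(E \mid H)}{P(E \mid \lnot H)}
```

with $E$ incrementally confirming $H$ just in case this ratio exceeds 1. The article's own "full" likelihood ratios for an observed black raven and an observed non-black non-raven are not reproduced here.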
In Mumford’s Dispositions, the reader will find an extended treatment of the recent debate about dispositions from Ryle and Geach to the present. Along the way, Mumford presents his own views on several key points, though we found the book much more thorough in its assessment of opposing views than in the development of a positive account. As we’ll try to make clear, some of the ideas endorsed in Dispositions are certainly worth pursuing; others are not. Following Mackie, Shoemaker, and others, Mumford stresses that it’s one thing to distinguish between dispositional and categorical ascriptions and quite another to draw an ontological line between dispositional and categorical properties. The book itself can be divided roughly into those chapters that deal with the relationship between dispositional ascriptions and corresponding conditionals (like ‘is fragile’ and ‘would break if struck’); and those chapters that deal with the relationship between dispositions and their categorical bases (like fragility and having internal structure XYZ). We shall examine each cluster of issues in turn.
Seth Yalcin has pointed out some puzzling facts about the behaviour of epistemic modals in certain embedded contexts. For example, conditionals that begin ‘If it is raining and it might not be raining, …’ sound unacceptable, unlike conditionals that begin ‘If it is raining and I don’t know it, …’. These facts pose a prima facie problem for an orthodox treatment of epistemic modals, according to which they express propositions about the knowledge of some contextually specified individual or group. This paper develops an explanation of the puzzling facts about embedding within an orthodox framework, using broadly Gricean resources.
Lewis's notion of a "natural" property has proved divisive: some have taken to the notion with enthusiasm, while others have been sceptical. However, it is far from obvious what the enthusiasts and the sceptics are disagreeing about. This paper attempts to articulate what is at stake in this debate.
Rational consequence relations and Popper functions provide logics for reasoning under uncertainty, the former purely qualitative, the latter probabilistic. But few researchers seem to be aware of the close connection between these two logics. I’ll show that Popper functions are probabilistic versions of rational consequence relations. I’ll not assume that the reader is familiar with either logic. I present them, and explicate the relationship between them, from the ground up. I’ll also present alternative axiomatizations for each logic, showing them to depend on weaker axioms than usually recognized.
What ought one to do, epistemically speaking, when faced with a disagreement? Faced with this question, one naturally hopes for an answer that is principled, general, and intuitively satisfying. We want to argue that this is a vain hope. Our claim is that a satisfying answer will prove elusive because of non-transparency: that there is no condition such that we are always in a position to know whether it obtains. When we take seriously that there is nothing, including our own minds, to which we have assured access, the familiar project of formulating epistemic norms is destabilized. In this paper, we will show how this plays out in the special case of disagreement. But we believe that a larger lesson can ultimately be extracted from our discussion: namely, that non-transparency threatens our hope for fully satisfying epistemic norms in general.
Claims of the form 'I know P and it might be that not-P' tend to sound odd. One natural explanation of this oddity is that the conjuncts are semantically incompatible: in its core epistemic use, 'Might P' is true in a speaker's mouth only if the speaker does not know that not-P. In this paper I defend this view against an alternative proposal that has been advocated by Trent Dougherty and Patrick Rysiew and elaborated upon in Jeremy Fantl and Matthew McGrath's recent Knowledge in an Uncertain World.
This discussion piece critically examines some of the key ideology that figures in Elizabeth Fricker's ‘Stating and Insinuating’(2012), raises a number of queries about the details of Fricker's argumentation, and develops some ideas about the normative structure of testimony that relate to the themes of that paper.
This book critically examines some widespread views about the semantic phenomenon of reference and the cognitive phenomenon of singular thought. It begins with a defense of the view that neither is tied to a special relation of causal or epistemic acquaintance. It then challenges the alleged semantic rift between definite and indefinite descriptions on the one hand, and names and demonstratives on the other—a division that has been motivated in part by appeals to considerations of acquaintance. Drawing on recent work in semantics, the book explores a more unified account of all four types of expression, according to which none of them paradigmatically fits the profile of a referential term. On the proposed framework, all four involve existential quantification but admit of uses that exhibit many of the traits associated with reference—a phenomenon that is due to the presence of what we call a ‘singular restriction’ on the existentially quantified domain. The book concludes by drawing out some implications of the proposed semantic picture for the traditional categories of reference and singular thought.
One of Weatherson's main goals is to drive home a methodological point: We shouldn't be looking for deductive arguments for or against relativism – we should instead be evaluating inductive arguments designed to show that either relativism or some alternative offers the best explanation of some data. Our focus in Chapter Two on diagnostics for shared content allegedly encourages the search for deductive arguments and so does more harm than good. We have no methodological slogan of our own to offer. Part of what we were trying to do was to clearly articulate what the relevant issues even are. Often relativism is characterized in a way that is offhand and sloppy. The relativist, we are told, accepts 'disquotational truth' for various kinds of claims but denies that they are 'true simpliciter'. What exactly is going on here? Do the relevant distinctions even make sense? Before engaging in various abductive manoeuvres we need to get much clearer about what it is that we are trying to argue for and against. That said, we are perfectly happy with the kind of inductive enterprise that Weatherson sketches. For our part, we were fully aware (and indeed explicit) that the 'agreement' diagnostic does not ‘deductively’ settle all of the relevant disputes. A significant part of Chapter Four is dedicated to something in the vicinity of Weatherson's project. Note, indeed, that our diagnostics are even stated using the ideology of 'providing evidence' – hardly the basis for a straightforwardly deductive argument for or against relativism. Finally, though, we should point out that we are not hostile to deductive arguments against relativism. A philosopher's evidence is theory-laden and in part owes itself to epistemic powers that his or her opponents may not acknowledge. In short, their evidence may not always have the hallmarks of 'evidence neutrality' --- evidence that their opponents would recognize as such. We are perfectly open to there being compelling deductive arguments against relativism from such evidence.
We argue that certain modal questions raise serious problems for a modal metaphysics on which we are permitted to quantify unrestrictedly over all possibilia. In particular, we argue that, on reasonable assumptions, both David Lewis's modal realism and Timothy Williamson's necessitism are saddled with the remarkable conclusion that there is some cardinal number of the form ℵ_α such that there could not be more than ℵ_α-many angels in existence. In the last section, we make use of similar ideas to draw a moral for a recent debate in meta-ontology.
Scientific theories and hypotheses make claims that go well beyond what we can immediately observe. How can we come to know whether such claims are true? The obvious approach is to see what a hypothesis says about the observationally accessible parts of the world. If it gets that wrong, then it must be false; if it gets that right, then it may have some claim to being true. Any sensible attempt to construct a logic that captures how we may come to reasonably believe the falsehood or truth of scientific hypotheses must be built on this idea. Philosophers refer to such logics as logics of confirmation or as confirmation theories.
Confirmation theory is the study of the logic by which scientific hypotheses may be confirmed or disconfirmed, or even refuted by evidence. A specific theory of confirmation is a proposal for such a logic. Presumably the epistemic evaluation of scientific hypotheses should largely depend on their empirical content – on what they say the evidentially accessible parts of the world are like, and on the extent to which they turn out to be right about that. Thus, all theories of confirmation rely on measures of how well various alternative hypotheses account for the evidence. Most contemporary confirmation theories employ probability functions to provide such a measure. They measure how well the evidence fits what the hypothesis says about the world in terms of how likely it is that the evidence should occur were the hypothesis true. Such hypothesis-based probabilities of evidence claims are called likelihoods. Clearly, when the evidence is more likely according to one hypothesis than according to an alternative, that should redound to the credit of the former hypothesis and the discredit of the latter. But various theories of confirmation diverge on precisely how this credit is to be measured.
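The notion of a likelihood can be made concrete with a toy example of our own (not one from the text): two rival hypotheses about a coin's bias assign different probabilities to the same evidence, and the evidence credits the hypothesis on which it was more likely.

```python
from math import comb

def binomial_likelihood(p, heads, flips):
    """P(evidence | hypothesis): the probability of seeing `heads` heads
    in `flips` tosses if the coin's bias toward heads is p."""
    return comb(flips, heads) * p**heads * (1 - p)**(flips - heads)

# Evidence: 7 heads in 10 flips. Two rival hypotheses about the bias.
evidence = (7, 10)
lik_fair = binomial_likelihood(0.5, *evidence)    # fair-coin hypothesis
lik_biased = binomial_likelihood(0.7, *evidence)  # biased-coin hypothesis

# The evidence is more likely on the biased-coin hypothesis, so the
# likelihood ratio comes out greater than 1, favouring that hypothesis.
print(lik_biased / lik_fair)
```

How exactly such a ratio should translate into a degree of confirmation is precisely the point on which, as the abstract notes, theories of confirmation diverge.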
Sections 1 through 3 present all of the main ideas behind the probabilistic logic of evidential support. For most readers these three sections will suffice to provide an adequate understanding of the subject. Those readers who want to know more about how the logic applies when the implications of hypotheses about evidence claims (called likelihoods) are vague or imprecise may, after reading sections 1-3, skip to section 6. Sections 4 and 5 are for the more advanced reader who wants a detailed understanding of some telling results about how this logic may bring about convergence to the truth.
In Hawthorne and Magidor 2009, we presented an argument against Stalnaker’s meta-semantic framework. In this paper we address two critical responses to our paper: Stalnaker 2009, and Almotahari and Glick 2010. Sections 1–4 are devoted to addressing Stalnaker’s response and sections 5–8 to addressing Almotahari and Glick’s. We pay special attention (Sect. 2) to an interesting argument that Stalnaker offers to bolster the transparency of presupposition (an argument that, if successful, could also form the basis of a defence of the KK principle).
The Paradox of the Ravens (a.k.a., The Paradox of Confirmation) is indeed an old chestnut. A great many things have been written and said about this paradox and its implications for the logic of evidential support. The first part of this paper will provide a brief survey of the early history of the paradox. This will include the original formulation of the paradox and the early responses of Hempel, Goodman, and Quine. The second part of the paper will describe attempts to resolve the paradox within a Bayesian framework, and show how to improve upon them. This part begins with a discussion of how probabilistic methods can help to clarify the statement of the paradox itself. And it describes some of the early responses to probabilistic explications. We then inspect the assumptions employed by traditional (canonical) Bayesian approaches to the paradox. These assumptions may appear to be overly strong. So, drawing on weaker assumptions, we formulate a new-and-improved Bayesian confirmation-theoretic resolution of the Paradox of the Ravens.
Oxford Studies in Epistemology is a biennial publication which offers a regular snapshot of state-of-the-art work in this important field. Under the guidance of a distinguished editorial board composed of leading philosophers in North America, Europe and Australasia, it publishes exemplary papers in epistemology, broadly construed. Topics within its purview include: *traditional epistemological questions concerning the nature of belief, justification, and knowledge, the status of scepticism, the nature of the a priori, etc; *new developments in epistemology, including movements such as naturalized epistemology, feminist epistemology, social epistemology, and virtue epistemology, and approaches such as contextualism; *foundational questions in decision-theory; *confirmation theory and other branches of philosophy of science that bear on traditional issues in epistemology; *topics in the philosophy of perception relevant to epistemology; *topics in cognitive science, computer science, developmental, cognitive, and social psychology that bear directly on traditional epistemological questions; and *work that examines connections between epistemology and other branches of philosophy, including work on testimony and the ethics of belief. Anyone wanting to understand the latest developments at the leading edge of the discipline can start here.
Relativism has dominated many intellectual circles, past and present, but the twentieth century saw it banished to the fringes of mainstream analytic philosophy. Of late, however, it is making something of a comeback within that loosely configured tradition, a comeback that attempts to capitalize on some important ideas in foundational semantics. Relativism and Monadic Truth aims not merely to combat analytic relativism but also to combat the foundational ideas in semantics that led to its revival. Doing so requires a proper understanding of the significance of possible worlds semantics, an examination of the relation between truth and the flow of time, an account of putatively relevant data from attitude and speech act reporting, and a careful treatment of various operators. Throughout, Herman Cappelen and John Hawthorne contrast relativism with a view according to which the contents of thought and talk are propositions that instantiate the fundamental monadic properties of truth simpliciter and falsity simpliciter. Such propositions, they argue, are the semantic values of sentences (relative to context), the objects of illocutionary acts, and, unsurprisingly, the objects of propositional attitudes.
In a penetrating investigation of the relationship between belief and quantitative degrees of confidence (or degrees of belief) Richard Foley (1992) suggests the following thesis: ... it is epistemically rational for us to believe a proposition just in case it is epistemically rational for us to have a sufficiently high degree of confidence in it, sufficiently high to make our attitude towards it one of belief. Foley goes on to suggest that rational belief may be just rational degree of confidence above some threshold level that the agent deems sufficient for belief. He finds hints of this view in Locke’s discussion of probability and degrees of assent, so he calls it the Lockean Thesis. The Lockean Thesis has important implications for the logic of belief. Most prominently, it implies that even a logically ideal agent whose degrees of confidence satisfy the axioms of probability theory may quite rationally believe each of a large body of propositions that are jointly inconsistent. For example, an agent may legitimately believe that on each given occasion her well-maintained car will start, but nevertheless believe that she will eventually encounter a…
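The car example turns on a line of arithmetic. The sketch below is ours, not Foley's or the author's; the threshold 0.95, daily probability 0.99, and 365-day horizon are illustrative choices, and the days are assumed independent.

```python
# Suppose the agent's Lockean threshold for belief is 0.95, and each day her
# well-maintained car starts with probability 0.99, independently across days.
# Then she rationally believes, of each day, that the car will start; yet the
# chance that it starts on every one of 365 days is tiny, so she also
# rationally believes it will fail at least once. Each belief clears the
# threshold, but the belief set is jointly inconsistent.
threshold = 0.95
p_daily = 0.99
days = 365

p_all_start = p_daily ** days        # probability the car starts every day
p_some_failure = 1 - p_all_start     # probability of at least one failure

print(p_daily >= threshold)          # each daily belief clears the threshold
print(p_some_failure >= threshold)   # so does belief in an eventual failure
```

Since 0.99^365 is roughly 0.026, belief in an eventual failure sits at about 0.974, comfortably above the threshold alongside each of the 365 daily beliefs it contradicts.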
We think we have lots of substantial knowledge about the future. But contemporary wisdom has it that indeterminism prevails in such a way that just about any proposition about the future has a non-zero objective chance of being false. What should one do about this? One, pessimistic, reaction is scepticism about knowledge of the future. We think this should be something of a last resort, especially since this scepticism is likely to infect alleged knowledge of the present and past. One anti-sceptical strategy is to pin our hopes on determinism, conceding that knowledge of the future is unavailable in an indeterministic world. This is not satisfying either: we would rather not be hostage to empirical fortune in the way that this strategy recommends. A final strategy, one that we shall explore in this paper, is one of reconciliation: knowledge of a proposition is compatible with a subject’s belief having a non-zero objective chance of error. Following Williamson, we are interested in tying knowledge to the presence or absence of error in close cases, and so we shall explore the connections between knowledge and objective chance within such a framework. We don’t want to get tangled up here in complications involved in attempting to formulate a necessary and sufficient condition for knowledge in terms of safety. Instead, we will assume the following rough and ready necessary condition: a subject knows P only if she could not easily have falsely believed P. Assuming that easiness is to be spelt…
In his seminal paper 'Assertion', Robert Stalnaker distinguishes between the semantic content of a sentence on an occasion of use and the content asserted by an utterance of that sentence on that occasion. While in general the assertoric content of an utterance is simply its semantic content, the mechanisms of conversation sometimes force the two apart. Of special interest in this connection is one of the principles governing assertoric content in the framework, one according to which the asserted content ought to be identical at each world in the context set (the Uniformity principle). In this paper, we present a problem for Stalnaker's meta-semantic framework, by challenging the plausibility of the Uniformity principle. We argue that the interaction of the framework with facts about epistemic accessibility--in particular, failures of epistemic transparency--causes problems for the Uniformity principle and thus for Stalnaker's framework more generally.
The expression ‘Like’ has a wide variety of uses among English and American speakers. It may describe preference, as in (1) She likes mint chip ice cream. It may be used as a vehicle of comparison, as in (2) Trieste is like Minsk on steroids.
Judging by our folk appraisals, then, knowledge and action are intimately related. The theories of rational action with which we are familiar leave this unexplained. Moreover, discussions of knowledge are frequently silent about this connection. This is a shame, since if there is such a connection it would seem to constitute one of the most fundamental roles for knowledge. Our purpose in this paper is to fill this lacuna by exploring ways in which knowing something is related to rationally acting upon it, defending one particular proposal against anticipated objections.
In a series of thought-provoking and original essays, eighteen leading philosophers engage in head-to-head debates on nine of the most cutting-edge topics in contemporary metaphysics. The volume explores the fundamental questions of contemporary metaphysics in eighteen original essays, sixteen of which are newly commissioned; features an introductory essay by the editors on the nature of metaphysics to prepare the reader for the ongoing discussions; offers readers the unique opportunity to observe leading philosophers engage in head-to-head debate on cutting-edge metaphysical topics; and provides valuable insights into the flourishing field of contemporary metaphysics.
It is natural to think that the relationship between ‘rain’ and the location of rain is different from the relationship between ‘dance’ and the location of dancing. Utterances of (1) are typically interpreted as, in some sense, being about a location in which it rains. (2) is, typically, not interpreted as being about a location in which the dancing takes place.
I’ll describe a range of systems for nonmonotonic conditionals that behave like conditional probabilities above a threshold. The rules that govern each system are probabilistically sound in that each rule holds when the conditionals are interpreted as conditional probabilities above a threshold level specific to that system. The well-known preferential and rational consequence relations turn out to be special cases in which the threshold level is 1. I’ll describe systems that employ weaker rules appropriate to thresholds lower than 1, and compare them to these two standard systems.
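Why the usual rules must weaken below threshold 1 can be seen in a three-ticket lottery. This is our own minimal example, not one from the paper: each premise clears the threshold while their conjunction does not, so a rule like And fails for thresholds below 1.

```python
# Worlds: exactly one of three equiprobable lottery tickets wins.
worlds = [1, 2, 3]

def prob(event):
    """Probability of an event under the uniform measure on `worlds`."""
    return sum(1 for w in worlds if event(w)) / len(worlds)

t = 0.6  # an illustrative threshold below 1

p1 = prob(lambda w: w != 1)               # "ticket 1 loses": 2/3, above t
p2 = prob(lambda w: w != 2)               # "ticket 2 loses": 2/3, above t
p_both = prob(lambda w: w not in (1, 2))  # conjunction: 1/3, below t

print(p1 >= t, p2 >= t, p_both >= t)
```

At threshold 1 the failure disappears, since premises of probability 1 conjoin to probability 1; this is one way to see why the preferential and rational systems sit at that limiting case.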
We chart the ways in which closure properties of consequence relations for uncertain inference take on different forms according to whether the relations are generated in a quantitative or a qualitative manner. Among the main themes are: the identification of watershed conditions between probabilistically and qualitatively sound rules; failsafe and classicality transforms of qualitatively sound rules; non-Horn conditions satisfied by probabilistic consequence; representation and completeness problems; and threshold-sensitive conditions such as ‘preface’ and ‘lottery’ rules.
In this short paper, I shall examine some key structural features of Descartes’s metaphysics, as it relates to mind–body dualism. The style of presentation will partly be one of rational reconstruction, designed to present the Cartesian system in a way that will be of maximal interest to contemporary metaphysicians. Section 1 focuses on five key Cartesian theses about principal attributes. Sections 2 and 3 examine how those theses play themselves out in Descartes’s discussion of mind–body dualism.