This paper develops an information-sensitive theory of the semantics and probability of conditionals and statements involving epistemic modals. The theory validates a number of principles linking probability and modality, including the principle that the probability of a conditional 'If A, then C' equals the probability of C, updated with A. The theory avoids so-called triviality results, which are standardly taken to show that principles of this sort cannot be validated. To achieve this, we deny that rational agents update their credences via conditionalization. We offer a new rule of update, Hyperconditionalization, which agrees with Conditionalization whenever nonmodal statements are at stake, but differs for modal and conditional sentences.
The book was planned and written as a single, sustained argument, though earlier versions of a few parts of it have appeared separately. The object of this book is both to establish the existence of the paradoxes and to describe a non-Pascalian concept of probability in terms of which one can analyse the structure of forensic proof without giving rise to such typical signs of theoretical misfit. Neither the complementational principle for negation nor the multiplicative principle for conjunction applies to the central core of any forensic proof in the Anglo-American legal system. The book comprises four parts, written in such a way that they may be read in different orders by different kinds of reader.
This book explores a question central to philosophy--namely, what does it take for a belief to be justified or rational? According to a widespread view, whether one has justification for believing a proposition is determined by how probable that proposition is, given one's evidence. In this book this view is rejected and replaced with another: in order for one to have justification for believing a proposition, one's evidence must normically support it--roughly, one's evidence must make the falsity of that proposition abnormal in the sense of calling for special, independent explanation. This conception of justification bears upon a range of topics in epistemology and beyond. Ultimately, this way of looking at justification guides us to a new, unfamiliar picture of how we should respond to our evidence and manage our own fallibility. This picture is developed here.
Richard Jeffrey is beyond dispute one of the most distinguished and influential philosophers working in the field of decision theory and the theory of knowledge. His work is distinctive in showing the interplay of epistemological concerns with probability and utility theory. Not only has he made use of standard probabilistic and decision theoretic tools to clarify concepts of evidential support and informed choice, he has also proposed significant modifications of the standard Bayesian position in order that it provide a better fit with actual human experience. Probability logic is viewed not as a source of judgment but as a framework for explaining the implications of probabilistic judgments and their mutual compatibility. This collection of essays spans a period of some 35 years and includes what have become some of the classic works in the literature. There is also one completely new piece, while in many instances Jeffrey includes afterthoughts on the older essays.
This book offers a concise survey of basic probability theory from a thoroughly subjective point of view whereby probability is a mode of judgment. Written by one of the greatest figures in the field of probability theory, the book is both a summation and synthesis of a lifetime of wrestling with these problems and issues. After an introduction to basic probability theory, there are chapters on scientific hypothesis-testing, on changing your mind in response to generally uncertain observations, on expectations of the values of random variables, on de Finetti's dissolution of the so-called problem of induction, and on decision theory.
In this influential study of central issues in the philosophy of science, Paul Horwich elaborates on an important conception of probability, diagnosing the failure of previous attempts to resolve these issues as stemming from a too-rigid conception of belief. Adopting a Bayesian strategy, he argues for a probabilistic approach, yielding a more complete understanding of the characteristics of scientific reasoning and methodology. Presented in a fresh twenty-first-century series livery, and including a specially commissioned preface written by Colin Howson, illuminating its enduring importance and relevance to philosophical enquiry, this engaging work has been revived for a new generation of readers.
With this treatise, an insightful exploration of the probabilistic connection between philosophy and the history of science, the famous economist breathed new life into studies of both disciplines. Originally published in 1921, this important mathematical work represented a significant contribution to the theory regarding the logical probability of propositions. Keynes effectively dismantled the classical theory of probability, launching what has since been termed the “logical-relationist” theory. In so doing, he explored the logical relationships between classifying a proposition as “highly probable” and as a “justifiable induction.” Unabridged republication of the classic 1921 edition.
According to what is now commonly referred to as “the Equation” in the literature on indicative conditionals, the probability of any indicative conditional equals the probability of its consequent given its antecedent. Philosophers widely agree in their assessment that the triviality arguments of Lewis and others have conclusively shown the Equation to be tenable only at the expense of the view that indicative conditionals express propositions. This study challenges the correctness of that assessment by presenting data that cast doubt on an assumption underlying all triviality arguments.
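The Equation can be illustrated with a toy example (not drawn from the study itself): for a fair die, the probability of "If the roll is even, then it is greater than 3" should, on the Equation, be the conditional probability of the consequent given the antecedent.

```python
from fractions import Fraction

# Toy model: one fair die roll. The Equation identifies the probability of
# "If the roll is even, then it is greater than 3" with P(roll > 3 | roll even).
outcomes = range(1, 7)
antecedent = lambda n: n % 2 == 0   # "the roll is even"
consequent = lambda n: n > 3        # "the roll is greater than 3"

p_A = Fraction(sum(1 for n in outcomes if antecedent(n)), 6)
p_A_and_C = Fraction(sum(1 for n in outcomes if antecedent(n) and consequent(n)), 6)
conditional_probability = p_A_and_C / p_A

print(conditional_probability)  # 2/3: even rolls are {2, 4, 6}; of those, {4, 6} exceed 3
```

The ratio formula P(C|A) = P(A and C)/P(A) used here is the standard definition; the triviality arguments concern whether any proposition expressed by the conditional can have this value in general.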
The Ramseyan thesis that the probability of an indicative conditional is equal to the corresponding conditional probability of its consequent given its antecedent is both widely confirmed and subject to attested counterexamples (e.g., McGee 2000, Kaufmann 2004). This raises several puzzling questions. For instance, why are there interpretations of conditionals that violate this Ramseyan thesis in certain contexts, and why are they otherwise very rare? In this paper, I raise some challenges to Stefan Kaufmann's account of why the Ramseyan thesis sometimes fails, and motivate my own theory. On my theory, the proposition expressed by an indicative conditional is partially determined by a background partition, and hence its probability depends on the choice of such a partition. I hold that this background partition is contextually determined, and in certain conditions is set by a salient question under discussion in the context. I show how the resulting theory offers compelling answers to the puzzling questions raised by failures of the Ramseyan thesis.
Many have argued that a rational agent's attitude towards a proposition may be better represented by a probability range than by a single number. I show that in such cases an agent will have unstable betting behaviour, and so will behave in an unpredictable way. I use this point to argue against a range of responses to the ‘two bets’ argument for sharp probabilities.
This chapter will review selected aspects of the terrain of discussions about probabilities in statistical mechanics (with no pretensions to exhaustiveness, though the major issues will be touched upon), and will argue for a number of claims. None of the claims to be defended is entirely original, but all deserve emphasis. The first, and least controversial, is that probabilistic notions are needed to make sense of statistical mechanics. The reason for this is the same reason that convinced Maxwell, Gibbs, and Boltzmann that probabilities would be needed, namely, that the second law of thermodynamics, which in its original formulation says that certain processes are impossible, must, on the kinetic theory, be replaced by a weaker formulation according to which what the original version deems impossible is merely improbable. Second is that we ought not take the standard measures invoked in equilibrium statistical mechanics as giving, in any sense, the correct probabilities about microstates of the system. We can settle for a much weaker claim: that the probabilities for outcomes of experiments yielded by the standard distributions are effectively the same as those yielded by any distribution that we should take as representing probabilities over microstates. Lastly (and most controversially): in asking about the status of probabilities in statistical mechanics, the familiar dichotomy between epistemic probabilities (credences, or degrees of belief) and ontic (physical) probabilities is insufficient; the concept of probability that is best suited to the needs of statistical mechanics is one that combines epistemic and physical considerations.
I describe a realist, ontologically objective interpretation of probability, "far-flung frequency (FFF) mechanistic probability". FFF mechanistic probability is defined in terms of facts about the causal structure of devices and certain sets of frequencies in the actual world. Though defined partly in terms of frequencies, FFF mechanistic probability avoids many drawbacks of well-known frequency theories and helps causally explain stable frequencies, which will usually be close to the values of mechanistic probabilities. I also argue that it's a virtue rather than a failing of FFF mechanistic probability that it does not define single-case chances, and compare some aspects of my interpretation to a recent interpretation proposed by Strevens.
This is a study in the meaning of natural language probability operators, sentential operators such as probably and likely. We ask what sort of formal structure is required to model the logic and semantics of these operators. Along the way we investigate their deep connections to indicative conditionals and epistemic modals, probe their scalar structure, observe their sensitivity to contextually salient contrasts, and explore some of their scopal idiosyncrasies.
A. J. Ayer was one of the foremost analytical philosophers of the twentieth century, and was known as a brilliant and engaging speaker. In essays based on his influential Dewey Lectures, Ayer addresses some of the most critical and controversial questions in epistemology and the philosophy of science, examining the nature of inductive reasoning and grappling with the issues that most concerned him as a philosopher. This edition contains revised and expanded versions of the lectures and two additional essays. Ayer begins by considering Hume's formulation of the problem of induction and then explores the inferences on which we base our beliefs in factual matters. In other essays, he defines the three kinds of probability that inform inductive reasoning and examines the various criteria for verifiability and falsifiability. In his extensive introduction, Graham Macdonald discusses the arguments in _Probability and Evidence_, how they relate to Ayer's other works, and their influence in contemporary philosophy. He also provides a brief biographical sketch of Ayer, and includes a bibliography of works about and in response to _Probability and Evidence_.
It is well known that classical, aka ‘sharp’, Bayesian decision theory, which models belief states as single probability functions, faces a number of serious difficulties with respect to its handling of agnosticism. These difficulties have led to the increasing popularity of so-called ‘imprecise’ models of decision-making, which represent belief states as sets of probability functions. In a recent paper, however, Adam Elga has argued in favour of a putative normative principle of sequential choice that he claims to be borne out by the sharp model but not by any promising incarnation of its imprecise counterpart. After first pointing out that Elga has fallen short of establishing that his principle is indeed uniquely borne out by the sharp model, I cast aspersions on its plausibility. I show that a slight weakening of the principle is satisfied by at least one, but interestingly not all, varieties of the imprecise model and point out that Elga has failed to motivate his stronger commitment.
Many have claimed that unspecific evidence sometimes demands unsharp, indeterminate, imprecise, vague, or interval-valued probabilities. Against this, a variant of the diachronic Dutch Book argument shows that perfectly rational agents always have perfectly sharp probabilities.
David Wallace has given a decision-theoretic argument for the Born Rule in the context of Everettian quantum mechanics. This approach promises to resolve some long-standing problems with probability in EQM, but it has faced plenty of resistance. One kind of objection charges that the requisite notion of decision-theoretic uncertainty is unavailable in the Everettian picture, so that the argument cannot gain any traction; another kind of objection grants the proof’s applicability and targets the premises. In this article I propose some novel principles connecting the physics of EQM with the metaphysics of modality, and argue that in the resulting framework the incoherence problem does not arise. These principles also help to justify one of the most controversial premises of Wallace’s argument, ‘branching indifference’. Absent any a priori reason to align the metaphysics with the physics in some other way, the proposed principles can be adopted on grounds of theoretical utility. The upshot is that Everettians can, after all, make clear sense of objective probability. Contents: 1 Introduction; 2 Setup; 3 Individualism versus Collectivism; 4 The Ingredients of Indexicalism; 5 Indexicalism and Incoherence (5.1 The trivialization problem; 5.2 The uncertainty problem); 6 Indexicalism and Branching Indifference (6.1 Introducing branching indifference; 6.2 The pragmatic defence of branching indifference; 6.3 The non-existence defence of branching indifference; 6.4 The indexicalist defence of branching indifference); 7 Conclusion.
_Probability: A Philosophical Introduction_ introduces and explains the principal concepts and applications of probability. It is intended for philosophers and others who want to understand probability as we all apply it in our working and everyday lives. The book is not a course in mathematical probability, of which it uses only the simplest results, and avoids all needless technicality. The role of probability in modern theories of knowledge, inference, induction, causation, laws of nature, action and decision-making makes an understanding of it especially important to philosophers and students of philosophy, to whom this book will be invaluable both as a textbook and a work of reference. In this book D. H. Mellor discusses the three basic kinds of probability – physical, epistemic, and subjective – and introduces and assesses the main theories and interpretations of them. The topics and concepts covered include chance, frequency, possibility, propensity, credence, confirmation, and Bayesianism. _Probability: A Philosophical Introduction_ is essential reading for all philosophy students and others who encounter or need to apply ideas of probability.
This book aims to discuss probability and David Hume's inductive scepticism. For the sceptical view which he took of inductive inference, Hume only ever gave one argument. That argument is the sole subject-matter of this book. The book is divided into three parts. Part one presents some remarks on probability. Part two identifies Hume's argument for inductive scepticism. Finally, the third part evaluates Hume's argument for inductive scepticism. Hume's argument, which holds that induction must be either deductively valid or circular because it is based on experience, neglects the possibility that induction is an argument of non-deductive logic (logical probability, in the sense of Keynes).
Enjoying great popularity in decision theory, epistemology, and philosophy of science, Bayesianism as understood here is fundamentally concerned with epistemically ideal rationality. It assumes a tight connection between evidential probability and ideally rational credence, and usually interprets evidential probability in terms of such credence. Timothy Williamson challenges Bayesianism by arguing that evidential probabilities cannot be adequately interpreted as the credences of an ideal agent. From this and his assumption that evidential probabilities cannot be interpreted as the actual credences of human agents either, he concludes that no interpretation of evidential probabilities in terms of credence is adequate. I argue to the contrary. My overarching aim is to show on behalf of Bayesians how one can still interpret evidential probabilities in terms of ideally rational credence and how one can maintain a tight connection between evidential probabilities and ideally rational credence even if the former cannot be interpreted in terms of the latter. By achieving this aim I illuminate the limits and prospects of Bayesianism.
This collection of essays is on the relation between probabilities, especially conditional probabilities, and conditionals. It provides negative results which sharply limit the ways conditionals can be related to conditional probabilities. There are also positive ideas and results which will open up areas of research. The collection is intended to honour Ernest W. Adams, whose seminal work is largely responsible for creating this area of inquiry. As well as describing, evaluating, and applying Adams's work, the contributions extend his ideas in directions he may or may not have anticipated, but that he certainly inspired. In addition to a wide range of philosophers of science, the volume should interest computer scientists and linguists.
We propose a new account of indicative conditionals, giving acceptability and logical closure conditions for them. We start from Adams’ Thesis: the claim that the acceptability of a simple indicative equals the corresponding conditional probability. The Thesis is widely endorsed, but arguably false and refuted by empirical research. To fix it, we submit, we need a relevance constraint: we accept a simple conditional 'If φ, then ψ' to the extent that (i) the conditional probability p(ψ|φ) is high, provided that (ii) φ is relevant for ψ. How (i) should work is well understood. It is (ii) that holds the key to improve our understanding of conditionals. Our account has (i) a probabilistic component, using Popper functions; (ii) a relevance component, given via an algebraic structure of topics or subject matters. We present a probabilistic logic for simple indicatives, and argue that its (in)validities are both theoretically desirable and in line with empirical results on how people reason with conditionals.
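The motivation for clause (ii) can be sketched with a "missing-link" conditional of my own construction (the paper's relevance notion is topic-based; the check below uses probabilistic independence only as a crude stand-in): the conditional probability is maximal, yet the antecedent tells us nothing about the consequent.

```python
from fractions import Fraction
from itertools import product

# Two independent fair coins. Consider "If coin 1 is heads, then coin 2 is heads or tails".
# p(psi|phi) = 1, so Adams' Thesis alone would deem the conditional fully acceptable,
# even though phi is entirely irrelevant to psi.
space = list(product(["H", "T"], repeat=2))
phi = lambda w: w[0] == "H"
psi = lambda w: w[1] in ("H", "T")  # trivially true of every world

p = lambda ev: Fraction(sum(1 for w in space if ev(w)), len(space))
p_phi_and_psi = Fraction(sum(1 for w in space if phi(w) and psi(w)), len(space))
p_psi_given_phi = p_phi_and_psi / p(phi)

# Crude probabilistic proxy for irrelevance: conditioning on phi leaves p(psi) unchanged.
irrelevant = p_psi_given_phi == p(psi)
print(p_psi_given_phi, irrelevant)  # 1 True
```

On the proposed account, such a conditional fails the relevance component despite satisfying the probabilistic one.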
Some have argued that chance and determinism are compatible in order to account for the objectivity of probabilities in theories that are compatible with determinism, like Classical Statistical Mechanics (CSM) and Evolutionary Theory (ET). Contrarily, some have argued that chance and determinism are incompatible, and so such probabilities are subjective. In this paper, I argue that both of these positions are unsatisfactory. I argue that the probabilities of theories like CSM and ET are not chances, but also that they are not subjective probabilities either. Rather, they are a third type of probability, which I call counterfactual probability. The main distinguishing feature of counterfactual probability is the role it plays in conveying important counterfactual information in explanations. This distinguishes counterfactual probability from chance as a second concept of objective probability.
We argue that a fashionable interpretation of the theory of natural selection as a claim exclusively about populations is mistaken. The interpretation rests on adopting an analysis of fitness as a probabilistic propensity which cannot be substantiated, draws parallels with thermodynamics which are without foundations, and fails to do justice to the fundamental distinction between drift and selection. This distinction requires a notion of fitness as a pairwise comparison between individuals taken two at a time, and so vitiates the interpretation of the theory as one about populations exclusively.
Decision theory and the theory of rational choice have recently been the subjects of considerable research by philosophers and economists. However, no adequate anthology exists which can be used to introduce students to the field. This volume is designed to meet that need. The essays included are organized into five parts covering the foundations of decision theory, the conceptualization of probability and utility, philosophical difficulties with the rules of rationality and with the assessment of probability, and causal decision theory. The editors provide an extensive introduction to the field and introductions to each part.
This book is meant to be a primer, that is, an introduction, to probability logic, a subject that appears to be in its infancy. Probability logic is a subject envisioned by Hans Reichenbach and largely created by Adams. It treats conditionals as bearers of conditional probabilities and discusses an appropriate sense of validity for arguments with such conditionals, as well as ordinary statements, as premisses. This is a clear, well-written text on the subject of probability logic, suitable for advanced undergraduates or graduates, but also of interest to professional philosophers. There are well-thought-out exercises, and a number of advanced topics treated in appendices, while some are brought up in exercises and some are alluded to only in footnotes. By this means, it is hoped that the reader will at least be made aware of most of the important ramifications of the subject and its tie-ins with current research, and will have some indications concerning recent and relevant literature.
The epistemic probability of A given B is the degree to which B evidentially supports A, or makes A plausible. This paper is a first step in answering the question of what determines the values of epistemic probabilities. I break this question into two parts: the structural question and the substantive question. Just as an object’s weight is determined by its mass and gravitational acceleration, some probabilities are determined by other, more basic ones. The structural question asks what probabilities are not determined in this way--these are the basic probabilities which determine values for all other probabilities. The substantive question asks how the values of these basic probabilities are determined. I defend an answer to the structural question on which basic probabilities are the probabilities of atomic propositions conditional on potential direct explanations. I defend this against the view, implicit in orthodox mathematical treatments of probability, that basic probabilities are the unconditional probabilities of complete worlds. I then apply my answer to the structural question to clear up common confusions in expositions of Bayesianism and shed light on the “problem of the priors.”
We explore ways in which purely qualitative belief change in the AGM tradition throws light on options in the treatment of conditional probability. First, by helping us see why it can be useful to go beyond the ratio rule defining conditional from one-place probability. Second, by clarifying what is at stake in different ways of doing that. Third, by suggesting novel forms of conditional probability corresponding to familiar variants of qualitative belief change, and conversely. Likewise, we explain how recent work on the qualitative part of probabilistic inference leads to a very broad class of 'proto-probability' functions.
I present a proof of the quantum probability rule from decision-theoretic assumptions, in the context of the Everett interpretation. The basic ideas behind the proof are those presented in Deutsch's recent proof of the probability rule, but the proof is simpler and proceeds from weaker decision-theoretic assumptions. This makes it easier to discuss the conceptual ideas involved in the proof, and to show that they are defensible.
The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. _Philosophical Theories of Probability_ is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.
Subjective probability plays an increasingly important role in many fields concerned with human cognition and behavior. Yet there have been significant criticisms of the idea that probabilities could actually be represented in the mind. This paper presents and elaborates a view of subjective probability as a kind of sampling propensity associated with internally represented generative models. The resulting view answers some of the most well-known criticisms of subjective probability, and is also supported by empirical work in neuroscience and behavioral psychology. The repercussions of the view for how we conceive of many ordinary instances of subjective probability, and how it relates to more traditional conceptions of subjective probability, are discussed in some detail.
It is possible that a fair coin tossed infinitely many times will always land heads. So the probability of such a sequence of outcomes should, intuitively, be positive, albeit minuscule: 0 probability ought to be reserved for impossible events. And, furthermore, since the tosses are independent and the probability of heads (and tails) on a single toss is half, all sequences are equiprobable. But Williamson has adduced an argument that purports to show that our intuitions notwithstanding, the probability of an infinite sequence is 0. In this paper, I rebut his argument.
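The tension the abstract describes can be seen numerically: the probability that n independent fair tosses all land heads is (1/2)^n, which stays strictly positive for every finite n while tending to 0 as n grows (a minimal illustration, not part of either side's argument about the infinite case).

```python
# Probability of n consecutive heads with a fair coin: (1/2)**n.
# Positive for every finite n, but vanishing in the limit -- the finite
# shadow of the dispute over the infinite sequence.
for n in [1, 10, 100]:
    print(n, 0.5 ** n)

# Even at n = 100 the value (~7.9e-31) is astronomically small yet nonzero;
# the contested question is what to say when n is actually infinite.
```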
Non-Archimedean probability functions allow us to combine regularity with perfect additivity. We discuss the philosophical motivation for a particular choice of axioms for a non-Archimedean probability theory and answer some philosophical objections that have been raised against infinitesimal probabilities in general. Contents: 1 Introduction; 2 The Limits of Classical Probability Theory (2.1 Classical probability functions; 2.2 Limitations; 2.3 Infinitesimals to the rescue?); 3 NAP Theory (3.1 First four axioms of NAP; 3.2 Continuity and conditional probability; 3.3 The final axiom of NAP; 3.4 Infinite sums; 3.5 Definition of NAP functions via infinite sums; 3.6 Relation to numerosity theory); 4 Objections and Replies (4.1 Cantor and the Archimedean property; 4.2 Ticket missing from an infinite lottery; 4.3 Williamson’s infinite sequence of coin tosses; 4.4 Point sets on a circle; 4.5 Easwaran and Pruss); 5 Dividends (5.1 Measure and utility; 5.2 Regularity and uniformity; 5.3 Credence and chance; 5.4 Conditional probability); 6 General Considerations (6.1 Non-uniqueness; 6.2 Invariance); Appendix.
This article outlines a theory of naive probability. According to the theory, individuals who are unfamiliar with the probability calculus can infer the probabilities of events in an extensional way: They construct mental models of what is true in the various possibilities. Each model represents an equiprobable alternative unless individuals have beliefs to the contrary, in which case some models will have higher probabilities than others. The probability of an event depends on the proportion of models in which it occurs. The theory predicts several phenomena of reasoning about absolute probabilities, including typical biases. It correctly predicts certain cognitive illusions in inferences about relative probabilities. It accommodates reasoning based on numerical premises, and it explains how naive reasoners can infer posterior probabilities without relying on Bayes's theorem. Finally, it dispels some common misconceptions of probabilistic reasoning.
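The extensional recipe the abstract describes can be sketched directly (the dice example and function name below are illustrative choices, not from the article): enumerate the possibilities as equiprobable models and take an event's probability to be the proportion of models in which it holds.

```python
from fractions import Fraction

# Mental models for two dice: 36 equiprobable possibilities.
models = [(a, b) for a in range(1, 7) for b in range(1, 7)]

def naive_probability(event):
    """Probability of an event as the proportion of models in which it occurs."""
    return Fraction(sum(1 for m in models if event(m)), len(models))

p_sum_seven = naive_probability(lambda m: m[0] + m[1] == 7)
print(p_sum_seven)  # 1/6: six of the 36 models sum to seven
```

On the theory, unequal beliefs would instead weight some models more heavily than others; the uniform case is the default.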
The aim of the paper is to draw a connection between a semantical theory of conditional statements and the theory of conditional probability. First, the probability calculus is interpreted as a semantics for truth functional logic. Absolute probabilities are treated as degrees of rational belief. Conditional probabilities are explicitly defined in terms of absolute probabilities in the familiar way. Second, the probability calculus is extended in order to provide an interpretation for counterfactual probabilities--conditional probabilities where the condition has zero probability. Third, conditional propositions are introduced as propositions whose absolute probability is equal to the conditional probability of the consequent on the antecedent. An axiom system for this conditional connective is recovered from the probabilistic definition. Finally, the primary semantics for this axiom system, presented elsewhere, is related to the probabilistic interpretation.
Isn't probability 1 certainty? If the probability is objective, so is the certainty: whatever has chance 1 of occurring is certain to occur. Equivalently, whatever has chance 0 of occurring is certain not to occur. If the probability is subjective, so is the certainty: if you give credence 1 to an event, you are certain that it will occur. Equivalently, if you give credence 0 to an event, you are certain that it will not occur. And so on for other kinds of probability, such as evidential probability. The formal analogue of this picture is the regularity constraint: a probability distribution over sets of possibilities is regular just in case it assigns probability 0 only to the null set, and therefore probability 1 only to the set of all possibilities.
Stalnaker's Thesis about indicative conditionals is, roughly, that the probability one ought to assign to an indicative conditional equals the probability that one ought to assign to its consequent conditional on its antecedent. The thesis seems right. If you draw a card from a standard 52-card deck, how confident are you that the card is a diamond if it's a red card? To answer this, you calculate the proportion of red cards that are diamonds -- that is, you calculate the probability of drawing a diamond conditional on drawing a red card. Skyrms' Thesis about counterfactual conditionals is, roughly, that the probability that one ought to assign to a counterfactual equals one's rational expectation of the chance, at a relevant past time, of its consequent conditional on its antecedent. This thesis also seems right. If you decide not to enter a 100-ticket lottery, how confident are you that you would have won had you bought a ticket? To answer this, you calculate the prior chance -- that is, the chance just before your decision not to buy a ticket -- of winning conditional on entering the lottery. The central project of this article is to develop a new uniform theory of conditionals that allows us to derive a version of Skyrms' Thesis from a version of Stalnaker's Thesis, together with a chance-deference norm relating rational credence to beliefs about objective chance.
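The two worked examples in the abstract reduce to simple proportions, assuming a standard 52-card deck and a fair 100-ticket lottery:

```python
from fractions import Fraction

# Stalnaker's Thesis example: proportion of red cards that are diamonds.
red_cards, diamonds = 26, 13
p_diamond_given_red = Fraction(diamonds, red_cards)

# Skyrms' Thesis example: prior chance of winning conditional on entering
# a fair 100-ticket lottery with one winning ticket.
tickets = 100
prior_chance_win_given_entry = Fraction(1, tickets)

print(p_diamond_given_red, prior_chance_win_given_entry)  # 1/2 1/100
```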