David Wallace has given a decision-theoretic argument for the Born Rule in the context of Everettian quantum mechanics. This approach promises to resolve some long-standing problems with probability in EQM, but it has faced plenty of resistance. One kind of objection charges that the requisite notion of decision-theoretic uncertainty is unavailable in the Everettian picture, so that the argument cannot gain any traction; another kind of objection grants the proof's applicability and targets the premises. In this article I propose some novel principles connecting the physics of EQM with the metaphysics of modality, and argue that in the resulting framework the incoherence problem does not arise. These principles also help to justify one of the most controversial premises of Wallace's argument, 'branching indifference'. Absent any a priori reason to align the metaphysics with the physics in some other way, the proposed principles can be adopted on grounds of theoretical utility. The upshot is that Everettians can, after all, make clear sense of objective probability. 1 Introduction 2 Setup 3 Individualism versus Collectivism 4 The Ingredients of Indexicalism 5 Indexicalism and Incoherence 5.1 The trivialization problem 5.2 The uncertainty problem 6 Indexicalism and Branching Indifference 6.1 Introducing branching indifference 6.2 The pragmatic defence of branching indifference 6.3 The non-existence defence of branching indifference 6.4 The indexicalist defence of branching indifference 7 Conclusion
I describe a realist, ontologically objective interpretation of probability, "far-flung frequency (FFF) mechanistic probability". FFF mechanistic probability is defined in terms of facts about the causal structure of devices and certain sets of frequencies in the actual world. Though defined partly in terms of frequencies, FFF mechanistic probability avoids many drawbacks of well-known frequency theories and helps causally explain stable frequencies, which will usually be close to the values of mechanistic probabilities. I also argue that it's a virtue rather than a failing of FFF mechanistic probability that it does not define single-case chances, and compare some aspects of my interpretation to a recent interpretation proposed by Strevens.
There is a vast literature that seeks to uncover features underlying moral judgment by eliciting reactions to hypothetical scenarios such as trolley problems. These thought experiments assume that participants accept the outcomes stipulated in the scenarios. Across seven studies, we demonstrate that intuition overrides stipulated outcomes even when participants are explicitly told that an action will result in a particular outcome. Participants instead substitute their own estimates of the probability of outcomes for stipulated outcomes, and these probability estimates in turn influence moral judgments. Our findings demonstrate that intuitive likelihoods are one critical factor in moral judgment, one that is not suspended even in moral dilemmas that explicitly stipulate outcomes. Features thought to underlie moral reasoning, such as intention, may operate, in part, by affecting the intuitive likelihood of outcomes, and, problematically, moral differences between scenarios may be confounded with non-moral intuitive probabilities.
Early work on the frequency theory of probability made extensive use of the notion of randomness, conceived of as a property possessed by disorderly collections of outcomes. Growing out of this work, a rich mathematical literature on algorithmic randomness and Kolmogorov complexity developed through the twentieth century, but largely lost contact with the philosophical literature on physical probability. The present chapter begins with a clarification of the notions of randomness and probability, conceiving of the former as a property of a sequence of outcomes, and the latter as a property of the process generating those outcomes. A discussion follows of the nature and limits of the relationship between the two notions, with largely negative verdicts on the prospects for any reduction of one to the other, although the existence of an apparently random sequence of outcomes is good evidence for the involvement of a genuinely chancy process.
This book explores a question central to philosophy--namely, what does it take for a belief to be justified or rational? According to a widespread view, whether one has justification for believing a proposition is determined by how probable that proposition is, given one's evidence. In this book this view is rejected and replaced with another: in order for one to have justification for believing a proposition, one's evidence must normically support it--roughly, one's evidence must make the falsity of that proposition abnormal in the sense of calling for special, independent explanation. This conception of justification bears upon a range of topics in epistemology and beyond. Ultimately, this way of looking at justification guides us to a new, unfamiliar picture of how we should respond to our evidence and manage our own fallibility. This picture is developed here.
The book was planned and written as a single, sustained argument. But earlier versions of a few parts of it have appeared separately. The object of this book is both to establish the existence of the paradoxes, and also to describe a non-Pascalian concept of probability in terms of which one can analyse the structure of forensic proof without giving rise to such typical signs of theoretical misfit. Neither the complementational principle for negation nor the multiplicative principle for conjunction applies to the central core of any forensic proof in the Anglo-American legal system. There are four parts included in this book. Accordingly, these parts have been written in such a way that they may be read in different orders by different kinds of reader.
This paper develops an information-sensitive theory of the semantics and probability of conditionals and statements involving epistemic modals. The theory validates a number of principles linking probability and modality, including the principle that the probability of a conditional If A, then C equals the probability of C, updated with A. The theory avoids so-called triviality results, which are standardly taken to show that principles of this sort cannot be validated. To achieve this, we deny that rational agents update their credences via conditionalization. We offer a new rule of update, Hyperconditionalization, which agrees with Conditionalization whenever nonmodal statements are at stake but differs for modal and conditional sentences.
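The principle the theory validates, that the probability of "If A, then C" equals the probability of C updated with A, reduces in nonmodal cases to the familiar ratio rule. A minimal sketch over a toy probability space (the worlds and weights here are invented for illustration, not taken from the paper):

```python
from fractions import Fraction

# Toy probability space: four worlds assigning truth values to the
# nonmodal statements A and C, with unequal weights summing to 1.
worlds = [
    {"A": True,  "C": True},
    {"A": True,  "C": False},
    {"A": False, "C": True},
    {"A": False, "C": False},
]
weights = [Fraction(2, 6), Fraction(1, 6), Fraction(1, 6), Fraction(2, 6)]

def prob(pred):
    """Probability of the set of worlds satisfying pred."""
    return sum(w for world, w in zip(worlds, weights) if pred(world))

# P(C | A): the probability of C after updating with A (ratio rule).
p_c_given_a = prob(lambda w: w["A"] and w["C"]) / prob(lambda w: w["A"])

print(p_c_given_a)  # 2/3
```

On the theory described, this value would also be the probability of the conditional itself; the paper's contribution is extending such an equality to modal and conditional sentences without triviality.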
Richard Jeffrey is beyond dispute one of the most distinguished and influential philosophers working in the field of decision theory and the theory of knowledge. His work is distinctive in showing the interplay of epistemological concerns with probability and utility theory. Not only has he made use of standard probabilistic and decision theoretic tools to clarify concepts of evidential support and informed choice, he has also proposed significant modifications of the standard Bayesian position in order that it provide a better fit with actual human experience. Probability logic is viewed not as a source of judgment but as a framework for explaining the implications of probabilistic judgments and their mutual compatibility. This collection of essays spans a period of some 35 years and includes what have become some of the classic works in the literature. There is also one completely new piece, while in many instances Jeffrey includes afterthoughts on the older essays.
First issued in translation as a two-volume work in 1975, this classic book provides the first complete development of the theory of probability from a subjectivist viewpoint. It proceeds from a detailed discussion of the philosophical mathematical aspects to a detailed mathematical treatment of probability and statistics. De Finetti's theory of probability is one of the foundations of Bayesian theory. De Finetti stated that probability is nothing but a subjective analysis of the likelihood that something will happen, and that probability does not exist outside the mind. It is the rate at which a person is willing to bet on something happening. This view is directly opposed to the classicist/frequentist view of the likelihood of a particular outcome of an event, which assumes that the same event could be identically repeated many times over, and the 'probability' of a particular outcome has to do with the fraction of the time that outcome results from the repeated trials.
This book offers a concise survey of basic probability theory from a thoroughly subjective point of view whereby probability is a mode of judgment. Written by one of the greatest figures in the field of probability theory, the book is both a summation and synthesis of a lifetime of wrestling with these problems and issues. After an introduction to basic probability theory, there are chapters on scientific hypothesis-testing, on changing your mind in response to generally uncertain observations, on expectations of the values of random variables, on de Finetti's dissolution of the so-called problem of induction, and on decision theory.
This is the first volume of a two-volume work on Probability and Induction. Because the writer holds that probability logic is identical with inductive logic, this work is devoted to philosophical problems concerning the nature of probability and inductive reasoning. The author rejects a statistical frequency basis for probability in favor of a logical relation between two statements or propositions. Probability "is the degree of confirmation of a hypothesis (or conclusion) on the basis of some given evidence (or premises)." Furthermore, all principles and theorems of inductive logic are analytic, and the entire system is to be constructed by means of symbolic logic and semantic methods. This means that the author confines himself to the formalistic procedures of word and symbol systems. The resulting sentence or language structures are presumed to separate off logic from all subjectivist or psychological elements. Despite the abstractionism, the claim is made that if an inductive probability system of logic can be constructed it will have its practical application in mathematical statistics, and in various sciences. 16-page bibliography.
This work has been selected by scholars as being culturally important and is part of the knowledge base of civilization as we know it. This work is in the public domain in the United States of America, and possibly other nations. Within the United States, you may freely copy and distribute this work, as no entity has a copyright on the body of the work. Scholars believe, and we concur, that this work is important enough to be preserved, reproduced, and made generally available to the public. To ensure a quality reading experience, this work has been proofread and republished using a format that seamlessly blends the original graphical elements with text in an easy-to-read typeface.
Probability is increasingly important for our understanding of the world. What is probability? How do we model it, and how do we use it? Timothy Childers presents a lively introduction to the foundations of probability and to philosophical issues it raises. He keeps technicalities to a minimum, and assumes no prior knowledge of the subject.
This reissue of D. A. Gillies' highly influential work, first published in 1973, is a philosophical theory of probability which seeks to develop von Mises' views on the subject. In agreement with von Mises, the author regards probability theory as a mathematical science like mechanics or electrodynamics, and probability as an objective, measurable concept like force, mass or charge. On the other hand, Dr Gillies rejects von Mises' definition of probability in terms of limiting frequency and claims that probability should be taken as a primitive or undefined term in accordance with modern axiomatic approaches. This of course raises the problem of how the abstract calculus of probability should be connected with the 'actual world of experiments'. It is suggested that this link should be established, not by a definition of probability, but by an application of Popper's concept of falsifiability. In addition to formulating his own interesting theory, Dr Gillies gives a detailed criticism of the generally accepted Neyman-Pearson theory of testing, as well as of alternative philosophical approaches to probability theory. The reissue will be of interest both to philosophers with no previous knowledge of probability theory and to mathematicians interested in the foundations of probability theory and statistics.
I define a concept of causal probability and apply it to questions about the role of probability in evolutionary processes. Causal probability is defined in terms of manipulation of patterns in empirical outcomes by manipulating properties that realize objective probabilities. The concept of causal probability allows us to see how probabilities characterized by different interpretations of probability can share a similar causal character, and does so in such a way as to allow new inferences about relationships between probabilities realized in different chance setups. I clarify relations between probabilities and properties defined in terms of them, and argue that certain widespread uses of computer simulations in evolutionary biology show that many probabilities relevant to evolutionary outcomes are causal probabilities. This supports the claim that higher-level properties such as biological fitness and processes such as natural selection are causal properties and processes, contrary to what some authors have argued.
In this influential study of central issues in the philosophy of science, Paul Horwich elaborates on an important conception of probability, diagnosing the failure of previous attempts to resolve these issues as stemming from a too-rigid conception of belief. Adopting a Bayesian strategy, he argues for a probabilistic approach, yielding a more complete understanding of the characteristics of scientific reasoning and methodology. Presented in a fresh twenty-first-century series livery, and including a specially commissioned preface written by Colin Howson, illuminating its enduring importance and relevance to philosophical enquiry, this engaging work has been revived for a new generation of readers.
The book contains the transcription of a course on the foundations of probability given by the Italian mathematician Bruno de Finetti in 1979 at the "National Institute of Advanced Mathematics" in Rome.
With this treatise, an insightful exploration of the probabilistic connection between philosophy and the history of science, the famous economist breathed new life into studies of both disciplines. Originally published in 1921, this important mathematical work represented a significant contribution to the theory regarding the logical probability of propositions. Keynes effectively dismantled the classical theory of probability, launching what has since been termed the "logical-relationist" theory. In so doing, he explored the logical relationships between classifying a proposition as "highly probable" and as a "justifiable induction." Unabridged republication of the classic 1921 edition.
This is a study in the meaning of natural language probability operators, sentential operators such as probably and likely. We ask what sort of formal structure is required to model the logic and semantics of these operators. Along the way we investigate their deep connections to indicative conditionals and epistemic modals, probe their scalar structure, observe their sensitivity to contextually salient contrasts, and explore some of their scopal idiosyncrasies.
The Empire of Chance tells how quantitative ideas of chance transformed the natural and social sciences, as well as daily life over the last three centuries. A continuous narrative connects the earliest application of probability and statistics in gambling and insurance to the most recent forays into law, medicine, polling and baseball. Separate chapters explore the theoretical and methodological impact in biology, physics and psychology. Themes recur - determinism, inference, causality, free will, evidence, the shifting meaning of probability - but in dramatically different disciplinary and historical contexts. In contrast to the literature on the mathematical development of probability and statistics, this book centres on how these technical innovations remade our conceptions of nature, mind and society. Written by an interdisciplinary team of historians and philosophers, this readable, lucid account keeps technical material to an absolute minimum. It is aimed not only at specialists in the history and philosophy of science, but also at the general reader and scholars in other disciplines.
According to what is now commonly referred to as "the Equation" in the literature on indicative conditionals, the probability of any indicative conditional equals the probability of its consequent given its antecedent. Philosophers widely agree in their assessment that the triviality arguments of Lewis and others have conclusively shown the Equation to be tenable only at the expense of the view that indicative conditionals express propositions. This study challenges the correctness of that assessment by presenting data that cast doubt on an assumption underlying all triviality arguments.
This chapter will review selected aspects of the terrain of discussions about probabilities in statistical mechanics (with no pretensions to exhaustiveness, though the major issues will be touched upon), and will argue for a number of claims. None of the claims to be defended is entirely original, but all deserve emphasis. The first, and least controversial, is that probabilistic notions are needed to make sense of statistical mechanics. The reason for this is the same reason that convinced Maxwell, Gibbs, and Boltzmann that probabilities would be needed, namely, that the second law of thermodynamics, which in its original formulation says that certain processes are impossible, must, on the kinetic theory, be replaced by a weaker formulation according to which what the original version deems impossible is merely improbable. Second is that we ought not take the standard measures invoked in equilibrium statistical mechanics as giving, in any sense, the correct probabilities about microstates of the system. We can settle for a much weaker claim: that the probabilities for outcomes of experiments yielded by the standard distributions are effectively the same as those yielded by any distribution that we should take as representing probabilities over microstates. Lastly, (and most controversially): in asking about the status of probabilities in statistical mechanics, the familiar dichotomy between epistemic probabilities (credences, or degrees of belief) and ontic (physical) probabilities is insufficient; the concept of probability that is best suited to the needs of statistical mechanics is one that combines epistemic and physical considerations.
The Ramseyan thesis that the probability of an indicative conditional is equal to the corresponding conditional probability of its consequent given its antecedent is both widely confirmed and subject to attested counterexamples (e.g., McGee 2000, Kaufmann 2004). This raises several puzzling questions. For instance, why are there interpretations of conditionals that violate this Ramseyan thesis in certain contexts, and why are they otherwise very rare? In this paper, I raise some challenges to Stefan Kaufmann's account of why the Ramseyan thesis sometimes fails, and motivate my own theory. On my theory, the proposition expressed by an indicative conditional is partially determined by a background partition, and hence its probability depends on the choice of such a partition. I hold that this background partition is contextually determined, and in certain conditions is set by a salient question under discussion in the context. I show how the resulting theory offers compelling answers to the puzzling questions raised by failures of the Ramseyan thesis.
Many have argued that a rational agent's attitude towards a proposition may be better represented by a probability range than by a single number. I show that in such cases an agent will have unstable betting behaviour, and so will behave in an unpredictable way. I use this point to argue against a range of responses to the ‘two bets’ argument for sharp probabilities.
Self-taught mathematician and father of Boolean algebra, George Boole (1815-1864) published An Investigation of the Laws of Thought in 1854. In this highly original investigation of the fundamental laws of human reasoning, a sequel to ideas he had explored in earlier writings, Boole uses the symbolic language of mathematics to establish a method to examine the nature of the human mind using logic and the theory of probabilities. Boole considers language not just as a mode of expression, but as a system one can use to understand the human mind. In the first 12 chapters, he sets down the rules necessary to represent logic in this unique way. Then he analyses a variety of arguments and propositions of various writers from Aristotle to Spinoza. One of history's most insightful mathematicians, Boole is compelling reading for today's student of intellectual history and the science of the mind.
It is well known that classical, aka 'sharp', Bayesian decision theory, which models belief states as single probability functions, faces a number of serious difficulties with respect to its handling of agnosticism. These difficulties have led to the increasing popularity of so-called 'imprecise' models of decision-making, which represent belief states as sets of probability functions. In a recent paper, however, Adam Elga has argued in favour of a putative normative principle of sequential choice that he claims to be borne out by the sharp model but not by any promising incarnation of its imprecise counterpart. After first pointing out that Elga has fallen short of establishing that his principle is indeed uniquely borne out by the sharp model, I cast aspersions on its plausibility. I show that a slight weakening of the principle is satisfied by at least one, but interestingly not all, varieties of the imprecise model and point out that Elga has failed to motivate his stronger commitment.
This chapter explores the topic of imprecise probabilities (IP) as it relates to model validation. IP is a family of formal methods that aim to provide a better representation of severe uncertainty than is possible with standard probabilistic methods. Among the methods discussed here are using sets of probabilities to represent uncertainty, and using functions that do not satisfy the additivity property. We discuss the basics of IP, some examples of IP in computer simulation contexts, possible interpretations of the IP framework and some conceptual problems for the approach. We conclude with a discussion of IP in the context of model validation.
Some have argued that chance and determinism are compatible in order to account for the objectivity of probabilities in theories that are compatible with determinism, like Classical Statistical Mechanics (CSM) and Evolutionary Theory (ET). Contrarily, some have argued that chance and determinism are incompatible, and so such probabilities are subjective. In this paper, I argue that both of these positions are unsatisfactory. I argue that the probabilities of theories like CSM and ET are not chances, but also that they are not subjective probabilities either. Rather, they are a third type of probability, which I call counterfactual probability. The main distinguishing feature of counterfactual probability is the role it plays in conveying important counterfactual information in explanations. This distinguishes counterfactual probability from chance as a second concept of objective probability.
We report a solution to an open problem regarding the axiomatization of the convex hull of a type of nonclassical evaluations. We then investigate the meaning of this result for the larger context of the relation between rational credence functions and nonclassical probability. We claim that the notions of bets and Dutch Books typically employed in formal epistemology are of doubtful use outside the realm of classical logic, eventually proposing two novel ways of understanding Dutch Books in nonclassical settings.
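The classical notion of a Dutch Book the paper interrogates can be made concrete: in a two-valued setting, credences in A and not-A that sum to more than 1 expose the agent to a set of individually acceptable bets that guarantees a net loss. A minimal sketch (the numbers are invented; note that the sure loss depends on exactly one of A, not-A holding in every possibility, which is just the classical assumption that nonclassical evaluations need not guarantee):

```python
from fractions import Fraction

# Incoherent credences: cr(A) + cr(not-A) = 6/5 > 1, violating
# finite additivity over a classical partition.
cr_A = Fraction(6, 10)
cr_not_A = Fraction(6, 10)

def net_payoff(a_is_true):
    """Agent buys a $1 bet on A at price cr(A) and a $1 bet on
    not-A at price cr(not-A); exactly one bet pays out in any
    classical (two-valued) possibility."""
    winnings = Fraction(1) if a_is_true else Fraction(0)
    winnings += Fraction(0) if a_is_true else Fraction(1)
    return winnings - (cr_A + cr_not_A)

# The same sure loss of 1/5 in both worlds: a Dutch Book.
print(net_payoff(True), net_payoff(False))
```

If evaluations are nonclassical, the total winnings need not equal 1 in every possibility, and the bookie's guarantee evaporates; hence the paper's proposal of novel ways of understanding Dutch Books in nonclassical settings.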
Many have claimed that unspecific evidence sometimes demands unsharp, indeterminate, imprecise, vague, or interval-valued probabilities. Against this, a variant of the diachronic Dutch Book argument shows that perfectly rational agents always have perfectly sharp probabilities.
Non-Archimedean probability functions allow us to combine regularity with perfect additivity. We discuss the philosophical motivation for a particular choice of axioms for a non-Archimedean probability theory and answer some philosophical objections that have been raised against infinitesimal probabilities in general. 1 Introduction 2 The Limits of Classical Probability Theory 2.1 Classical probability functions 2.2 Limitations 2.3 Infinitesimals to the rescue? 3 NAP Theory 3.1 First four axioms of NAP 3.2 Continuity and conditional probability 3.3 The final axiom of NAP 3.4 Infinite sums 3.5 Definition of NAP functions via infinite sums 3.6 Relation to numerosity theory 4 Objections and Replies 4.1 Cantor and the Archimedean property 4.2 Ticket missing from an infinite lottery 4.3 Williamson's infinite sequence of coin tosses 4.4 Point sets on a circle 4.5 Easwaran and Pruss 5 Dividends 5.1 Measure and utility 5.2 Regularity and uniformity 5.3 Credence and chance 5.4 Conditional probability 6 General Considerations 6.1 Non-uniqueness 6.2 Invariance Appendix
The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. _Philosophical Theories of Probability_ is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.
Contents: Part I. Probability and the Idealizational Theory of Science. Marek GAUL: Statistical dependencies, statements and the idealizational theory of science. Part II. Probability - theoretical concepts in psychology - measurement. Douglas WAHLSTEN: Probability and the understanding of individual differences. Bodo KRAUSE: Modeling cognitive learning steps. Dieter HEYER, and Rainer MAUSFELD: A theoretical and experimental inquiry into the relation of theoretical concepts and probabilistic measurement scales in experimental psychology. Part III. Methods of data analysis. Tadeusz B. IWINSKI: Rough set methods in psychology. Wilma KOUTSTAAL, and Robert ROSENTHAL: Contrast analysis in behavioral research. Part IV. Artifacts in psychological research and diagnostic assessment. David B. STROHMETZ, and Ralph L. ROSNOW: A mediational model of research artifacts. Jerzy BRZEZINSKI: Dimensions of diagnostic space.
We argue that a fashionable interpretation of the theory of natural selection as a claim exclusively about populations is mistaken. The interpretation rests on adopting an analysis of fitness as a probabilistic propensity which cannot be substantiated, draws parallels with thermodynamics which are without foundations, and fails to do justice to the fundamental distinction between drift and selection. This distinction requires a notion of fitness as a pairwise comparison between individuals taken two at a time, and so vitiates the interpretation of the theory as one about populations exclusively.
This paper is concerned with the causally symmetric version of the familiar de Broglie–Bohm interpretation, this version allowing the spacelike nonlocality and the configuration space ontology of the original model to be avoided via the addition of retrocausality. Two different features of this alternative formulation are considered here. With regard to probabilities, it is shown that the model provides a derivation of the Born rule identical to that in Bohm's original formulation. This derivation holds just as well for a many-particle, entangled state as for a single particle. With regard to "certainties", the description of a particle's spin is examined within the model and it is seen that a statistical description is no longer necessary once final boundary conditions are specified in addition to the usual initial state, with the particle then possessing a definite value for every spin component at intermediate times. These values are consistent with being the components of a single, underlying spin vector. The case of a two-particle entangled spin state is also examined and it is found that, due to the retrocausal aspect, each particle possesses its own definite spin during the entanglement, independent of the other particle. In formulating this picture, it is demonstrated how such a realistic model can preserve Lorentz invariance in the face of Bell's theorem and avoid the need for a preferred reference frame.
This paper is about teaching probability to students of philosophy who don't aim to do primarily formal work in their research. These students are unlikely to seek out classes about probability or formal epistemology for various reasons, for example because they don't realize that this knowledge would be useful for them or because they are intimidated by the material. However, most areas of philosophy now contain debates that incorporate probability, and basic knowledge of it is essential even for philosophers whose work isn't primarily formal. In this paper, I explain how to teach probability to students who are not already enthusiastic about formal philosophy, taking into account the common phenomena of math anxiety and the lack of reading skills for formal texts. I address course design, lesson design, and assignment design. Most of my recommendations also apply to teaching formal methods other than probability theory.
We explore ways in which purely qualitative belief change in the AGM tradition throws light on options in the treatment of conditional probability. First, by helping us see why it can be useful to go beyond the ratio rule defining conditional from one-place probability. Second, by clarifying what is at stake in different ways of doing that. Third, by suggesting novel forms of conditional probability corresponding to familiar variants of qualitative belief change, and conversely. Likewise, we explain how recent work on the qualitative part of probabilistic inference leads to a very broad class of 'proto-probability' functions.
Subjective probability plays an increasingly important role in many fields concerned with human cognition and behavior. Yet there have been significant criticisms of the idea that probabilities could actually be represented in the mind. This paper presents and elaborates a view of subjective probability as a kind of sampling propensity associated with internally represented generative models. The resulting view answers to some of the most well known criticisms of subjective probability, and is also supported by empirical work in neuroscience and behavioral psychology. The repercussions of the view for how we conceive of many ordinary instances of subjective probability, and how it relates to more traditional conceptions of subjective probability, are discussed in some detail.
This article outlines a theory of naive probability. According to the theory, individuals who are unfamiliar with the probability calculus can infer the probabilities of events in an extensional way: They construct mental models of what is true in the various possibilities. Each model represents an equiprobable alternative unless individuals have beliefs to the contrary, in which case some models will have higher probabilities than others. The probability of an event depends on the proportion of models in which it occurs. The theory predicts several phenomena of reasoning about absolute probabilities, including typical biases. It correctly predicts certain cognitive illusions in inferences about relative probabilities. It accommodates reasoning based on numerical premises, and it explains how naive reasoners can infer posterior probabilities without relying on Bayes's theorem. Finally, it dispels some common misconceptions of probabilistic reasoning.
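The extensional procedure described in the abstract can be sketched directly: enumerate fully explicit models of the possibilities, treat them as equiprobable absent contrary beliefs, and read off the probability of an event as the proportion of models in which it occurs. A minimal sketch (the propositions "rain" and "wind" are invented for illustration):

```python
from itertools import product

# Mental models: fully explicit possibilities over two propositions,
# treated as equiprobable by a naive reasoner.
models = [dict(zip(("rain", "wind"), vals))
          for vals in product([True, False], repeat=2)]

def naive_probability(event):
    """Proportion of equiprobable mental models in which the event occurs."""
    holds = [m for m in models if event(m)]
    return len(holds) / len(models)

print(naive_probability(lambda m: m["rain"]))               # 0.5
print(naive_probability(lambda m: m["rain"] or m["wind"]))  # 0.75
```

On the theory, biases arise when reasoners fail to flesh out all the models or assign unequal weights; the equiprobability default above is only the baseline case.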
_Probability: A Philosophical Introduction_ introduces and explains the principal concepts and applications of probability. It is intended for philosophers and others who want to understand probability as we all apply it in our working and everyday lives. The book is not a course in mathematical probability, of which it uses only the simplest results, and avoids all needless technicality. The role of probability in modern theories of knowledge, inference, induction, causation, laws of nature, action and decision-making makes an understanding of it especially important to philosophers and students of philosophy, to whom this book will be invaluable both as a textbook and a work of reference. In this book D. H. Mellor discusses the three basic kinds of probability – physical, epistemic, and subjective – and introduces and assesses the main theories and interpretations of them. The topics and concepts covered include: * chance * frequency * possibility * propensity * credence * confirmation * Bayesianism. _Probability: A Philosophical Introduction_ is essential reading for all philosophy students and others who encounter or need to apply ideas of probability.