The article is a plea for ethicists to regard probability as one of their most important concerns. It outlines a series of topics of central importance in ethical theory in which probability is implicated, often in a surprisingly deep way, and lists a number of open problems. Topics covered include: interpretations of probability in ethical contexts; the evaluative and normative significance of risk or uncertainty; uses and abuses of expected utility theory; veils of ignorance; Harsanyi’s aggregation theorem; population size problems; equality; fairness; giving priority to the worse off; continuity; incommensurability; nonexpected utility theory; evaluative measurement; aggregation; causal and evidential decision theory; act consequentialism; rule consequentialism; and deontology.
The present chapter describes a probabilistic framework of human reasoning. It is based on probability logic. While there are several approaches to probability logic, we adopt the coherence-based approach.
An unknown process is generating a sequence of symbols, drawn from an alphabet, A. Given an initial segment of the sequence, how can one predict the next symbol? Ray Solomonoff’s theory of inductive reasoning rests on the idea that a useful estimate of a sequence’s true probability of being outputted by the unknown process is provided by its algorithmic probability (its probability of being outputted by a species of probabilistic Turing machine). However, algorithmic probability is a “semimeasure”: i.e., the sum, over all x∈A, of the conditional algorithmic probabilities of the next symbol being x may be less than 1. Prevailing wisdom has it that algorithmic probability must be normalized, to eradicate this semimeasure property, before it can yield acceptable probability estimates. This paper argues, to the contrary, that the semimeasure property contributes substantially to the power and scope of an algorithmic-probability-based theory of induction, and that normalization is unnecessary.
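A minimal formal statement of the semimeasure property described above, writing M(s) for the algorithmic probability of an initial segment s and sx for its extension by symbol x (notation assumed here, not the paper's):

$$\sum_{x \in A} M(sx) \;\le\; M(s), \qquad\text{equivalently}\qquad \sum_{x \in A} M(x \mid s) \;\le\; 1,$$

whereas a proper probability measure would satisfy these with equality.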
Classical (Bayesian) probability (CP) theory has led to an influential research tradition for modeling cognitive processes. Cognitive scientists have been trained to work with CP principles for so long that it is hard even to imagine alternative ways to formalize probabilities. However, in physics, quantum probability (QP) theory has been the dominant probabilistic approach for nearly 100 years. Could QP theory provide us with any advantages in cognitive modeling as well? Note first that both CP and QP theory share the fundamental assumption that it is possible to model cognition on the basis of formal, probabilistic principles. But why consider a QP approach? The answers are that (1) there are many well-established empirical findings (e.g., from the influential Tversky and Kahneman research tradition) that are hard to reconcile with CP principles; and (2) these same findings have natural and straightforward explanations with quantum principles. In QP theory, probabilistic assessment is often strongly context- and order-dependent, individual states can be superposition states (that are impossible to associate with specific values), and composite systems can be entangled (they cannot be decomposed into their subsystems). All these characteristics appear perplexing from a classical perspective. However, our thesis is that they provide a more accurate and powerful account of certain cognitive processes. We first introduce QP theory and illustrate its application with psychological examples. We then review empirical findings that motivate the use of quantum theory in cognitive theory, but also discuss ways in which QP and CP theories converge. Finally, we consider the implications of a QP theory approach to cognition for human rationality.
This paper motivates and develops a novel semantic framework for deontic modals. The framework is designed to shed light on two things: the relationship between deontic modals and substantive theories of practical rationality and the interaction of deontic modals with conditionals, epistemic modals and probability operators. I argue that, in order to model inferential connections between deontic modals and probability operators, we need more structure than is provided by classical intensional theories. In particular, we need probabilistic structure that interacts directly with the compositional semantics of deontic modals. However, I reject theories that provide this probabilistic structure by claiming that the semantics of deontic modals is linked to the Bayesian notion of expectation. I offer a probabilistic premise semantics that explains all the data that create trouble for the rival theories.
This book is meant to be a primer, that is, an introduction, to probability logic, a subject that appears to be in its infancy. Probability logic is a subject envisioned by Hans Reichenbach and largely created by Adams. It treats conditionals as bearers of conditional probabilities and discusses an appropriate sense of validity for arguments with such conditionals, as well as ordinary statements, as premisses. This is a clear, well-written text on the subject of probability logic, suitable for advanced undergraduates or graduates, but also of interest to professional philosophers. There are well-thought-out exercises, and a number of advanced topics are treated in appendices, while some are brought up in exercises and some are alluded to only in footnotes. By this means, it is hoped that the reader will at least be made aware of most of the important ramifications of the subject and its tie-ins with current research, and will have some indications concerning recent and relevant literature.
This is a study in the meaning of natural language probability operators, sentential operators such as probably and likely. We ask what sort of formal structure is required to model the logic and semantics of these operators. Along the way we investigate their deep connections to indicative conditionals and epistemic modals, probe their scalar structure, observe their sensitivity to contextually salient contrasts, and explore some of their scopal idiosyncrasies.
Order of information plays a crucial role in the process of updating beliefs across time. In fact, the presence of order effects makes a classical or Bayesian approach to inference difficult. As a result, the existing models of inference, such as the belief-adjustment model, merely provide an ad hoc explanation for these effects. We postulate a quantum inference model for order effects based on the axiomatic principles of quantum probability theory. The quantum inference model explains order effects by transforming a state vector with different sequences of operators for different orderings of information. We demonstrate this process by fitting the quantum model to data collected in a medical diagnostic task and a jury decision-making task. To further test the quantum inference model, a new jury decision-making experiment is developed. Using the results of this experiment, we compare the quantum inference model with two versions of the belief-adjustment model, the adding model and the averaging model. We show that both the quantum model and the adding model provide good fits to the data. To distinguish the quantum model from the adding model, we develop a new experiment involving extreme evidence. The results from this new experiment suggest that the adding model faces limitations when accounting for tasks involving extreme evidence, whereas the quantum inference model does not. Ultimately, we argue that the quantum model provides a more coherent account of order effects than was possible before.
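As a minimal illustration of how a state vector transformed by non-commuting operators yields order effects, here is a toy two-dimensional sketch (an illustrative example, not the authors' fitted model; the names and parameter values are assumptions):

```python
import numpy as np

# Toy belief state and two non-commuting projectors, one per item of evidence.
psi = np.array([1.0, 0.0])  # normalized initial state vector

def projector(theta):
    """Projector onto the ray at angle theta in the plane."""
    v = np.array([np.cos(theta), np.sin(theta)])
    return np.outer(v, v)

P_a = projector(0.0)        # "accept evidence A" projector
P_b = projector(np.pi / 4)  # "accept evidence B" projector

def prob_both(first, second, state):
    """Probability of accepting both items when presented in this order."""
    return float(np.linalg.norm(second @ first @ state) ** 2)

print(prob_both(P_a, P_b, psi))  # A then B -> 0.5
print(prob_both(P_b, P_a, psi))  # B then A -> 0.25: an order effect
```

Because P_a and P_b do not commute, the two presentation orders assign different probabilities to the same pair of judgments, which is the qualitative signature that the quantum inference model exploits.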
In decision-making under probability discounting (or probability weighting), one multiplies the value of an outcome by one's subjective probability that the outcome will obtain. The broader import of defending probability discounting is to help justify cost-benefit analyses in contexts such as climate change. This chapter defends probability discounting under risk both negatively, against arguments by Simon Caney (2008, 2009), and with a new positive argument. First, in responding to Caney, I argue that small costs and benefits need to be evaluated, and that viewing practices at the social level is too coarse-grained. Second, I argue for probability discounting, using a distinction between causal responsibility and moral responsibility. Moral responsibility can be cashed out in terms of blameworthiness and praiseworthiness, while causal responsibility obtains in full for any effect which is part of a causal chain linked to one's act. With this distinction in hand, moral responsibility, unlike causal responsibility, can be seen as coming in degrees. My argument is that, given that we can limit our deliberation and consideration to that for which we are morally responsible, and that our moral responsibility for outcomes is limited by our subjective probabilities, our subjective probabilities can ground probability discounting.
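Schematically (notation assumed here, not the chapter's), probability discounting evaluates an act a by weighting the value of each possible outcome o by the agent's subjective probability that o will obtain:

$$V(a) \;=\; \sum_{o} p(o \mid a)\, v(o),$$

so an outcome of value v(o) discounted by a subjective probability of 0.01 contributes only 0.01 · v(o) to the evaluation.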
This article outlines a theory of naive probability. According to the theory, individuals who are unfamiliar with the probability calculus can infer the probabilities of events in an extensional way: They construct mental models of what is true in the various possibilities. Each model represents an equiprobable alternative unless individuals have beliefs to the contrary, in which case some models will have higher probabilities than others. The probability of an event depends on the proportion of models in which it occurs. The theory predicts several phenomena of reasoning about absolute probabilities, including typical biases. It correctly predicts certain cognitive illusions in inferences about relative probabilities. It accommodates reasoning based on numerical premises, and it explains how naive reasoners can infer posterior probabilities without relying on Bayes's theorem. Finally, it dispels some common misconceptions of probabilistic reasoning.
Adams’ thesis is generally agreed to be linguistically compelling for simple conditionals with factual antecedent and consequent. We propose a derivation of Adams’ thesis from the Lewis-Kratzer analysis of if-clauses as domain restrictors, applied to probability operators. We argue that Lewis’s triviality result may be seen as a result of inexpressibility of the kind familiar in generalized quantifier theory. Some implications of the Lewis-Kratzer analysis are presented concerning the assignment of probabilities to compounds of conditionals.
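Adams’ thesis, in its standard formulation for a simple conditional with factual antecedent A and consequent B:

$$P(\text{if } A\text{, then } B) \;=\; P(B \mid A), \qquad \text{provided } P(A) > 0.$$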
In this study we investigate the influence of reason-relation readings of indicative conditionals and ‘and’/‘but’/‘therefore’ sentences on various cognitive assessments. According to the Frege-Grice tradition, a dissociation is expected. Specifically, differences in the reason-relation reading of these sentences should affect participants’ evaluations of their acceptability but not of their truth value. In two experiments we tested this assumption by introducing a relevance manipulation into the truth-table task as well as in other tasks assessing the participants’ acceptability and probability evaluations. Across the two experiments a strong dissociation was found. The reason-relation reading of all four sentences strongly affected their probability and acceptability evaluations, but hardly affected their respective truth evaluations. Implications of this result for recent work on indicative conditionals are discussed.
David Wallace has given a decision-theoretic argument for the Born Rule in the context of Everettian quantum mechanics. This approach promises to resolve some long-standing problems with probability in EQM, but it has faced plenty of resistance. One kind of objection charges that the requisite notion of decision-theoretic uncertainty is unavailable in the Everettian picture, so that the argument cannot gain any traction; another kind of objection grants the proof’s applicability and targets the premises. In this article I propose some novel principles connecting the physics of EQM with the metaphysics of modality, and argue that in the resulting framework the incoherence problem does not arise. These principles also help to justify one of the most controversial premises of Wallace’s argument, ‘branching indifference’. Absent any a priori reason to align the metaphysics with the physics in some other way, the proposed principles can be adopted on grounds of theoretical utility. The upshot is that Everettians can, after all, make clear sense of objective probability.
This chapter is a philosophical survey of some leading approaches in formal epistemology in the so-called ‘Bayesian’ tradition. According to them, a rational agent’s degrees of belief—credences—at a time are representable with probability functions. We also canvass various further putative ‘synchronic’ rationality norms on credences. We then consider ‘diachronic’ norms that are thought to constrain how credences should respond to evidence. We discuss some of the main lines of recent debate, and conclude with some prospects for future research.
I describe a realist, ontologically objective interpretation of probability, "far-flung frequency (FFF) mechanistic probability". FFF mechanistic probability is defined in terms of facts about the causal structure of devices and certain sets of frequencies in the actual world. Though defined partly in terms of frequencies, FFF mechanistic probability avoids many drawbacks of well-known frequency theories and helps causally explain stable frequencies, which will usually be close to the values of mechanistic probabilities. I also argue that it's a virtue rather than a failing of FFF mechanistic probability that it does not define single-case chances, and compare some aspects of my interpretation to a recent interpretation proposed by Strevens.
A definition of causation as probability-raising is threatened by two kinds of counterexample: first, when a cause lowers the probability of its effect; and second, when the probability of an effect is raised by a non-cause. In this paper, I present an account that deals successfully with problem cases of both these kinds. In doing so, I also explore some novel implications of incorporating into the metaphysical investigation considerations of causal psychology.
Some have argued that chance and determinism are compatible in order to account for the objectivity of probabilities in theories that are compatible with determinism, like Classical Statistical Mechanics (CSM) and Evolutionary Theory (ET). Others have argued, to the contrary, that chance and determinism are incompatible, and so such probabilities are subjective. In this paper, I argue that both of these positions are unsatisfactory. I argue that the probabilities of theories like CSM and ET are not chances, but also that they are not subjective probabilities either. Rather, they are a third type of probability, which I call counterfactual probability. The main distinguishing feature of counterfactual probability is the role it plays in conveying important counterfactual information in explanations. This distinguishes counterfactual probability from chance as a second concept of objective probability.
(2014). Capturing the relationship between conditionals and conditional probability with a trivalent semantics. Journal of Applied Non-Classical Logics: Vol. 24, Three-Valued Logics and their Applications, pp. 144-152. doi: 10.1080/11663081.2014.911535.
This paper is a response to Tyler Wunder’s ‘The modality of theism and probabilistic natural theology: a tension in Alvin Plantinga's philosophy’ (this journal). In his article, Wunder argues that if the proponent of the Evolutionary Argument Against Naturalism (EAAN) holds theism to be non-contingent and frames the argument in terms of objective probability, then the EAAN is either unsound or theism is necessarily false. I argue that a modest revision of the EAAN renders Wunder’s objection irrelevant, and that this revision actually widens the scope of the argument.
The Bayesian model has been used in psychology as the standard reference for the study of probability revision. In the first part of this paper we show that this traditional choice restricts the scope of the experimental investigation of revision to a stable universe, a situation that is technically known as focusing. We argue that it is essential for a better understanding of human probability revision to consider another situation, called updating (Katsuno & Mendelzon, 1992), in which the universe is evolving. In that case the structure of the universe has definitely been transformed and the revision message conveys information on the resulting universe. The second part of the paper presents four experiments based on the Monty Hall puzzle that aim to show that updating is a natural frame for individuals to revise their beliefs.
A probability distribution is regular if no possible event is assigned probability zero. While some hold that probabilities should always be regular, three counter-arguments have been posed based on examples where, if regularity holds, then perfectly similar events must have different probabilities. Howson (2017) and Benci et al. (2016) have raised technical objections to these symmetry arguments, but we see here that their objections fail. Howson says that Williamson’s (2007) “isomorphic” events are not in fact isomorphic, but Howson is speaking of set-theoretic representations of events in a probability model. While those sets are not isomorphic, Williamson’s physical events are, in the relevant sense. Benci et al. claim that all three arguments rest on a conflation of different models, but they do not. They are founded on the premise that similar events should have the same probability in the same model, or in one case, on the assumption that a single rotation-invariant distribution is possible. Since the symmetry arguments cannot be refuted on such technical grounds, one could deny their implicit premises, which is a heavy cost, or adopt varying degrees of instrumentalism or pluralism about regularity, but that would not serve the project of accurately modelling chances.
We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned probability zero (in other words: the probability functions are regular). We use a non-Archimedean field as the range of the probability function. As a result, the property of countable additivity in Kolmogorov’s axiomatization of probability is replaced by a different type of infinite additivity.
We describe a computational model of two central aspects of people's probabilistic reasoning: descriptive probability estimation and inferential probability judgment. This model assumes that people's reasoning follows standard frequentist probability theory, but it is subject to random noise. This random noise has a regressive effect in descriptive probability estimation, moving probability estimates away from normative probabilities and toward the center of the probability scale. This random noise has an anti-regressive effect in inferential judgment, however. These regressive and anti-regressive effects explain various reliable and systematic biases seen in people's descriptive probability estimation and inferential probability judgment. This model predicts that these contrary effects will tend to cancel out in tasks that involve both descriptive estimation and inferential judgment, leading to unbiased responses in those tasks. We test this model by applying it to one such task, described by Gallistel et al. Participants' median responses in this task were unbiased, agreeing with normative probability theory over the full range of responses. Our model captures the pattern of unbiased responses in this task, while simultaneously explaining systematic biases away from normatively correct probabilities seen in other tasks.
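A minimal sketch of how random noise in frequency counting produces the regressive effect described above (an illustrative assumption about the noise mechanism, not the authors' exact model):

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_estimate(true_p, n_items=100, noise=0.2, n_trials=10_000):
    """Estimate a probability by counting instances in memory, where each
    item is independently misread with probability `noise`."""
    estimates = []
    for _ in range(n_trials):
        flags = rng.random(n_items) < true_p        # which items are instances
        misread = rng.random(n_items) < noise       # noise flips the read-out
        counted = np.where(misread, ~flags, flags)  # flipped where misread
        estimates.append(counted.mean())
    return float(np.mean(estimates))

for p in (0.1, 0.5, 0.9):
    print(p, round(noisy_estimate(p), 3))
# Estimates for extreme p move toward 0.5: the regressive effect.
```

With misread probability d, the expected estimate is p(1 − d) + (1 − p)d, which pulls extreme probabilities toward the midpoint of the scale while leaving 0.5 unchanged.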
The paper provides a new critical perspective on the propensity interpretation of fitness (PIF), by investigating its relationship to the propensity interpretation of probability. Two main conclusions are drawn. First, the claim that fitness is a propensity cannot be understood properly: fitness is not a propensity in the sense prescribed by the propensity interpretation of probability. Second, this interpretation of probability is inessential for explanations proposed by the PIF in evolutionary biology. Consequently, interpreting the probabilistic dimension of fitness in terms of propensities is neither a strong motivation in favor of this interpretation, nor a possible target for substantial criticism.
Rather than entailing that a particular outcome will occur, many scientific theories only entail that an outcome will occur with a certain probability. Because scientific evidence inevitably falls short of conclusive proof, when choosing between different theories it is standard to make reference to how probable the various options are in light of the evidence. A full understanding of probability in science needs to address both the role of probabilities in theories, or chances, and the role of probabilistic judgment in theory choice. In this chapter, the author introduces and distinguishes the two sorts of probability from one another and attempts to offer a satisfactory characterization of how the different uses for probability in science are to be understood. A closing section turns to the question of how views about the chance of some outcome should guide our confidence in that outcome.
We explore ways in which purely qualitative belief change in the AGM tradition throws light on options in the treatment of conditional probability. First, by helping us see why it can be useful to go beyond the ratio rule defining conditional probability from one-place probability. Second, by clarifying what is at stake in different ways of doing that. Third, by suggesting novel forms of conditional probability corresponding to familiar variants of qualitative belief change, and conversely. Likewise, we explain how recent work on the qualitative part of probabilistic inference leads to a very broad class of 'proto-probability' functions.
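The ratio rule referred to above, stated in the usual notation:

$$P(A \mid B) \;=\; \frac{P(A \wedge B)}{P(B)}, \qquad \text{defined only when } P(B) > 0;$$

one standard way of going beyond it is to take conditional probability as primitive rather than as defined from one-place probability in this way.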
We provide a 'verisimilitudinarian' analysis of the well-known Linda paradox or conjunction fallacy, i.e., the fact that most people judge the conjunctive statement "Linda is a bank teller and is active in the feminist movement" (B & F) to be more probable than the isolated statement "Linda is a bank teller" (B), contrary to an uncontroversial principle of probability theory. The basic idea is that experimental participants may judge B & F a better hypothesis about Linda as compared to B because they evaluate B & F as more verisimilar than B. In fact, the hypothesis "feminist bank teller", while less likely to be true than "bank teller", may well be a better approximation to the truth about Linda.
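The uncontroversial principle at issue is the conjunction rule: for any statements B and F,

$$P(B \wedge F) \;\le\; P(B),$$

since every possibility in which Linda is a feminist bank teller is a possibility in which she is a bank teller.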
Iterated conditionals of the form If p, then if q, r are an important topic in philosophical logic. In recent years, psychologists have gained much knowledge about how people understand simple conditionals, but there are virtually no published psychological studies of iterated conditionals. This paper presents experimental evidence from a study comparing the iterated form, If p, then if q, r, with the “imported,” noniterated form, If p and q, then r, using a probability evaluation task and a truth-table task, and taking into account qualitative individual differences. This allows us to critically contrast philosophical and psychological approaches that make diverging predictions regarding the interpretation of these forms. The results strongly support the probabilistic Adams conditional and the “new paradigm” that takes this conditional as a starting point.
There is a plethora of confirmation measures in the literature. Zalabardo considers four such measures: PD, PR, LD, and LR. He argues for LR and against each of PD, PR, and LD. First, he argues that PR is the better of the two probability measures. Next, he argues that LR is the better of the two likelihood measures. Finally, he argues that LR is superior to PR. I set aside LD and focus on the trio of PD, PR, and LR. The question I address is whether Zalabardo succeeds in showing that LR is superior to each of PD and PR. I argue that the answer is negative. I also argue, though, that measures such as PD and PR, on one hand, and measures such as LR, on the other hand, are naturally understood as explications of distinct senses of confirmation.
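On their usual definitions (assumed here, since the abstract does not spell them out), the four measures of the degree to which evidence E confirms hypothesis H are:

$$\mathrm{PD}(H,E) = P(H \mid E) - P(H), \qquad \mathrm{PR}(H,E) = \frac{P(H \mid E)}{P(H)},$$
$$\mathrm{LD}(H,E) = P(E \mid H) - P(E \mid \neg H), \qquad \mathrm{LR}(H,E) = \frac{P(E \mid H)}{P(E \mid \neg H)}.$$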
Quantum mechanics and probability theory share one peculiarity. Both have well-established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: inferential probability, ensemble probability, and propensity. Inferential probability is the basis of inductive logic; ensemble probability deals with the frequencies of events in repeatable experiments; propensity describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes' theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.
We offer a probabilistic model of rational consequence relations (Lehmann and Magidor, 1990) by appealing to the extension of the classical Ramsey-Adams test proposed by Vann McGee in (McGee, 1994). Previous and influential models of nonmonotonic consequence relations have been produced in terms of the dynamics of expectations (Gärdenfors and Makinson, 1994; Gärdenfors, 1993). 'Expectation' is a term of art in these models, which should not be confused with the notion of expected utility. The expectations of an agent are some form of belief weaker than absolute certainty. Our model offers a modified and extended version of an account of qualitative belief in terms of conditional probability, first presented in (van Fraassen, 1995). We use this model to relate probabilistic and qualitative models of non-monotonic relations in terms of expectations. In doing so we propose a probabilistic model of the notion of expectation. We provide characterization results both for logically finite languages and for logically infinite, but countable, languages. The latter case shows the relevance of the axiom of countable additivity for our probability functions. We show that a rational logic defined over a logically infinite language can only be fully characterized in terms of finitely additive conditional probability.
Inductive probability is the logical concept of probability in ordinary language. It is vague but it can be explicated by defining a clear and precise concept that can serve some of the same purposes. This paper presents a general method for doing such an explication and then a particular explication due to Carnap. Common criticisms of Carnap's inductive logic are examined; it is shown that most of them are spurious and the others are not fundamental.
In this article, I present some new group level interpretations of probability, and champion one in particular: a consensus-based variant where group degrees of belief are construed as agreed upon betting quotients rather than shared personal degrees of belief. One notable feature of the account is that it allows us to treat consensus between experts on some matter as being on the union of their relevant background information. In the course of the discussion, I also introduce a novel distinction between intersubjective and interobjective interpretations of probability.
Johannes von Kries’s Spielraum-theory is regarded as one of the most important philosophical contributions of the nineteenth century to an objective interpretation of probability. This paper aims at a critical and contextual analysis of von Kries’s approach: It is contextual insofar as it reconstructs the Spielraum-theory in the historical setting that formed his scientific and philosophical outlook. It is critical insofar as it unfolds systematic tensions and inconsistencies which are rooted in this context, especially in the grave change of mechanism which took place in the late nineteenth century. In this regard, the paper focuses on von Kries’s understanding of natural laws and nomological knowledge in relation to his concept of objective probability. While the formal approach of the Spielraum-theory—as far as developed by von Kries—seems sound, his epistemological claims with respect to nomological knowledge sustain classical mechanism and are hence difficult to substantiate from the point of view of modern science.
The foundations of probability are viewed through the lens of the subjectivist interpretation. This article surveys conditional probability, arguments for probabilism, probability dynamics, and the evidential and subjective interpretations of probability.
We study probabilistically informative (weak) versions of transitivity by using suitable definitions of defaults and negated defaults in the setting of coherence and imprecise probabilities. We represent p-consistent sequences of defaults and/or negated defaults by g-coherent imprecise probability assessments on the respective sequences of conditional events. Moreover, we prove the coherent probability propagation rules for Weak Transitivity and the validity of selected inference patterns by proving p-entailment of the associated knowledge bases. Finally, we apply our results to study selected probabilistic versions of classical categorical syllogisms and construct a new version of the square of opposition in terms of defaults and negated defaults.
In Thinking and Acting, John Pollock offers some criticisms of Bayesian epistemology, and he defends an alternative understanding of the role of probability in epistemology. Here, I defend the Bayesian against some of Pollock's criticisms, and I discuss a potential problem for Pollock's alternative account.
This paper analyzes concepts of independence and assumptions of convexity in the theory of sets of probability distributions. The starting point is Kyburg and Pittarelli’s discussion of “convex Bayesianism” (in particular their proposals concerning E-admissibility, independence, and convexity). The paper offers an organized review of the literature on independence for sets of probability distributions; new results on graphoid properties and on the justification of “strong independence” (using exchangeability) are presented. Finally, the connection between Kyburg and Pittarelli’s results and recent developments on the axiomatization of non-binary preferences, and its impact on “complete” independence, are described.
This paper argues that the technical notion of conditional probability, as given by the ratio analysis, is unsuitable for dealing with our pretheoretical and intuitive understanding of both conditionality and probability. The relevant type of conditionality is found in some well-defined group of conditional statements. As an alternative, therefore, we briefly offer grounds for what we would call an ontological reading of both conditionality and conditional probability in general. This is an ontological account of conditionals that includes an irreducible dispositional connection between the antecedent and consequent conditions and on which the conditional has to be treated as an indivisible whole rather than as compositional. It is not offered as a fully developed theory of conditionality but can be used, we claim, to explain why calculations according to the RATIO scheme do not coincide with our intuitive notion of conditional probability. What it shows us is that for an understanding of the whole range of conditionals we will need what John Heil (2003), in response to Quine (1953), calls an ontological point of view.
Karl Popper discovered in 1938 that the unconditional probability of a conditional of the form ‘If A, then B’ normally exceeds the conditional probability of B given A, provided that ‘If A, then B’ is taken to mean the same as ‘Not (A and not B)’. So it was clear (but presumably only to him at that time) that the conditional probability of B given A cannot be reduced to the unconditional probability of the material conditional ‘If A, then B’. I describe how this insight was developed in Popper’s writings and I add to this historical study a logical one, in which I compare laws of excess in Kolmogorov probability theory with laws of excess in Popper probability theory.
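The excess Popper noted can be verified in one line. Reading ‘If A, then B’ materially as ¬(A ∧ ¬B) and assuming P(A) > 0,

$$P(\neg(A \wedge \neg B)) - P(B \mid A) \;=\; \bigl(1 - P(A)\bigr)\bigl(1 - P(B \mid A)\bigr) \;\ge\; 0,$$

with equality only when P(A) = 1 or P(B | A) = 1; otherwise the unconditional probability of the material conditional strictly exceeds the conditional probability.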
Bayesian confirmation theory is rife with confirmation measures. Zalabardo focuses on the probability difference measure, the probability ratio measure, the likelihood difference measure, and the likelihood ratio measure. He argues that the likelihood ratio measure is adequate, but each of the other three measures is not. He argues for this by setting out three adequacy conditions on confirmation measures and arguing in effect that all of them are met by the likelihood ratio measure but not by any of the other three measures. Glass and McCartney, hereafter “G&M,” accept the conclusion of Zalabardo’s argument along with each of the premises in it. They nonetheless try to improve on Zalabardo’s argument by replacing his third adequacy condition with a weaker condition. They do this because of a worry to the effect that Zalabardo’s third adequacy condition runs counter to the idea behind his first adequacy condition. G&M have in mind confirmation in the sense of increase in probability: the degree to which E confirms H is a matter of the degree to which E increases H’s probability. I call this sense of confirmation “IP.” I set out four ways of precisifying IP. I call them “IP1,” “IP2,” “IP3,” and “IP4.” Each of them is based on the assumption that the degree to which E increases H’s probability is a matter of the distance between P(H | E) and a certain other probability involving H. I then evaluate G&M’s argument in light of them.
Following the pioneering work of Bruno de Finetti, conditional probability spaces (allowing for conditioning on events of measure zero) have been studied since (at least) the 1950s. Perhaps the most salient axiomatizations are Karl Popper's and Alfred Renyi's. Nonstandard probability spaces are a well-known alternative to this approach. Vann McGee proposed a result relating the two approaches by showing that the standard values of infinitesimal probability functions are representable as Popper functions, and that every Popper function is representable in terms of the standard real values of some infinitesimal measure. Our main goal in this article is to study the constraints on (qualitative and probabilistic) change imposed by an extended version of McGee's result. We focus on an extension capable of allowing for iterated changes of view. Such an extension, we argue, seems to be needed in almost all considered applications. Since most of the available axiomatizations stipulate (definitionally) important constraints on iterated change, we propose a non-question-begging framework, Iterative Probability Systems (IPS), and we show that every Popper function can be regarded as a Bayesian IPS. A generalized version of McGee's result is then proved and several of its consequences considered. In particular, we note that our proof requires the imposition of Cumulativity, i.e. the principle that a proposition accepted at any stage of an iterative process of acceptance will continue to be accepted at any later stage. The plausibility and range of applicability of Cumulativity is then studied. In particular, we appeal to a method for defining belief from conditional probability (proposed and subsequently refined in earlier work) in order to characterize the notion of qualitative change induced by Cumulative models of probability kinematics. The resulting cumulative notion is then compared with existing axiomatizations of belief change and probabilistic supposition. We also consider applications in probabilistic accounts of conditionals.
Dutch Book arguments have been presented for static belief systems and for belief change by conditionalization. An argument is given here that a rule for belief change which under certain conditions violates probability kinematics will leave the agent open to a Dutch Book.
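For reference, belief change by probability kinematics (Jeffrey conditionalization) over a partition {E_i} of the possibilities takes the standard form

$$P_{\text{new}}(A) \;=\; \sum_i P_{\text{old}}(A \mid E_i)\, P_{\text{new}}(E_i),$$

and conditionalization is the special case in which some P_new(E_i) = 1; the argument concerns rules that, under certain conditions, depart from this scheme.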
We generalize the Kolmogorov axioms for probability calculus to obtain conditions defining, for any given logic, a class of probability functions relative to that logic, coinciding with the standard probability functions in the special case of classical logic but allowing consideration of other classes of "essentially Kolmogorovian" probability functions relative to other logics. We take a broad view of the Bayesian approach as dictating inter alia that from the perspective of a given logic, rational degrees of belief are those representable by probability functions from the class appropriate to that logic. Classical Bayesianism, which fixes the logic as classical logic, is only one version of this general approach. Another, which we call Intuitionistic Bayesianism, selects intuitionistic logic as the preferred logic and the associated class of probability functions as the right class of candidate representations of epistemic states (rational allocations of degrees of belief). Various objections to classical Bayesianism are, we argue, best met by passing to intuitionistic Bayesianism—in which the probability functions are taken relative to intuitionistic logic—rather than by adopting a radically non-Kolmogorovian, for example, nonadditive, conception of (or substitute for) probability functions, in spite of the popularity of the latter response among those who have raised these objections. The interest of intuitionistic Bayesianism is further enhanced by the availability of a Dutch Book argument justifying the selection of intuitionistic probability functions as guides to rational betting behavior when due consideration is paid to the fact that bets are settled only when/if the outcome bet on becomes known.
We give a direct and elementary proof of the fact that every real-valued probability measure can be approximated—up to an infinitesimal—by a hyperreal-valued one which is regular and defined on the whole powerset of the sample space.
Understanding probabilities as something other than point values has often been motivated by the need to find more realistic models for degree of belief, and in particular the idea that degree of belief should have an objective basis in “statistical knowledge of the world.” I offer here another motivation growing out of efforts to understand how chance evolves as a function of time. If the world is “chancy” in that there are non-trivial, objective, physical probabilities at the macro-level, then the chance of an event e that happens at a given time is less than one until it happens. But whether the chance of e goes to one continuously or not is left open. Discontinuities in such chance trajectories can have surprising and troubling consequences for probabilistic analyses of causation and accounts of how events occur in time. This, coupled with the compelling evidence for quantum discontinuities in chance’s evolution, gives rise to a “continuity bind” with respect to chance probability trajectories. I argue that a viable option for circumventing the continuity bind is to understand the probabilities “imprecisely,” that is, as intervals rather than point values. I then develop and motivate an alternative kind of continuity appropriate for interval-valued chance probability trajectories.
In the following we will investigate whether von Mises’ frequency interpretation of probability can be modified to make it philosophically acceptable. We will reject certain elements of von Mises’ theory, but retain others. In the interpretation we propose we do not use von Mises’ often criticized ‘infinite collectives’, but we retain two essential claims of his interpretation, stating that probability can only be defined for events that can be repeated in similar conditions, and that exhibit frequency stabilization. The central idea of the present article is that the mentioned ‘conditions’ should be well-defined and ‘partitioned’. More precisely, we will divide probabilistic systems into object, initializing, and probing subsystems, and show that such partitioning allows one to solve problems. Moreover we will argue that a key idea of the Copenhagen interpretation of quantum mechanics (the determinant role of the observing system) can be seen as deriving from an analytic definition of probability as frequency. Thus a secondary aim of the article is to illustrate the virtues of analytic definitions of concepts, consisting in making explicit what is implicit.
The representational epistemic approach to the design of visual displays and notation systems advocates encoding the fundamental conceptual structure of a knowledge domain directly in the structure of a representational system. It is claimed that representations so designed will benefit from greater semantic transparency, which enhances comprehension and ease of learning, and plastic generativity, which makes the meaningful manipulation of the representation easier and less error prone. Epistemic principles for encoding fundamental conceptual structures directly in representational schemes are described. The diagrammatic recodification of probability theory is undertaken to demonstrate how the fundamental conceptual structure of a knowledge domain can be analyzed, how the identified conceptual structure may be encoded in a representational system, and the cognitive benefits that follow. An experiment shows the new probability space diagrams are superior to the conventional approach for learning this conceptually challenging topic.
There are many scientific and everyday cases where each of Pr(H1 | E) and Pr(H2 | H1) is high and it seems that Pr(H2 | E) is high. But high probability is not transitive and so it might be in such cases that each of Pr(H1 | E) and Pr(H2 | H1) is high and in fact Pr(H2 | E) is not high. There is no issue in the special case where the following condition, which I call “C1”, holds: H1 entails H2. This condition is sufficient for transitivity in high probability. But many of the scientific and everyday cases referred to above are cases where it is not the case that H1 entails H2. I consider whether there are additional conditions sufficient for transitivity in high probability. I consider three candidate conditions. I call them “C2”, “C3”, and “C2&3”. I argue that C2&3, but neither C2 nor C3, is sufficient for transitivity in high probability. I then set out some further results and relate the discussion to the Bayesian requirement of coherence.
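Why C1 suffices, in one step: if H1 entails H2, then H1 ∧ E entails H2 ∧ E, so for any evidence E with Pr(E) > 0,

$$\Pr(H_2 \mid E) \;\ge\; \Pr(H_1 \mid E);$$

hence whenever Pr(H1 | E) is high, Pr(H2 | E) is at least as high, and transitivity in high probability is guaranteed.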