The present chapter describes a probabilistic framework for human reasoning based on probability logic. While there are several approaches to probability logic, we adopt the coherence-based approach.
We provide a 'verisimilitudinarian' analysis of the well-known Linda paradox or conjunction fallacy, i.e., the fact that most people judge the conjunctive statement "Linda is a bank teller and is active in the feminist movement" (B & F) to be more probable than the isolated statement "Linda is a bank teller" (B), contrary to an uncontroversial principle of probability theory. The basic idea is that experimental participants may judge B & F a better hypothesis about Linda than B because they evaluate B & F as more verisimilar than B. In fact, the hypothesis "feminist bank teller", while less likely to be true than "bank teller", may well be a better approximation to the truth about Linda.
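For reference, the 'uncontroversial principle' at issue is the conjunction rule, which follows in one line from finite additivity (B abbreviates the bank-teller hypothesis, F the feminist hypothesis):

```latex
P(B) \;=\; P(B \wedge F) + P(B \wedge \neg F) \;\ge\; P(B \wedge F),
```

since B is equivalent to the disjunction of the mutually exclusive cases B ∧ F and B ∧ ¬F, and both terms are non-negative; no conjunction can be more probable than either of its conjuncts.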
This dissertation is a contribution to formal and computational philosophy. In the first part, we show that by exploiting the parallels between large, yet finite lotteries on the one hand and countably infinite lotteries on the other, we gain insights into the foundations of probability theory as well as into epistemology. Case 1: Infinite lotteries. We discuss how the concept of a fair finite lottery can best be extended to denumerably infinite lotteries. The solution boils down to the introduction of infinitesimal probability values, which can be achieved using non-standard analysis. Our solution can be generalized to uncountable sample spaces, giving rise to a Non-Archimedean Probability (NAP) theory. Case 2: Large but finite lotteries. We propose application of the language of relative analysis (a type of non-standard analysis) to formulate a new model for rational belief, called Stratified Belief. This contextualist model seems well-suited to deal with a concept of beliefs based on probabilities 'sufficiently close to unity'. The second part presents a case study in social epistemology. We model a group of agents who update their opinions by averaging the opinions of other agents. Our main goal is to calculate the probability for an agent to end up in an inconsistent belief state due to updating. To that end, an analytical expression is given and evaluated numerically, both exactly and using statistical sampling. The probability of ending up in an inconsistent belief state turns out to be always smaller than 2%.
This paper argues that the technical notion of conditional probability, as given by the ratio analysis, is unsuitable for dealing with our pretheoretical and intuitive understanding of both conditionality and probability. We argue instead for an ontological account of conditionals that includes an irreducible dispositional connection between the antecedent and consequent conditions and on which the conditional has to be treated as an indivisible whole rather than as compositional. The relevant type of conditionality is found in some well-defined group of conditional statements. As an alternative, therefore, we briefly offer grounds for what we would call an ontological reading of both conditionality and conditional probability in general. It is not offered as a fully developed theory of conditionality, but it can be used, we claim, to explain why calculations according to the RATIO scheme do not coincide with our intuitive notion of conditional probability. What it shows us is that for an understanding of the whole range of conditionals we will need what John Heil (2003), in response to Quine (1953), calls an ontological point of view.
This is a study in the meaning of natural language probability operators, sentential operators such as probably and likely. We ask what sort of formal structure is required to model the logic and semantics of these operators. Along the way we investigate their deep connections to indicative conditionals and epistemic modals, probe their scalar structure, observe their sensitivity to contextually salient contrasts, and explore some of their scopal idiosyncrasies.
Bayesian decision theory is here construed as explicating a particular concept of rational choice and Bayesian probability is taken to be the concept of probability used in that theory. Bayesian probability is usually identified with the agent's degrees of belief, but that interpretation makes Bayesian decision theory a poor explication of the relevant concept of rational choice. A satisfactory conception of Bayesian decision theory is obtained by taking Bayesian probability to be an explicatum for inductive probability given the agent's evidence.
Some have argued that chance and determinism are compatible in order to account for the objectivity of probabilities in theories that are compatible with determinism, like Classical Statistical Mechanics (CSM) and Evolutionary Theory (ET). Others have argued, to the contrary, that chance and determinism are incompatible, and so such probabilities are subjective. In this paper, I argue that both of these positions are unsatisfactory: the probabilities of theories like CSM and ET are not chances, but they are not subjective probabilities either. Rather, they are a third type of probability, which I call counterfactual probability. The main distinguishing feature of counterfactual probability is the role it plays in conveying important counterfactual information in explanations. This distinguishes counterfactual probability from chance as a second concept of objective probability.
David Wallace has given a decision-theoretic argument for the Born Rule in the context of Everettian quantum mechanics (EQM). This approach promises to resolve some long-standing problems with probability in EQM, but it has faced plenty of resistance. One kind of objection (the 'incoherence problem') charges that the requisite notion of decision-theoretic uncertainty is unavailable in the Everettian picture, so that the argument cannot gain any traction; another kind of objection grants the proof's applicability and targets the premises. In this article I propose some novel principles connecting the physics of EQM with the metaphysics of modality, and argue that in the resulting framework the incoherence problem does not arise. These principles also help to justify one of the most controversial premises of Wallace's argument, 'branching indifference'. Absent any a priori reason to align the metaphysics with the physics in some other way, the proposed principles can be adopted on grounds of theoretical utility. The upshot is that Everettians can, after all, make clear sense of objective probability. 1 Introduction; 2 Setup; 3 Individualism versus Collectivism; 4 The Ingredients of Indexicalism; 5 Indexicalism and Incoherence; 5.1 The trivialization problem; 5.2 The uncertainty problem; 6 Indexicalism and Branching Indifference; 6.1 Introducing branching indifference; 6.2 The pragmatic defence of branching indifference; 6.3 The non-existence defence of branching indifference; 6.4 The indexicalist defence of branching indifference; 7 Conclusion.
We present a model for studying communities of epistemically interacting agents who update their belief states by averaging (in a specified way) the belief states of other agents in the community. The agents in our model have a rich belief state, involving multiple independent issues which are interrelated in such a way that they form a theory of the world. Our main goal is to calculate the probability for an agent to end up in an inconsistent belief state due to updating (in the given way). To that end, an analytical expression is given and evaluated numerically, both exactly and using statistical sampling. It is shown that, under the assumptions of our model, an agent always has a probability of less than 2% of ending up in an inconsistent belief state. Moreover, this probability can be made arbitrarily small by increasing the number of independent issues the agents have to judge or by increasing the group size. A real-world situation to which this model applies is a group of experts participating in a Delphi-study.
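The following toy Monte Carlo sketch is not the authors' model (whose belief states, update rule, and notion of inconsistency are more structured); it only illustrates the basic mechanism by which issue-by-issue averaging followed by rounding can leave an agent in a logically incoherent state. All names and parameter values are illustrative.

```python
import random

# Toy illustration (NOT the authors' exact model): each agent holds a yes/no
# (1/0) opinion on m independent atomic issues and on their conjunction.
# Agents update by averaging everyone's opinions item by item and rounding to
# the nearest 0 or 1 (ties keep the agent's old opinion).  The updated state
# counts as inconsistent when the opinion on the conjunction contradicts the
# opinions on the atoms.

def simulate(n_agents=10, m_issues=3, n_trials=20000, seed=1):
    rng = random.Random(seed)
    inconsistent = 0
    for _ in range(n_trials):
        # random initial opinions: atoms plus the conjunction, kept consistent
        agents = []
        for _ in range(n_agents):
            atoms = [rng.randint(0, 1) for _ in range(m_issues)]
            conj = int(all(atoms))          # start from a consistent state
            agents.append(atoms + [conj])
        # one round of averaging for a single focal agent (agent 0)
        focal = agents[0]
        updated = []
        for k in range(m_issues + 1):
            avg = sum(a[k] for a in agents) / n_agents
            if avg > 0.5:
                updated.append(1)
            elif avg < 0.5:
                updated.append(0)
            else:
                updated.append(focal[k])    # tie: keep the old opinion
        atoms, conj = updated[:m_issues], updated[-1]
        if conj != int(all(atoms)):
            inconsistent += 1
    return inconsistent / n_trials

print("estimated probability of an inconsistent update:", simulate())
```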
A careful analysis of Salmon’s Theoretical Realism and van Fraassen’s Constructive Empiricism shows that both share a common origin: the requirement of literal construal of theories inherited from the Standard View. However, despite this common starting point, Salmon and van Fraassen strongly disagree on the existence of unobservable entities. I argue that their different ontological commitment towards the existence of unobservables traces back to their different views on the interpretation of probability, via different conceptions of induction. In fact, inferences to statements claiming the existence of unobservable entities are inferences to probabilistic statements, whence the crucial importance of the interpretation of probability.
In ‘An Almost Absolute Value in History’ John T. Noonan criticizes several attempts to provide a criterion for when an entity deserves rights. These criteria, he argues, are either arbitrary or lead to absurd consequences. Noonan proposes human conception as the criterion of rights, and justifies it by appeal to the sharp shift in probability, at conception, of becoming a being possessed of human reason. Conception, then, is when abortion becomes immoral. The article has a historical and a philosophical goal. The historical goal is to carefully present the probability argument in a charitable manner. The philosophical goal is to offer a unique criticism of Noonan's probability argument against abortion. I argue that, even on a very charitable reading of Noonan's argument for the conception criterion, this criterion is also susceptible to charges of arbitrariness and absurdity. Noonan's claim that probability shifts have anything to do with the moral rights of fetuses cannot be made coherent. I also show that there are problems with Noonan's assumptions about moral rights and the potential to become a being possessed of human reason.
We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned probability zero (in other words: the probability functions are regular). We use a non-Archimedean field as the range of the probability function. As a result, the property of countable additivity in Kolmogorov’s axiomatization of probability is replaced by a different type of infinite additivity.
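A minimal sketch of the kind of assignment this makes possible (an illustration in the spirit of the approach, not the paper's full construction): consider a fair lottery over the natural numbers, and let α denote an infinite element of the chosen non-Archimedean field, intuitively the 'numerosity' of the ticket set.

```latex
P(\{n\}) = \frac{1}{\alpha} \;>\; 0 \quad \text{for every ticket } n,
\qquad
P(A) = \frac{k}{\alpha} \quad \text{for any finite set } A \text{ of } k \text{ tickets},
```

so every nonempty event receives a strictly positive, if infinitesimal, probability, which is exactly the regularity property mentioned above.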
This paper seeks to defend the following conclusions: The program advanced by Carnap and other necessarians for probability logic has little to recommend it except for one important point. Credal probability judgments ought to be adapted to changes in evidence or states of full belief in a principled manner in conformity with the inquirer’s confirmational commitments, except when the inquirer has good reason to modify his or her confirmational commitment. Probability logic ought to spell out the constraints on rationally coherent confirmational commitments. In the case where credal judgments are numerically determinate, confirmational commitments correspond to Carnap’s credibility functions, mathematically represented by so-called confirmation functions. Serious investigation of the conditions under which confirmational commitments should be changed ought to be a prime target for critical reflection. The necessarians were mistaken in thinking that confirmational commitments are immune to legitimate modification altogether. But their personalist or subjectivist critics went too far in suggesting that we might dispense with confirmational commitments. There is room for serious reflection on conditions under which changes in confirmational commitments may be brought under critical control. Undertaking such reflection need not become embroiled in the anti-inductivism that has characterized the work of Popper, Carnap and Jeffrey and narrowed the focus of students of logical and methodological issues pertaining to inquiry.
Contrary to Bell’s theorem, it is demonstrated that the quantum correlation can be approximated with the use of classical probability theory. Hence, one may not conclude from an experimental violation of the inequality that all local hidden variable theories are ruled out.
Probability plays a crucial role in understanding the relationship between mathematics and physics. It is the point of departure of this brief reflection on that subject, as well as on the place of Poincaré’s thought within the scenario offered by some contemporary perspectives.
Explains how to use a trivalent semantics to account for what is often called Adams' Thesis: the thesis that the probability of a conditional is the conditional probability of the consequent given the antecedent.
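Stated with a toy example (a fair die roll, with the indicative conditional written as an arrow), the thesis reads:

```latex
P(A \rightarrow C) \;=\; P(C \mid A) \;=\; \frac{P(A \wedge C)}{P(A)},
\qquad\text{e.g.}\quad
P(\text{even} \rightarrow \text{greater than }3)
  \;=\; \frac{2/6}{3/6} \;=\; \frac{2}{3}.
```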
First, we discuss basic probability notions from the viewpoint of category theory. Our approach is based on the following four “sine quibus non” conditions: 1. (elementary) category theory is efficient (and suffices); 2. random variables, observables, probability measures, and states are morphisms; 3. classical probability theory and fuzzy probability theory in the sense of S. Gudder and S. Bugajski are special cases of a more general model; 4. a good model allows natural modifications.
Tom Stoneham put forward an argument purporting to show that coherentists are, under certain conditions, committed to the conjunction fallacy. Stoneham considers this argument a reductio ad absurdum of any coherence theory of justification. I argue that Stoneham neglects the distinction between degrees of confirmation and degrees of probability. Once the distinction is in place, it becomes clear that no conjunction fallacy has been committed.
Draft of a paper for the Sinn und Bedeutung 14 conference. Explains how to capture the link between the probability of indicative conditionals and conditional probability using a classical semantics for conditionals. (Note: some introductory material is shared with a twin paper, "Capturing the Relationship Between Conditionals and Conditional Probability with a Trivalent Semantics".)
Order of information plays a crucial role in the process of updating beliefs across time. In fact, the presence of order effects makes a classical or Bayesian approach to inference difficult. As a result, the existing models of inference, such as the belief-adjustment model, merely provide an ad hoc explanation for these effects. We postulate a quantum inference model for order effects based on the axiomatic principles of quantum probability theory. The quantum inference model explains order effects by transforming a state vector with different sequences of operators for different orderings of information. We demonstrate this process by fitting the quantum model to data collected in a medical diagnostic task and a jury decision-making task. To further test the quantum inference model, a new jury decision-making experiment is developed. Using the results of this experiment, we compare the quantum inference model with two versions of the belief-adjustment model, the adding model and the averaging model. We show that both the quantum model and the adding model provide good fits to the data. To distinguish the quantum model from the adding model, we develop a new experiment involving extreme evidence. The results from this new experiment suggest that the adding model faces limitations when accounting for tasks involving extreme evidence, whereas the quantum inference model does not. Ultimately, we argue that the quantum model provides a more coherent account of order effects than was possible before.
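A minimal numerical sketch of the mechanism (not the authors' fitted model): a belief state is a unit vector, each piece of evidence is applied as a projection followed by renormalization (Lüders' rule), and the probability assigned to a hypothesis afterwards depends on the order in which two non-commuting pieces of evidence were processed. The vectors, angles, and labels below are arbitrary illustrations.

```python
import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

def proj(direction):
    # rank-1 projector onto the given direction
    d = normalize(direction)
    return np.outer(d, d)

def update(state, projector):
    # Luders rule: project the state onto the subspace for the observed
    # evidence, then renormalize
    return normalize(projector @ state)

# hypothetical 2-D belief state and three projectors:
# H -- the hypothesis being judged, A and B -- two pieces of evidence
psi = normalize(np.array([1.0, 1.0]))
H = proj(np.array([1.0, 0.0]))
A = proj(np.array([np.cos(0.3), np.sin(0.3)]))
B = proj(np.array([np.cos(1.2), np.sin(1.2)]))

def prob_H_after(sequence, state):
    for evidence in sequence:
        state = update(state, evidence)
    return float(state @ H @ state)   # Born rule: squared projection onto H

print("P(H) after A then B:", prob_H_after([A, B], psi))
print("P(H) after B then A:", prob_H_after([B, A], psi))
```

Because A and B do not commute, the two print statements give different values, which is the order effect the model exploits; with higher-dimensional states and projectors the intermediate states matter as well.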
By supplying propositional calculus with a probability semantics we showed, in our 1996, that finite stochastic problems can be treated by logic-theoretic means equally as well as by the usual set-theoretic ones. In the present paper we continue the investigation to further the use of logical notions in probability theory. It is shown that quantifier logic, when supplied with a probability semantics, is capable of treating stochastic problems involving countably many trials.
The last 20 years or so have seen an intense search carried out within Dempster–Shafer theory, with the aim of finding a generalization of the Shannon entropy for belief functions. In that time, there has also been much progress made in credal set theory, another generalization of the traditional Bayesian epistemic representation, albeit not in this particular area. In credal set theory, sets of probability functions are utilized to represent the epistemic state of rational agents instead of the single probability function of traditional Bayesian theory. The Shannon entropy has been shown to uniquely capture certain highly intuitive properties of uncertainty, and can thus be considered a measure of that quantity. This article presents two measures developed with the purpose of generalizing the Shannon entropy for (1) unordered convex credal sets and (2) possibly non-convex credal sets ordered by second-order probability, thereby providing uncertainty measures for such epistemic representations. There is also a comparison with the results of the measure AU developed within Dempster–Shafer theory in a few instances where unordered convex credal set theory and Dempster–Shafer theory overlap.
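For a concrete sense of what an entropy-style measure over a credal set can look like, here is a toy computation (not the article's two measures): the range of Shannon entropy over a credal set given as the convex hull of a few probability vectors. Because entropy is concave, the lower value is attained at a vertex, while the upper value, in the spirit of maximum-entropy measures such as AU, is approximated here by a grid search. The example distributions are invented.

```python
import numpy as np

def shannon(p):
    # Shannon entropy in bits, ignoring zero entries
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# hypothetical credal set: convex hull of three distributions over 3 outcomes
vertices = np.array([
    [0.7, 0.2, 0.1],
    [0.3, 0.4, 0.3],
    [0.1, 0.1, 0.8],
])

# minimum entropy over the set: exact, attained at a vertex (entropy is concave)
lower = min(shannon(v) for v in vertices)

# maximum entropy over the set: approximate, by searching over mixture weights
upper = 0.0
grid = np.linspace(0, 1, 201)
for w1 in grid:
    for w2 in grid:
        if w1 + w2 <= 1:
            weights = np.array([w1, w2, 1 - w1 - w2])
            upper = max(upper, shannon(weights @ vertices))

print(f"lower entropy ~ {lower:.3f} bits, upper entropy ~ {upper:.3f} bits")
```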
Evolution is a time process. It proceeds in steps of definite length. The probability of each step is relatively high, so self-organization of complex systems will be possible in finite time. A prerequisite for such a process is a selection rule, which certainly exists in evolution. Therefore, it would be wrong to calculate the probability of the formation of a complex system solely on the basis of the number of its components and as a momentary event.
We discuss a generalization of the standard notion of probability space and show that the emerging framework, to be called operational probability theory, can be considered as underlying quantal theories. The proposed framework makes special reference to the convex structure of states and to a family of observables which is wider than the familiar set of random variables: it appears as an alternative to the known algebraic approach to quantum probability.
We offer a probabilistic model of rational consequence relations (Lehmann and Magidor, 1990) by appealing to the extension of the classical Ramsey-Adams test proposed by Vann McGee in (McGee, 1994). Previous and influential models of nonmonotonic consequence relations have been produced in terms of the dynamics of expectations (Gärdenfors and Makinson, 1994; Gärdenfors, 1993). 'Expectation' is a term of art in these models, which should not be confused with the notion of expected utility. The expectations of an agent are some form of belief weaker than absolute certainty. Our model offers a modified and extended version of an account of qualitative belief in terms of conditional probability, first presented in (van Fraassen, 1995). We use this model to relate probabilistic and qualitative models of nonmonotonic relations in terms of expectations. In doing so we propose a probabilistic model of the notion of expectation. We provide characterization results both for logically finite languages and for logically infinite, but countable, languages. The latter case shows the relevance of the axiom of countable additivity for our probability functions. We show that a rational logic defined over a logically infinite language can only be fully characterized in terms of finitely additive conditional probability.
I shall argue that there is no such property of an event as its “probability.” This is why standard interpretations cannot give a sound definition in empirical terms of what “probability” is, and this is why empirical sciences like physics can manage without such a definition. “Probability” is a collective term, the meaning of which varies from context to context: it means different (dimensionless, [0, 1]-valued) physical quantities characterising the different particular situations. In other words, probability is a reducible concept, supervening on physical quantities characterising the state of affairs corresponding to the event in question. On the other hand, however, these “probability-like” physical quantities correspond to objective features of the physical world, and are objectively related to measurable quantities like relative frequencies of physical events based on finite samples, no matter whether the world is objectively deterministic or indeterministic.
The purpose of this paper is to improve on the logical and measure-theoretic foundations for the notion of probability in the law of evidence, which were given in my contributions Åqvist [(1990) Logical analysis of epistemic modality: an explication of the Bolding–Ekelöf degrees of evidential strength. In: Klami HT (ed) Rätt och Sanning (Law and Truth. A symposium on legal proof-theory in Uppsala May 1989). Iustus Förlag, Uppsala, pp 43–54; (1992) Towards a logical theory of legal evidence: semantic analysis of the Bolding–Ekelöf degrees of evidential strength. In: Martino AA (ed) Expert systems in law. Elsevier Science Publishers BV, Amsterdam, North-Holland, pp 67–86]. The present approach agrees with the one adopted in those contributions in taking its main task to be that of providing a semantic analysis, or explication, of the so-called Bolding–Ekelöf degrees of evidential strength (“proof-strength”) as applied to the establishment of matters of fact in law-courts. However, it differs from the one advocated in our earlier work on the subject in explicitly appealing to what is known as “Pro-et-Contra Argumentation”, after the famous Norwegian philosopher Arne Naess. It tries to bring out the logical form of that interesting kind of reasoning, at least in the context of the law of evidence. The formal techniques used here will be seen to be largely inspired by the important work done by Patrick Suppes, notably Suppes [(1957) Introduction to logic. van Nostrand, Princeton and (1972) Finite equal-interval measurement structures. Theoria 38:45–63].
There exist several phenomena that break the classical probability laws. The systems related to such phenomena are context-dependent, so that they are adaptive to other systems. In this paper, we present a new mathematical formalism to compute the joint probability distribution for two event-systems by using concepts of adaptive dynamics and quantum information theory, e.g., quantum channels and liftings. In physics the basic example of such context-dependent phenomena is the famous double-slit experiment. Recently similar examples have been found in biological and psychological sciences. Our approach is an extension of traditional quantum probability theory, and it is general enough to describe the aforementioned contextual phenomena outside of quantum physics.
We generalize the Kolmogorov axioms for probability calculus to obtain conditions defining, for any given logic, a class of probability functions relative to that logic, coinciding with the standard probability functions in the special case of classical logic but allowing consideration of other classes of "essentially Kolmogorovian" probability functions relative to other logics. We take a broad view of the Bayesian approach as dictating inter alia that from the perspective of a given logic, rational degrees of belief are those representable by probability functions from the class appropriate to that logic. Classical Bayesianism, which fixes the logic as classical logic, is only one version of this general approach. Another, which we call Intuitionistic Bayesianism, selects intuitionistic logic as the preferred logic and the associated class of probability functions as the right class of candidate representations of epistemic states (rational allocations of degrees of belief). Various objections to classical Bayesianism are, we argue, best met by passing to intuitionistic Bayesianism, in which the probability functions are taken relative to intuitionistic logic, rather than by adopting a radically non-Kolmogorovian, for example nonadditive, conception of (or substitute for) probability functions, in spite of the popularity of the latter response among those who have raised these objections. The interest of intuitionistic Bayesianism is further enhanced by the availability of a Dutch Book argument justifying the selection of intuitionistic probability functions as guides to rational betting behavior when due consideration is paid to the fact that bets are settled only when/if the outcome bet on becomes known.
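One standard way of writing down such logic-relative axioms, close in spirit to (though not necessarily identical with) the paper's own conditions: for a logic L with consequence relation ⊢_L, call P an L-probability function when

```latex
P(\top) = 1, \qquad P(\bot) = 0, \\
A \vdash_L B \;\Rightarrow\; P(A) \le P(B), \\
P(A) + P(B) \;=\; P(A \wedge B) + P(A \vee B).
```

With L taken as classical logic these conditions single out the usual Kolmogorovian functions; with L taken as intuitionistic logic they define the intuitionistic probability functions mentioned above.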
Some propositions add more information to bodies of propositions than do others. We start with intuitive considerations on qualitative comparisons of information added. Central to these are considerations bearing on conjunctions and on negations. We find that we can discern two distinct, incompatible, notions of information added. From the comparative notions we pass to quantitative measurement of information added. In this we borrow heavily from the literature on quantitative representations of qualitative, comparative conditional probability. We look at two ways to obtain a quantitative conception of information added. One, the most direct, mirrors Bernard Koopman’s construction of conditional probability: by making a strong structural assumption, it leads to a measure that is, transparently, some function of a function P which is, formally, an assignment of conditional probability (in fact, a Popper function). P reverses the information added order and mislocates the natural zero of the scale, so some transformation of this scale is needed, but the derivation of P falls out so readily that no particular transformation suggests itself. The Cox–Good–Aczél method assumes the existence of a quantitative measure matching the qualitative relation, and builds on the structural constraints to obtain a measure of information that can be rescaled as, formally, an assignment of conditional probability. A classical result of Cantor’s, subsequently strengthened by Debreu, goes some way towards justifying the assumption of the existence of a quantitative scale. What the two approaches give us is a pointer towards a novel interpretation of probability as a rescaling of a measure of information added.
An assertion of high conditional probability or, more briefly, an HCP assertion is a statement of the type: the conditional probability of B given A is close to one. The goal of this paper is to construct logics of HCP assertions whose conclusions are highly likely to be correct rather than certain to be correct. Such logics would allow useful conclusions to be drawn when the premises are not strong enough to allow conclusions to be reached with certainty. This goal is achieved by taking Adams' (1966) logic, changing its intended application from conditionals to HCP assertions, and then weakening its criterion for entailment. According to the weakened entailment criterion, called the Criterion of Near Surety and which may be loosely interpreted as a Bayesian criterion, a conclusion is entailed if and only if nearly every model of the premises is a model of the conclusion. The resulting logic, called NSL, is nonmonotonic. Entailment in this logic, although not as strict as entailment in Adams' logic, is more strict than entailment in the propositional logic of material conditionals. Next, NSL was modified by requiring that each HCP assertion be scaled; this means that to each HCP assertion was associated a bound on the deviation from 1 of the conditional probability that is the subject of the assertion. Scaling of HCP assertions is useful for breaking entailment deadlocks. For example, if it is known that the conditional probabilities of C given A and of ¬C given B are both close to one, but the bound on the former's deviation from 1 is much smaller than the latter's, then it may be concluded that in all likelihood the conditional probability of C given A ∧ B is close to one. The resulting logic, called NSL-S, is also nonmonotonic. Despite great differences in their definitions of entailment, entailment in NSL is equivalent to Lehmann and Magidor's rational closure and, disregarding minor differences concerning which premise sets are considered consistent, entailment in NSL-S is equivalent to entailment in Goldszmidt and Pearl's System-Z⁺. Bacchus, Grove, Halpern, and Koller proposed two methods of developing a predicate calculus based on the Criterion of Near Surety. In their random-structures method, which assumed a prior distribution similar to that of NSL, it appears possible to define an entailment relation equivalent to that of NSL. In their random-worlds method, which assumed a prior distribution dramatically different from that of NSL, it is known that the entailment relation is different from that of NSL.
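For orientation, the stricter criterion being weakened, Adams-style probabilistic entailment, is standardly formulated as follows (stated here for unconditional premises for simplicity); the Criterion of Near Surety replaces the quantification over every probability model of the premises with "nearly every" model.

```latex
\Gamma \models C
\quad\Longleftrightarrow\quad
\forall \varepsilon > 0 \;\exists \delta > 0 \;\forall P:\;
\bigl(\,P(\gamma) \ge 1 - \delta \text{ for all } \gamma \in \Gamma\,\bigr)
\;\Rightarrow\; P(C) \ge 1 - \varepsilon .
```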
We introduce the concept of partial event as a pair of disjoint sets, respectively the favorable and the unfavorable cases. Partial events can be seen as a De Morgan algebra with a single fixed point for the complement. We introduce the concept of a measure of partial probability, based on a set of axioms resembling Kolmogoroff’s. Finally we define a concept of conditional probability for partial events and apply this concept to the analysis of the two-slit experiment in quantum mechanics.
In this article, I present some new group-level interpretations of probability, and champion one in particular: a consensus-based variant where group degrees of belief are construed as agreed-upon betting quotients rather than shared personal degrees of belief. One notable feature of the account is that it allows us to treat consensus between experts on some matter as being on the union of their relevant background information. In the course of the discussion, I also introduce a novel distinction between intersubjective and interobjective interpretations of probability.
The Bayesian model has been used in psychology as the standard reference for the study of probability revision. In the first part of this paper we show that this traditional choice restricts the scope of the experimental investigation of revision to a stable universe. This is the case of a situation that, technically, is known as focusing. We argue that it is essential for a better understanding of human probability revision to consider another situation called updating (Katsuno & Mendelzon, 1992), in which the universe is evolving. In that case the structure of the universe has definitely been transformed and the revision message conveys information on the resulting universe. The second part of the paper presents four experiments based on the Monty Hall puzzle that aim to show that updating is a natural frame for individuals to revise their beliefs.
This paper analyzes concepts of independence and assumptions of convexity in the theory of sets of probability distributions. The starting point is Kyburg and Pittarelli’s discussion of “convex Bayesianism” (in particular their proposals concerning E-admissibility, independence, and convexity). The paper offers an organized review of the literature on independence for sets of probability distributions; new results on graphoid properties and on the justification of “strong independence” (using exchangeability) are presented. Finally, the connection between Kyburg and Pittarelli’s results and recent developments on the axiomatization of non-binary preferences, and its impact on “complete” independence, are described.
I describe a realist, ontologically objective interpretation of probability, "far-flung frequency (FFF) mechanistic probability". FFF mechanistic probability is defined in terms of facts about the causal structure of devices and certain sets of frequencies in the actual world. Though defined partly in terms of frequencies, FFF mechanistic probability avoids many drawbacks of well-known frequency theories and helps causally explain stable frequencies, which will usually be close to the values of mechanistic probabilities. I also argue that it's a virtue rather than a failing of FFF mechanistic probability that it does not define single-case chances, and compare some aspects of my interpretation to a recent interpretation proposed by Strevens.
Cumulative Prospect Theory (CPT) does not explain the St. Petersburg Paradox. We show that the solutions related to probability weighting proposed to solve this paradox (Blavatskyy, Management Science 51:677–678, 2005; Rieger and Wang, Economic Theory 28:665–679, 2006) have to cope with limitations. In that framework, CPT fails to accommodate both gambling and insurance behavior. We suggest replacing the weighting functions generally proposed in the literature by another specification which respects the following properties: (1) to solve the paradox, the slope at zero has to be finite; (2) to account for the fourfold pattern of risk attitudes, the probability weighting has to be strong enough.
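A rough sketch of why the slope of the weighting function at zero matters, assuming a power utility u(x) = x^α and a weighting function behaving like w(p) ≈ p^γ near zero (the standard CPT parametrizations); the numerical values mentioned are the usual empirical estimates, not figures taken from this paper.

```latex
% St. Petersburg payoffs 2^n occur with probabilities 2^{-n}, so the ranked
% decision weights behave like
\pi_n \;\approx\; w\!\left(2^{-n}\right) - w\!\left(2^{-(n+1)}\right) \;\sim\; c\,2^{-\gamma n},
% and the CPT value of the gamble behaves like
V \;\approx\; \sum_{n} u(2^{n})\,\pi_n \;\sim\; \sum_{n} 2^{(\alpha-\gamma) n},
% which diverges whenever \alpha \ge \gamma (e.g. \alpha \approx 0.88 with
% \gamma \approx 0.61\text{--}0.69).  A weighting function with finite slope at
% zero behaves like \gamma = 1, so the sum converges for any \alpha < 1.
```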
In this contribution, we focus on probabilistic problems with a denumerably or non-denumerably infinite number of possible outcomes. Kolmogorov (1933) provided an axiomatic basis for probability theory, presented as a part of measure theory, which is a branch of standard analysis or calculus. Since standard analysis does not allow for non-Archimedean quantities (i.e. infinitesimals), we may call Kolmogorov's approach "Archimedean probability theory". We show that allowing non-Archimedean probability values may have considerable epistemological advantages in the infinite case. The current paper focuses on the motivation for our new axiomatization.
Do participants bring their own priors to an experiment? If so, do they share the same priors as the researchers who design the experiment? In this article, we examine the extent to which self-generated priors conform to experimenters’ expectations by explicitly asking participants to indicate their own priors in estimating the probability of a variety of events. We find in Study 1 that despite being instructed to follow a uniform distribution, participants appear to have used their own priors, which deviated from the given instructions. Using subjects’ own priors allows us to account better for their responses rather than merely to test the accuracy of their estimates. Implications for the study of judgment and decision making are discussed.
This approach does not define a probability measure by syntactical structures. It reveals a link between modal logic and mathematical probability theory. This is shown (1) by adding an operator (and two further connectives and constants) to a system of lower predicate calculus and (2) by regarding the models of that extended system. These models are models of the modal system S₅ (without the Barcan formula), where a usual probability measure is defined on their set of possible worlds. Mathematical probability models can be seen as models of S₅.
The prime concern of this paper is with the nature of probability. It is argued that questions concerning the nature of probability are intimately linked to questions about the nature of time. The case study here concerns the single-case propensity interpretation of probability. It is argued that while this interpretation of probability has a natural place in the quantum theory, the metaphysical picture of time to be found in relativity theory is incompatible with such a treatment of probability.
Intergenerational impartiality requires putting the welfare of future generations on a par with that of our own. However, rational choice requires weighting all welfare values by the respective probabilities of realization. As the risk of non-survival of mankind is strictly positive for all time periods and as the probability of non-survival is cumulative, the probability weights operate like discount factors, though they are justified on a completely different and morally defensible ground. Impartial intertemporal welfare maximization is thus acceptable, though the welfare of people in the very far future has lower effects because the probabilities of their existence are also lower. However, the effective discount rate on future welfare values (distinct from monetary values) justified on this ground is likely to be less than 0.1 per annum. Such discounting does not compromise environmental protection and sustainability unduly. The finiteness of our universe implies that the sum of our expected welfare to infinity remains finite, solving the paradox of having to compare different infinite values in optimal growth/conservation theories.
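The basic arithmetic behind the argument, on the simplifying assumption of a constant annual survival probability s < 1 (the symbol s and the constancy assumption are illustrative, not taken from the paper):

```latex
% The probability that period t is reached at all is s^{t}, so impartial
% expected-welfare maximization weights period-t welfare W_t by s^{t}:
\mathbb{E}[W] \;=\; \sum_{t=0}^{\infty} s^{t}\, W_t ,
% which behaves like exponential discounting at the effective rate 1 - s and
% is finite for bounded W_t, since \sum_{t} s^{t} = 1/(1-s).
```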
A non-monotonic theory of probability is put forward and shown to have applicability in the quantum domain. It is obtained simply by replacing Kolmogorov's positivity axiom, which places the lower bound for probabilities at zero, with an axiom that reduces that lower bound to minus one. Kolmogorov's theory of probability is monotonic, meaning that the probability of A is less than or equal to that of B whenever A entails B. The new theory violates monotonicity, as its name suggests; yet, many standard theorems are also theorems of the new theory since Kolmogorov's other axioms are retained. What is of particular interest is that the new theory can accommodate quantum phenomena (photon polarization experiments) while preserving Boolean operations, unlike Kolmogorov's theory. Although non-standard notions of probability have been discussed extensively in the physics literature, they have received very little attention in the philosophical literature. One likely explanation for that difference is that their applicability is typically demonstrated in esoteric settings that involve technical complications. That barrier is effectively removed for non-monotonic probability theory by providing it with a homely setting in the quantum domain. Although the initial steps taken in this paper are quite substantial, there is much else to be done, such as demonstrating the applicability of non-monotonic probability theory to other quantum systems and elaborating the interpretive framework that is provisionally put forward here. Such matters will be developed in other works.
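To see why relaxing the positivity axiom undoes monotonicity (a minimal sketch using only finite additivity, which the abstract says is retained): whenever A entails B, B decomposes into the disjoint events A and B ∧ ¬A, so

```latex
P(B) \;=\; P(A) + P(B \wedge \neg A).
```

Under Kolmogorov's axioms the second term is non-negative, forcing P(B) ≥ P(A); once probabilities as low as −1 are permitted, that term can be negative, and P(B) may fall below P(A) even though A entails B.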
A logic for classical conditional events was investigated by Dubois and Prade. In their approach, the truth value of a conditional event may be undetermined. In this paper we extend the treatment to many-valued events. Then we support the thesis that probability over partially undetermined events is a conditional probability, and we interpret it in terms of bets in the style of de Finetti. Finally, we show that the whole investigation can be carried out in a logical and algebraic setting, and we find a logical characterization of coherence for assessments of partially undetermined events.
We axiomatize the notion of state over finitely generated free NM-algebras, the Lindenbaum algebras of pure Nilpotent Minimum logic. We show that states over the free n-generated NM-algebra exactly correspond to integrals of elements with respect to Borel probability measures.
Following the pioneering work of Bruno de Finetti, conditional probability spaces (allowing for conditioning with events of measure zero) have been studied since (at least) the 1950's. Perhaps the most salient axiomatizations are Karl Popper's and Alfred Rényi's. Nonstandard probability spaces are a well-known alternative to this approach. Vann McGee proposed a result relating both approaches by showing that the standard values of infinitesimal probability functions are representable as Popper functions, and that every Popper function is representable in terms of the standard real values of some infinitesimal measure. Our main goal in this article is to study the constraints on (qualitative and probabilistic) change imposed by an extended version of McGee's result. We focus on an extension capable of allowing for iterated changes of view. Such an extension, we argue, seems to be needed in almost all considered applications. Since most of the available axiomatizations stipulate (definitionally) important constraints on iterated change, we propose a non-question-begging framework, Iterative Probability Systems (IPS), and we show that every Popper function can be regarded as a Bayesian IPS. A generalized version of McGee's result is then proved and several of its consequences considered. In particular we note that our proof requires the imposition of Cumulativity, i.e. the principle that a proposition that is accepted at any stage of an iterative process of acceptance will continue to be accepted at any later stage. The plausibility and range of applicability of Cumulativity is then studied. In particular we appeal to a method for defining belief from conditional probability (first proposed and later slightly modified in earlier work) in order to characterize the notion of qualitative change induced by Cumulative models of probability kinematics. The resulting cumulative notion is then compared with existing axiomatizations of belief change and probabilistic supposition. We also consider applications in the probabilistic accounts of conditionals.
At a time in which probability theory is exerting an unprecedented influence on epistemology and philosophy of science, promising to deliver an exact and unified foundation for the philosophy of rational inference and decision-making, it is worth remembering that the philosophy of religion has long proven to be an extremely fertile ground for the application of probabilistic thinking to traditional epistemological debates. This volume brings together original contributions from twelve contemporary researchers, both established and emerging, to offer a representative sample of the work currently being carried out in this potentially rich field of inquiry. Grouped into five sections, the chapters span a broad range of traditional issues in religious epistemology. The first three sections discuss the evidential impact of various considerations that have often been brought to bear on the question of the existence of God. These include witness reports of the occurrence of miraculous events, the existence of complex biological adaptations, the apparent ‘fine-tuning’ for life of various physical constants and the existence of seemingly unnecessary evil. The fourth section addresses a number of issues raised by Pascal’s famous pragmatic argument for theistic belief. A final section offers probabilistic perspectives on the rationality of faith and the epistemic significance of religious disagreement.
The logical treatment of the nature of religious belief (here I will concentrate on belief in Christianity) has been distorted by the acceptance of a false dilemma. On the one hand, many (e.g., Braithwaite, Hare) have placed the significance of religious belief entirely outside the realm of intellectual cognition. According to this view, religious statements do not express factual propositions: they are not made true or false by the ways things are. Religious belief consists in a certain attitude toward the world, life, or other human beings, or in what sorts of things one values. On the other hand, others (such as Swinburne, 1981, Chapters 1 and 4) have taken religious belief to include (at least) being certain of the truth of particular factual religious propositions. The strength of a person's religious belief is identified with his degree of confidence in the truth of those propositions, measured by the "subjective probability" which those propositions have for that person. I propose a third alternative, according to which, (1) contrary to the first view, religious belief does involve a relation to factual religious propositions, such as that God exists, that Jesus was God and man, etc. (propositions which are made true or false by the way things actually are), but, (2) contrary to the second view, the strength of religious belief is measured, not by the degree of one's confidence in the truth of these propositions, but rather by the way in which the value or desirability to oneself of the various ways the world could be is affected by their including or not including the truth of these religious propositions. Thus, religious belief does consist in what one values or prizes, not in what..
De Finetti's treatise on the theory of probability begins with the provocative statement PROBABILITY DOES NOT EXIST, meaning that probability does not exist in an objective sense. Rather, probability exists only subjectively within the minds of individuals. De Finetti defined subjective probabilities in terms of the rates at which individuals are willing to bet money on events, even though, in principle, such betting rates could depend on state-dependent marginal utility for money as well as on beliefs. Most later authors, from Savage onward, have attempted to disentangle beliefs from values by introducing hypothetical bets whose payoffs are abstract consequences that are assumed to have state-independent utility. In this paper, I argue that de Finetti was right all along: PROBABILITY, considered as a numerical measure of pure belief uncontaminated by attitudes toward money, does not exist. Rather, what exist are de Finetti's 'previsions', or betting rates for money, otherwise known in the literature as 'risk-neutral probabilities'. But the fact that previsions are not measures of pure belief turns out not to be problematic for statistical inference, decision analysis, or economic modeling.