This chapter explores the topic of imprecise probabilities (IP) as it relates to model validation. IP is a family of formal methods that aim to provide a better representation of severe uncertainty than is possible with standard probabilistic methods. Among the methods discussed here are the use of sets of probabilities to represent uncertainty, and the use of functions that do not satisfy the additivity property. We discuss the basics of IP, some examples of IP in computer simulation contexts, possible interpretations of the IP framework, and some conceptual problems for the approach. We conclude with a discussion of IP in the context of model validation.
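The two formal devices just mentioned, sets of probabilities and non-additive functions, can be illustrated with a minimal sketch (the credal set and all numbers below are invented for illustration; this is not code from the chapter):

```python
# Hypothetical credal set: two probability distributions over outcomes {a, b, c}.
credal_set = [
    {"a": 0.2, "b": 0.3, "c": 0.5},
    {"a": 0.4, "b": 0.1, "c": 0.5},
]

def lower(event):
    """Lower probability: the least mass any member of the set gives the event."""
    return min(sum(p[x] for x in event) for p in credal_set)

def upper(event):
    """Upper probability: the greatest mass any member of the set gives the event."""
    return max(sum(p[x] for x in event) for p in credal_set)

# The lower probability is not additive over disjoint events:
# lower({a}) + lower({b}) = 0.3, yet lower({a, b}) = 0.5.
print(lower({"a"}), lower({"b"}), lower({"a", "b"}))
```

Here `lower` plays the role of the non-additive uncertainty measure: it is superadditive rather than additive, which is exactly the failure of the additivity property mentioned above.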
The usage of imprecise probabilities has been advocated in many domains: a number of philosophers have argued that our belief states should be “imprecise” in response to certain sorts of evidence, and imprecise probabilities have been thought to play an important role in disciplines such as artificial intelligence, climate science, and engineering. In this paper I’m interested in the question of whether the usage of imprecise probabilities can be given a practical motivation (a motivation based on practical rather than epistemic, or alethic, concerns). My aim is to challenge the central motivation for using imprecise probabilities in decision-making that has been offered in the literature: the idea that, in at least some contexts, it’s desirable to be ambiguity averse. If I succeed, this will show that we need to reconsider whether there are good reasons to use imprecise probabilities in contexts in which making good decisions is what's of primary concern.
Many have argued that a rational agent's attitude towards a proposition may be better represented by a probability range than by a single number. I show that in such cases an agent will have unstable betting behaviour, and so will behave in an unpredictable way. I use this point to argue against a range of responses to the ‘two bets’ argument for sharp probabilities.
There is a trade-off between specificity and accuracy in existing models of belief. Descriptions of agents in the tripartite model, which recognizes only three doxastic attitudes—belief, disbelief, and suspension of judgment—are typically accurate, but not sufficiently specific. The orthodox Bayesian model, which requires real-valued credences, is perfectly specific, but often inaccurate: we often lack precise credences. I argue, first, that a popular attempt to fix the Bayesian model by using sets of functions is also inaccurate, since it requires us to have interval-valued credences with perfectly precise endpoints. We can see this problem as analogous to the problem of higher-order vagueness. Ultimately, I argue, the only way to avoid these problems is to endorse Insurmountable Unclassifiability. This principle has some surprising and radical consequences. For example, it entails that the trade-off between accuracy and specificity is in-principle unavoidable: sometimes it is simply impossible to characterize an agent’s doxastic state in a way that is both fully accurate and maximally specific. What we can do, however, is improve on both the tripartite and existing Bayesian models. I construct a new model of belief—the minimal model—that allows us to characterize agents with much greater specificity than the tripartite model, and yet which remains, unlike existing Bayesian models, perfectly accurate.
Understanding probabilities as something other than point values has often been motivated by the need to find more realistic models for degree of belief, and in particular the idea that degree of belief should have an objective basis in “statistical knowledge of the world.” I offer here another motivation growing out of efforts to understand how chance evolves as a function of time. If the world is “chancy” in that there are non-trivial, objective, physical probabilities at the macro-level, then the chance of an event e that happens at a given time is less than one until it happens. But whether the chance of e goes to one continuously or not is left open. Discontinuities in such chance trajectories can have surprising and troubling consequences for probabilistic analyses of causation and accounts of how events occur in time. This, coupled with the compelling evidence for quantum discontinuities in chance’s evolution, gives rise to a “continuity bind” with respect to chance probability trajectories. I argue that a viable option for circumventing the continuity bind is to understand the probabilities “imprecisely,” that is, as intervals rather than point values. I then develop and motivate an alternative kind of continuity appropriate for interval-valued chance probability trajectories.
The question of how the probabilistic opinions of different individuals should be aggregated to form a group opinion is controversial. But one assumption seems to be pretty much common ground: for a group of Bayesians, the representation of group opinion should itself be a unique probability distribution (410–414; Bordley, Management Science 28: 1137–1148; Genest et al., The Annals of Statistics: 487–501; Genest and Zidek, Statistical Science: 114–135; Mongin, Journal of Economic Theory 66: 313–351; Clemen and Winkler, Risk Analysis 19: 187–203; Dietrich and List; Herzberg, Theory and Decision: 1–19). We argue that this assumption is not always in order. We show how to extend the canonical mathematical framework for pooling to cover pooling with imprecise probabilities by employing set-valued pooling functions and generalizing common pooling axioms accordingly. As a proof of concept, we then show that one IP construction satisfies a number of central pooling axioms that are not jointly satisfied by any of the standard pooling recipes on pain of triviality. Following Levi (3–11), we also argue that IP models admit of a much better philosophical motivation as a model of rational consensus.
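In the simplest finite case, a set-valued pooling function can be sketched as follows (a toy illustration with invented numbers, not necessarily the construction the paper favours): take the consensus to be the whole family of linear pools rather than one weighted average.

```python
# Two experts' (hypothetical) opinions over the same event space.
p1 = {"rain": 0.2, "dry": 0.8}
p2 = {"rain": 0.6, "dry": 0.4}

def linear_pool(w):
    """Precise linear pool with weight w on expert 1."""
    return {x: w * p1[x] + (1 - w) * p2[x] for x in p1}

# Set-valued pool: every linear pool, here sampled on a grid of weights.
pool_set = [linear_pool(w / 10) for w in range(11)]
lower_rain = min(p["rain"] for p in pool_set)
upper_rain = max(p["rain"] for p in pool_set)
print(lower_rain, upper_rain)  # spans the experts' range: 0.2 to 0.6
```

Returning the whole set, rather than picking one weight, is what lets such a construction preserve unanimity judgments that no single weighted average can preserve without triviality.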
Many philosophers argue that Keynes’s concept of the “weight of arguments” is an important aspect of argument appraisal. The weight of an argument is the quantity of relevant evidence cited in the premises. However, this dimension of argumentation does not have a received method for formalisation. Kyburg has suggested a measure of weight that uses the degree of imprecision in his system of “Evidential Probability” to quantify weight. I develop and defend this approach to measuring weight. I illustrate the usefulness of this measure by employing it to develop an answer to Popper’s Paradox of Ideal Evidence.
Two compelling principles, the Reasonable Range Principle and the Preservation of Irrelevant Evidence Principle, are necessary conditions that any response to peer disagreements ought to abide by. The Reasonable Range Principle maintains that a resolution to a peer disagreement should not fall outside the range of views expressed by the peers in their dispute, whereas the Preservation of Irrelevant Evidence (PIE) Principle maintains that a resolution strategy should be able to preserve unanimous judgments of evidential irrelevance among the peers. No standard Bayesian resolution strategy satisfies the PIE Principle, however, and we give a loss aversion argument in support of PIE and against Bayes. The theory of imprecise probability allows one to satisfy both principles, and we introduce the notion of a set-based credal judgment to frame and address a range of subtle issues that arise in peer disagreements.
Randomized controlled clinical trials play an important role in the development of new medical therapies. There is, however, an ethical issue surrounding the use of randomized treatment allocation when the patient is suffering from a life-threatening condition and requires immediate treatment. Such patients can only benefit from the treatment they actually receive and not from the alternative therapy, even if it ultimately proves to be superior. We discuss a novel way to analyse data from such clinical trials based on the use of the recently developed theory of imprecise probabilities. This work draws an explicit distinction between the related but nevertheless distinct questions of inference and decision in clinical trials. The traditional question of scientific interest asks 'Which treatment offers the greater chance of success?' and is the primary reason for conducting the clinical trial. The question of decision concerns the welfare of the patients in the clinical trial, asking whether the accumulated evidence favours one treatment over the other to such an extent that the next patient should decline randomization and instead express a preference for one treatment. Consideration of the decision question within the framework of imprecise probabilities leads to a mathematical definition of equipoise and a method for governing the randomization protocol of a clinical trial. This paper describes in detail the protocol for the conduct of clinical trials based on this new method of analysis, which is illustrated in a retrospective analysis of data from a clinical trial comparing the anti-emetic drugs ondansetron and droperidol in the treatment of postoperative nausea and vomiting.
The proposed methodology is compared quantitatively, using computer simulation studies, with conventional clinical trial designs and is shown to maintain high statistical power with reduced sample sizes, at the expense of a high type I error rate that we argue is irrelevant in some specific circumstances. Particular emphasis is placed on describing the type of medical conditions and treatment comparisons where the new methodology is expected to provide the greatest benefit.
In his entry on "Quantum Logic and Probability Theory" in the Stanford Encyclopedia of Philosophy, Alexander Wilce (2012) writes that "it is uncontroversial (though remarkable) that the formal apparatus of quantum mechanics reduces neatly to a generalization of classical probability in which the role played by a Boolean algebra of events in the latter is taken over by the 'quantum logic' of projection operators on a Hilbert space." For a long time, Patrick Suppes has opposed this view (see, for example, the papers collected in Suppes and Zanotti (1996)). Instead of changing the logic and moving from a Boolean algebra to a non-Boolean algebra, one can also 'save the phenomena' by weakening the axioms of probability theory and working instead with upper and lower probabilities. However, it is fair to say that despite Suppes' efforts upper and lower probabilities are not particularly popular in physics or in the foundations of physics, at least so far. Instead, quantum logic is booming again, especially since quantum information and computation became hot topics. Interestingly, however, imprecise probabilities are becoming more and more popular in formal epistemology, as recent work by authors such as James Joyce (2010) and Roger White (2010) demonstrates.
An examination of topics involved in statistical reasoning with imprecise probabilities. The book discusses assessment and elicitation, extensions, envelopes and decisions, the importance of imprecision, conditional previsions and coherent statistical models.
This special issue of the International Journal of Approximate Reasoning grew out of the 8th International Symposium on Imprecise Probability: Theories and Applications (ISIPTA). The symposium was organized by the Society for Imprecise Probability: Theories and Applications at the Université de Technologie de Compiègne in July 2013. The biennial ISIPTA meetings are well established among international conferences on generalized methods for uncertainty quantification. The first ISIPTA took place in Gent in 1999, followed by meetings in Cornell, Lugano, Carnegie Mellon, Prague, Durham and Innsbruck. Compiègne proved to be a very nice location for ISIPTA 2013, offering wonderful opportunities for collaborations and discussions, as well as sightseeing places such as its imperial palace.
Those who model doxastic states with a set of probability functions, rather than a single function, face a pressing challenge: can they provide a plausible decision theory compatible with their view? Adam Elga and others claim that they cannot, and that the set of functions model should be rejected for this reason. This paper aims to answer this challenge. The key insight is that the set of functions model can be seen as an instance of the supervaluationist approach to vagueness more generally. We can then generate our decision theory by applying the general supervaluationist semantics to decision-theoretic claims. The result: if an action is permissible according to all functions in the set, it’s determinately permissible; if impermissible according to all, determinately impermissible; and – crucially – if permissible according to some, but not all, it’s indeterminate whether it’s permissible. This proposal handles with ease some difficult cases on which alternative decision theories falter. One reason this view has been overlooked in the literature thus far is that all parties to the debate presuppose that an acceptable decision theory must classify each action as either permissible or impermissible. But I will argue that this thought is misguided. Seeing the set of functions model as an instance of supervaluationism provides a compelling motivation for the claim that there can be indeterminacy in the rationality of some actions.
The field of imprecise probability has matured, in no small part because of Teddy Seidenfeld’s decades of original scholarship and essential contributions to building and sustaining the ISIPTA community. Although the basic idea behind imprecise probability is (at least) 150 years old, a mature mathematical theory has only taken full form in the last 30 years. Interest in imprecise probability during this period has also grown, but many of the ideas that the mature theory serves can be difficult to apprehend for those new to the subject. Although these fundamental ideas are common knowledge in the ISIPTA community, they are expressed, when they are expressed at all, obliquely, over the course of years with students and colleagues. A single essay cannot convey the store of common knowledge from any research community, let alone the ISIPTA community. But this essay nevertheless is an attempt to guide those familiar with the basic Bayesian framework to appreciate some of the elegant and powerful ideas that underpin the contemporary theory of lower previsions, which is the theory that most people associate with the term ‘imprecise probabilities’.
The generalized Bayes’ rule (GBR) can be used to conduct ‘quasi-Bayesian’ analyses when prior beliefs are represented by imprecise probability models. We describe a procedure for deriving coherent imprecise probability models when the event space consists of a finite set of mutually exclusive and exhaustive events. The procedure is based on Walley’s theory of upper and lower prevision and employs simple linear programming models. We then describe how these models can be updated using Cozman’s linear programming formulation of the GBR. Examples are provided to demonstrate how the GBR can be applied in practice. These examples also illustrate the effects of prior imprecision and prior-data conflict on the precision of the posterior probability distribution.
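When the prior credal set is given by finitely many extreme points, the effect of the GBR can be previewed without the linear programs by conditioning each extreme point separately, as in this toy sketch (all numbers hypothetical; the shortcut assumes the conditioning event has positive probability under every prior in the set):

```python
priors = [                            # extreme points of a hypothetical credal set
    {"h1": 0.3, "h2": 0.7},
    {"h1": 0.6, "h2": 0.4},
]
likelihood = {"h1": 0.9, "h2": 0.2}   # made-up P(data | hypothesis)

def posterior(prior):
    """Ordinary Bayes' rule applied to one member of the credal set."""
    joint = {h: prior[h] * likelihood[h] for h in prior}
    z = sum(joint.values())
    return {h: joint[h] / z for h in joint}

posts = [posterior(p) for p in priors]
lower_h1 = min(q["h1"] for q in posts)
upper_h1 = max(q["h1"] for q in posts)
print(round(lower_h1, 3), round(upper_h1, 3))  # 0.659 0.871
```

The data narrow the prior interval for h1 (here from [0.3, 0.6] to roughly [0.659, 0.871]) but do not collapse it to a point: prior imprecision survives into the posterior, which is the kind of behaviour the paper's examples explore.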
Orthodox Bayesian decision theory requires that an agent’s beliefs be representable by a real-valued function, ideally a probability function. Many theorists have argued that this is too restrictive; it can be perfectly reasonable to have indeterminate degrees of belief. On this view, doxastic states are ideally representable by a set of probability functions. One consequence is that the expected value of a gamble will be imprecise. This paper looks at attempts to extend Bayesian decision theory to deal with such cases, and concludes that all proposals advanced thus far have been incoherent. A more modest, but coherent, alternative is proposed. Keywords: imprecise probabilities, Arrow’s theorem.
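For concreteness, here is a sketch of one permissive rule of the kind this literature debates (not the paper's own proposal; the states, acts and numbers are invented): an act is choosable iff it maximizes expected utility under at least one function in the representing set.

```python
probs = [{"s1": 0.3, "s2": 0.7},        # hypothetical set of two probability functions
         {"s1": 0.8, "s2": 0.2}]
utils = {"bet":  {"s1": 10, "s2": -5},  # utility of each act in each state
         "pass": {"s1": 0,  "s2": 0}}

def expected(act, p):
    return sum(p[s] * utils[act][s] for s in p)

def choosable(act):
    """Optimal under at least one member of the set (E-admissibility, finite case)."""
    return any(expected(act, p) >= max(expected(a, p) for a in utils)
               for p in probs)

# The gamble's expected value is imprecise (from about -0.5 to 7 here),
# and both acts come out choosable.
print(sorted(a for a in utils if choosable(a)))  # ['bet', 'pass']
```

That both acts are choosable illustrates the imprecise expected value the abstract mentions: different members of the set rank the acts differently, so no single verdict is forced.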
We study probabilistically informative (weak) versions of transitivity by using suitable definitions of defaults and negated defaults in the setting of coherence and imprecise probabilities. We represent p-consistent sequences of defaults and/or negated defaults by g-coherent imprecise probability assessments on the respective sequences of conditional events. Finally, we present the coherent probability propagation rules for Weak Transitivity and establish the validity of selected inference patterns by proving p-entailment of the associated knowledge bases.
We review de Finetti’s two coherence criteria for determinate probabilities: coherence1, defined in terms of previsions for a set of events that are undominated by the status quo – previsions immune to a sure loss – and coherence2, defined in terms of forecasts for events undominated in Brier score by a rival forecast. We propose a criterion of IP-coherence2 based on a generalization of Brier score for IP-forecasts that uses 1-sided, lower and upper, probability forecasts. However, whereas Brier score is a strictly proper scoring rule for eliciting determinate probabilities, we show that there is no real-valued strictly proper IP-score. Nonetheless, with respect to either of two decision rules – Γ-maximin or E-admissibility + Γ-maximin – we give a lexicographic strictly proper IP-scoring rule that is based on Brier score.
The modus ponens (A -> B, A :. B) is, along with modus tollens and the two logically invalid counterparts denying the antecedent (A -> B, ¬A :. ¬B) and affirming the consequent, the argument form that has been most often investigated in the psychology of human reasoning. The present contribution reports the results of three experiments on the probabilistic versions of modus ponens and denying the antecedent. In probability logic these arguments lead to conclusions with imprecise probabilities. In the modus ponens tasks the participants inferred probabilities that agreed much better with the coherent normative values than in the denying the antecedent tasks, a result that mirrors results found with the classical argument versions. For modus ponens a surprisingly high number of lower and upper probabilities agreed perfectly with the conjugacy property (the upper probabilities are the one-complements of the lower probabilities). When the probabilities of the premises are imprecise the participants do not ignore irrelevant (“silent”) boundary probabilities. The results show that human mental probability logic is close to predictions derived from probability logic for the most elementary argument form, but has considerable difficulties with the more complex forms involving negations.
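The imprecision in the conclusion that these tasks target is easy to exhibit: with P(B|A) = x and P(A) = y, the coherent values of P(B) form the interval [xy, xy + 1 - y], a standard probability-logic result (the example numbers below are made up):

```python
def modus_ponens_bounds(p_b_given_a, p_a):
    """Coherent lower and upper probability for the conclusion B."""
    lower = p_b_given_a * p_a
    upper = lower + 1 - p_a
    return lower, upper

lo, hi = modus_ponens_bounds(0.9, 0.8)
print(round(lo, 2), round(hi, 2))  # 0.72 0.92
```

When P(A) = 1 the interval collapses to the point P(B|A), recovering the classical, precise modus ponens; imprecision in the premises is what opens the interval up.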
Evidentialists say that a necessary condition of sound epistemic reasoning is that our beliefs reflect only our evidence. This thesis arguably conflicts with standard Bayesianism, due to the importance of prior probabilities in the latter. Some evidentialists have responded by modelling belief-states using imprecise probabilities (Joyce 2005). However, Roger White (2010) and Aron Vallinder (2018) argue that this Imprecise Bayesianism is incompatible with evidentialism due to “inertia”, where Imprecise Bayesian agents become stuck in a state of ambivalence towards hypotheses. Additionally, escapes from inertia apparently only create further conflicts with evidentialism. This dilemma gives evidentialist imprecise probabilists a reason to look for alternatives without inertia. I shall argue that Henry E. Kyburg’s approach offers an evidentialist-friendly imprecise probability theory without inertia, and that its relevant anti-inertia features are independently justified. I also connect the traditional epistemological debates concerning the “ethics of belief” more systematically with formal epistemology than has hitherto been done.
This paper focuses on radical pooling, or the question of how to aggregate credences when there is fundamental disagreement about which is the relevant logical space for inquiry. The solution advanced is based on the notion of consensus as common ground, where agents can find it by suspending judgment on logical possibilities. This is exemplified with cases of scientific revolution. On a formal level, the proposal uses algebraic joins and imprecise probabilities, which are shown to be compatible with the principles of marginalization, rigidity, reverse Bayesianism, and minimum divergence commonly endorsed in these contexts. Furthermore, I extend results from previous work to show that pooling sets of imprecise probabilities can satisfy important pooling axioms.
An agent often does not have precise probabilities or utilities to guide resolution of a decision problem. I advance a principle of rationality for making decisions in such cases. To begin, I represent the doxastic and conative state of an agent with a set of pairs of a probability assignment and a utility assignment. Then I support a decision principle that allows any act that maximizes expected utility according to some pair of assignments in the set. Assuming that computation of an option's expected utility uses comprehensive possible outcomes that include the option's risk, no consideration supports a stricter requirement.
Unlike other classical arguments for the existence of God, Pascal’s Wager provides a pragmatic rationale for theistic belief. Its most popular version says that it is rationally mandatory to choose a way of life that seeks to cultivate belief in God because this is the option of maximum expected utility. Despite its initial attractiveness, this long-standing argument has been subject to various criticisms by many philosophers. What is less discussed, however, is the rationality of this choice in situations where the decision-makers are confronted with greater uncertainty. In this paper, I examine the imprecise version of Pascal’s Wager: those scenarios where an agent’s credence that God exists is imprecise or vague rather than precise. After introducing some technical background on imprecise probabilities, I apply five different principles for decision-making to two cases of state uncertainty. In the final part of the paper, I argue that it is not rationally permitted to include zero as the lower probability of God’s existence. Although the conditions for what makes an act uniquely optimal vary significantly across those principles, I also show how the option of wagering for God can defeat any mixed strategy under two distinct interpretations of salvation.
We propose a method for estimating subjective beliefs, viewed as a subjective probability distribution. The key insight is to characterize beliefs as a parameter to be estimated from observed choices in a well-defined experimental task and to estimate that parameter as a random coefficient. The experimental task consists of a series of standard lottery choices, in which the subject is assumed to use conventional risk attitudes to select one lottery or the other, and then a series of betting choices in which the subject is presented with a range of bookies offering odds on the outcome of some event that the subject has a belief over. Knowledge of the risk attitudes of subjects conditions the inferences about subjective beliefs. Maximum simulated likelihood methods are used to estimate a structural model in which subjects employ subjective beliefs to make bets. We present evidence that some subjective probabilities are indeed best characterized as probability distributions with non-zero variance.
Sometimes different partitions of the same space each seem to divide that space into propositions that call for equal epistemic treatment. Famously, equal treatment in the form of equal point-valued credence leads to incoherence. Some have argued that equal treatment in the form of equal interval-valued credence solves the puzzle. This paper shows that, once we rule out intervals with extreme endpoints, this proposal also leads to incoherence.
The received model of degrees of belief represents them as probabilities. Over the last half century, many philosophers have been convinced that this model fails because it cannot make room for the idea that an agent’s degrees of belief should respect the available evidence. In its place they have advocated a model that represents degrees of belief using imprecise probabilities (sets of probability functions). This paper presents a model of degrees of belief based on Dempster–Shafer belief functions and then presents arguments for belief functions over imprecise probabilities as a model of evidence-respecting degrees of belief. The arguments cover three kinds of issue: theoretical virtues (simplicity, interpretability and flexibility); motivations; and problem cases (dilation and belief inertia).
Bayesians often confuse insistence that probability judgment ought to be indeterminate (which is incompatible with Bayesian ideals) with recognition of the presence of imprecision in the determination or measurement of personal probabilities (which is compatible with these ideals). The confusion is discussed and illustrated by remarks in a recent essay by R. C. Jeffrey.
Jim Joyce argues for two amendments to probabilism. The first is the doctrine that credences are rational, or not, in virtue of their accuracy or “closeness to the truth” (1998). The second is a shift from a numerically precise model of belief to an imprecise model represented by a set of probability functions (2010). We argue that both amendments cannot be satisfied simultaneously. To do so, we employ a (slightly generalized) impossibility theorem of Seidenfeld, Schervish, and Kadane (2012), who show that there is no strictly proper scoring rule for imprecise probabilities. The question then is what should give way. Joyce, who is well aware of this no-go result, thinks that a quantifiability constraint on epistemic accuracy should be relaxed to accommodate imprecision. We argue instead that another Joycean assumption, called strict immodesty, should be rejected, and we prove a representation theorem that characterizes all “mildly” immodest measures of inaccuracy.
This paper studies connections between two alternatives to the standard probability calculus for representing and reasoning about uncertainty: imprecise probability and comparative probability. The goal is to identify complete logics for reasoning about uncertainty in a comparative probabilistic language whose semantics is given in terms of imprecise probability. Comparative probability operators are interpreted as quantifying over a set of probability measures. Modal and dynamic operators are added for reasoning about epistemic possibility and updating sets of probability measures.
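The quantifying semantics can be sketched directly: a comparison "A is at least as likely as B" is true at a set of measures just in case it holds under every measure in the set (the toy measures below are invented):

```python
# Hypothetical set of two probability measures over {a, b, c}.
measures = [
    {"a": 0.6, "b": 0.2, "c": 0.2},
    {"a": 0.4, "b": 0.4, "c": 0.2},
]

def at_least_as_likely(A, B):
    """A is at least as likely as B iff that holds under every measure in the set."""
    return all(sum(m[x] for x in A) >= sum(m[x] for x in B) for m in measures)

print(at_least_as_likely({"a"}, {"b"}))       # True
print(at_least_as_likely({"a"}, {"b", "c"}))  # False: the second measure disagrees
print(at_least_as_likely({"b", "c"}, {"a"}))  # False: the first measure disagrees
```

The last two lines show how this semantics makes the comparative order partial: neither event is at least as likely as the other, something no single measure can deliver.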
Uncertainty and vagueness/imprecision are not the same: one can be certain about events described using vague predicates and about imprecisely specified events, just as one can be uncertain about precisely specified events. Exactly because of this, a question arises about how one ought to assign probabilities to imprecisely specified events in the case when no possible available evidence will eradicate the imprecision (because, say, of the limits of accuracy of a measuring device). Modelling imprecision by rough sets over an approximation space presents an especially tractable case to help get one’s bearings. Two solutions present themselves: the first takes as upper and lower probabilities of the event X the (exact) probabilities assigned to X’s upper and lower rough-set approximations; the second, motivated both by formal considerations and by a simple betting argument, is to treat X’s rough-set approximation as a conditional event and assign to it a point-valued (conditional) probability.
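The first solution is straightforward to sketch for a finite universe (the universe, partition and measure below are invented for illustration): the lower and upper probabilities of an imprecisely specified event X are the exact probabilities of its lower and upper rough-set approximations.

```python
universe = {1, 2, 3, 4, 5, 6}
partition = [{1, 2}, {3, 4}, {5, 6}]  # indiscernibility classes of the approximation space
prob = {x: 1 / 6 for x in universe}   # a uniform (exact) measure, for illustration
X = {2, 3, 4}                         # the imprecisely specified event

# Lower approximation: union of classes wholly inside X; upper: classes meeting X.
lower_blocks = [b for b in partition if b <= X]
lower_approx = set().union(*lower_blocks) if lower_blocks else set()
upper_approx = set().union(*(b for b in partition if b & X))

lower_prob = sum(prob[x] for x in lower_approx)
upper_prob = sum(prob[x] for x in upper_approx)
print(lower_approx, upper_approx)                  # {3, 4} and {1, 2, 3, 4}
print(round(lower_prob, 3), round(upper_prob, 3))  # 0.333 0.667
```

Because the measuring device cannot distinguish members of the same class, X is bracketed between what is definitely in it and what might be in it, and the probability interval [1/3, 2/3] simply reflects that bracketing.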
This article considers the extent to which Bayesian networks with imprecise probabilities, which are used in statistics and computer science for predictive purposes, can be used to represent causal structure. It is argued that the adequacy conditions for causal representation in the precise context—the Causal Markov Condition and Minimality—do not readily translate into the imprecise context. Crucial to this argument is the fact that the independence relation between random variables can be understood in several different ways when the joint probability distribution over those variables is imprecise, none of which provides a compelling basis for the causal interpretation of imprecise Bayes nets. I conclude that there are serious limits to the use of imprecise Bayesian networks to represent causal structure.
The purpose of this paper is to show that if one adopts conditional probabilities as the primitive concept of probability, one must deal with the fact that even in very ordinary circumstances at least some probability values may be imprecise, and that some probability questions may fail to have numerically precise answers.
Part 1: Background on de Finetti’s twin criteria of coherence. Coherence1: 2-sided previsions free from dominance through a Book. Coherence2: forecasts free from dominance under Brier (squared-error) score. Part 2: IP theory based on a scoring rule.
This article presents the results of a survey designed to test, with economically sophisticated participants, Ellsberg’s ambiguity aversion hypothesis and Smithson’s conflict aversion hypothesis. Based on an original sample of 78 professional actuaries (all members of the French Institute of Actuaries), this article provides empirical evidence that ambiguity (i.e. uncertainty about the probability) affects insurers’ decisions on pricing insurance. It first reveals that premiums are significantly higher for risks when there is ambiguity regarding the probability of the loss. Second, it shows that insurers are sensitive to sources of ambiguity. The participants indeed charged a higher premium when ambiguity came from conflict and disagreement regarding the probability of the loss than when ambiguity came from imprecision (imprecise forecast about the probability of the loss). This research thus documents the presence of both ambiguity aversion and conflict aversion in the field of insurance, and discusses economic and psychological rationales for the observed behaviours.
Can we extend accuracy-based epistemic utility theory to imprecise credences? There's no obvious way of proceeding: some stipulations will be necessary for either (i) the notion of accuracy or (ii) the epistemic decision rule. With some prima facie plausible stipulations, imprecise credences are always required. With others, they’re always impermissible. Care is needed to reach the familiar evidential view of imprecise credence: that whether precise or imprecise credences are required depends on the character of one's evidence. I propose an epistemic utility theoretic defense of a common view about how evidence places demands on imprecise credence: that your spread of credence should cover the range of chance hypotheses left open by your evidence. I argue that objections to the form of epistemic utility theoretic argument that I use will extend to the standard motivation for epistemically mandatory imprecise credences.
Wilcox proposed an argument against imprecise probabilities and for the principle of indifference based on a thought experiment where he argues that it is very intuitive to feel that one’s confidence in drawing a ball of a given colour out of an unknown urn should decrease as the number of potential colours in the urn increases. In my response to him, I argue that one’s intuitions may be unreliable because it is very hard to truly feel completely ignorant in such a situation. I further argue that Wilcox must also account for the conflicting intuition that it is absurd to have to feel completely convinced that a specific claim about reality is true _in the absence of any evidence_ in order to avoid being irrational. It is dubious that this intuition is considerably less universal and strongly held than Wilcox’s own intuition. Finally, I point out that even if Wilcox’s intuition were universally shared among members of our biological species, it is far from clear that someone refusing to let that intuition dictate his or her beliefs would be irrational. For all these reasons, I believe that Wilcox was not successful in proving that philosophers and scientists representing uncertainty through imprecise probabilities are violating the principles of rationality.
There is currently much discussion about how decision making should proceed when an agent's degrees of belief are imprecise, i.e. represented by a set of probability functions. I show that decision rules recently discussed by Sarah Moss, Susanna Rinard and Rohan Sud all suffer from the same defect: they all struggle to rationalize diachronic ambiguity aversion. Since ambiguity aversion is among the motivations for imprecise credence, this suggests that the search for an adequate imprecise decision rule is not yet over.
It is well known that classical, aka ‘sharp’, Bayesian decision theory, which models belief states as single probability functions, faces a number of serious difficulties with respect to its handling of agnosticism. These difficulties have led to the increasing popularity of so-called ‘imprecise’ models of decision-making, which represent belief states as sets of probability functions. In a recent paper, however, Adam Elga has argued in favour of a putative normative principle of sequential choice that he claims to be borne out by the sharp model but not by any promising incarnation of its imprecise counterpart. After first pointing out that Elga has fallen short of establishing that his principle is indeed uniquely borne out by the sharp model, I cast aspersions on its plausibility. I show that a slight weakening of the principle is satisfied by at least one, but interestingly not all, varieties of the imprecise model and point out that Elga has failed to motivate his stronger commitment.
A number of Bayesians claim that, if one has no evidence relevant to a proposition P, then one's credence in P should be spread over the interval [0, 1]. Against this, I argue: first, that it is inconsistent with plausible claims about comparative levels of confidence; second, that it precludes inductive learning in certain cases. Two motivations for the view are considered and rejected. A discussion of alternatives leads to the conjecture that there is an in-principle limitation on formal representations of belief: they cannot be both fully accurate and maximally specific.