It is well known that classical, aka ‘sharp’, Bayesian decision theory, which models belief states as single probability functions, faces a number of serious difficulties with respect to its handling of agnosticism. These difficulties have led to the increasing popularity of so-called ‘imprecise’ models of decision-making, which represent belief states as sets of probability functions. In a recent paper, however, Adam Elga has argued in favour of a putative normative principle of sequential choice that he claims to be borne out by the sharp model but not by any promising incarnation of its imprecise counterpart. After first pointing out that Elga has fallen short of establishing that his principle is indeed uniquely borne out by the sharp model, I cast aspersions on its plausibility. I show that a slight weakening of the principle is satisfied by at least one, but interestingly not all, varieties of the imprecise model and point out that Elga has failed to motivate his stronger commitment.
After a number of decades of research into the dynamics of rational belief, the belief revision theory community remains split on the appropriate handling of sequences of changes in view, the issue of so-called iterated revision. It has long been suggested that the matter is at least partly settled by facts pertaining to the results of various single revisions of one’s initial state of belief. Recent work has pushed this thesis further, offering various strong principles that ultimately result in a wholesale reduction of iterated to one-shot revision. The present paper offers grounds to hold that these principles should be significantly weakened and that the reductionist thesis should ultimately be rejected. Furthermore, the considerations provided suggest a close connection between the logic of iterated belief change and the logic of evidential relevance.
In a recent article, Douven and Williamson offer both (i) a rebuttal of various recent suggested sufficient conditions for rational acceptability and (ii) an alleged ‘generalization’ of this rebuttal, which, they claim, tells against a much broader class of potential suggestions. However, not only is the result mentioned in (ii) not a generalization of the findings referred to in (i), but, in contrast to the latter, it fails to have the probative force advertised. Their paper does, however, if unwittingly, bring us a step closer to a precise characterization of an important class of rationally unacceptable propositions—the class of lottery propositions for equiprobable lotteries. This helps pave the way to the construction of a genuinely lottery-paradox-proof alternative to the suggestions criticized in (i).
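To see the structure at issue, consider the standard equiprobable case (a textbook presentation, not the authors' own formalization): in a fair lottery with $n$ tickets, exactly one of which wins, let $W_i$ be the proposition that ticket $i$ wins. Then

\[
Pr(\neg W_i) = \frac{n-1}{n} \ \text{ for each } i, \qquad \text{while} \qquad \bigwedge_{i=1}^{n} \neg W_i \ \text{ is inconsistent with } \ \bigvee_{i=1}^{n} W_i .
\]

For any acceptability threshold $t < 1$, choosing $n$ large enough makes each lottery proposition $\neg W_i$ exceed the threshold, even though jointly accepting all of them contradicts the known fact that some ticket wins.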
This article provides a discussion of the principle of transmission of evidential support across entailment from the perspective of belief revision theory in the AGM tradition. After outlining and briefly defending a small number of basic principles of belief change, including several belief contraction analogues of the Darwiche-Pearl postulates for iterated revision, a proposal is then made concerning the connection between evidential beliefs and belief change policies in rational agents. This proposal is found to be sufficient to establish the truth of a much-discussed intuition regarding transmission failure.
Crispin Wright’s discussion of the notion of ‘transmission-failure’ promises to have important philosophical ramifications, both in epistemology and beyond. This paper offers a precise, formal characterisation of the concept within a Bayesian framework. The interpretation given avoids the serious shortcomings of a recent alternative proposal due to Samir Okasha.
As the ongoing literature on the paradoxes of the Lottery and the Preface reminds us, the nature of the relation between probability and rational acceptability remains far from settled. This article provides a novel perspective on the matter by exploiting a recently noted structural parallel with the problem of judgment aggregation. After offering a number of general desiderata on the relation between finite probability models and sets of accepted sentences in a Boolean sentential language, it is noted that a number of these constraints will be satisfied if and only if acceptable sentences are true under all valuations in a distinguished non-empty set W. Drawing inspiration from distance-based aggregation procedures, various scoring-rule-based membership conditions for W are discussed and a possible point of contact with ranking theory is considered. The paper closes with various suggestions for further research.
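Stated schematically (a minimal restatement of the condition just mentioned, with notation not drawn from the paper): given the distinguished non-empty set $W$ of valuations,

\[
\varphi \ \text{is acceptable} \iff v(\varphi) = 1 \ \text{for every} \ v \in W .
\]

Since $W \neq \emptyset$, the resulting set of accepted sentences is automatically consistent and closed under logical consequence.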
I outline four competing probabilistic accounts of contrastive evidential support and consider various considerations that might help arbitrate between these. The upshot of the discussion is that the so-called 'Law of Likelihood' is to be preferred to any of the alternatives considered.
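As standardly formulated in the likelihoodist tradition (the statement is not quoted from the paper itself), the Law of Likelihood holds that evidence $E$ favours hypothesis $H_1$ over hypothesis $H_2$ just in case $E$ is more probable given $H_1$ than given $H_2$:

\[
E \ \text{favours} \ H_1 \ \text{over} \ H_2 \iff Pr(E \mid H_1) > Pr(E \mid H_2).
\]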
It appears to have gone unnoticed in the literature that Pollock's widely endorsed analysis of evidential defeat entails a remarkably strong symmetry principle, according to which, for any three propositions D, E and H, if both E and D provide a reason to believe H, then D is a defeater for E's support for H if and only if, in turn, E is a defeater for D's support for H. After illustrating the counterintuitiveness of this constraint, a simple, more suitable alternative to the Pollockian account is offered.
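In schematic form, the symmetry principle at issue reads as follows (the notation, though not the content, is mine): writing $R(X,H)$ for ‘$X$ provides a reason to believe $H$’ and $Def(X,Y,H)$ for ‘$X$ is a defeater for $Y$'s support for $H$’,

\[
\text{if } R(E,H) \text{ and } R(D,H), \text{ then } Def(D,E,H) \iff Def(E,D,H).
\]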
In a recent pair of publications, Richard Bradley has offered two novel no-go theorems involving the principle of Preservation for conditionals, which guarantees that one’s prior conditional beliefs will exhibit a certain degree of inertia in the face of a change in one’s non-conditional beliefs. We first note that Bradley’s original discussions of these results—in which he finds motivation for rejecting Preservation, first in a principle of Commutativity, then in a doxastic analogue of the rule of modus ponens—are problematic in a significant number of respects. We then turn to a recent U-turn on his part, in which he winds up rescinding his commitment to modus ponens, on the grounds of a tension with the rule of Import-Export for conditionals. Here we offer an important positive contribution to the literature, settling the following crucial question that Bradley leaves unanswered: assuming that one gives up on full-blown modus ponens on the grounds of its incompatibility with Import-Export, what weakened version of the principle should one be settling for instead? Our discussion of the issue turns out to unearth an interesting connection between epistemic undermining and the apparent failures of modus ponens in McGee’s famous counterexamples.
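For reference, the two principles whose tension drives the U-turn can be rendered as follows (standard schematic statements, with $>$ the conditional connective; Bradley's own formulations are doxastic rather than purely logical):

\[
\text{Import-Export:} \quad A > (B > C) \ \dashv\vdash \ (A \wedge B) > C, \qquad \text{Modus Ponens:} \quad A, \ A > B \ \vdash \ B.
\]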
Darwiche and Pearl’s seminal 1997 article outlined a number of baseline principles for a logic of iterated belief revision. These principles, the DP postulates, have been supplemented in a number of alternative ways. Most suggestions have resulted in a form of ‘reductionism’ that identifies belief states with orderings of worlds. However, this position has recently been criticised as being unacceptably strong. Other proposals, such as the popular principle (P), aka ‘Independence’, characteristic of ‘admissible’ operators, remain commendably more modest. In this paper, we supplement the DP postulates and (P) with a number of novel conditions. While the DP postulates constrain the relation between a prior and a posterior conditional belief set, our new principles notably govern the relation between two posterior conditional belief sets obtained from a common prior by different revisions. We show that operators from the resulting family, which subsumes both lexicographic and restrained revision, can be represented as relating belief states associated with a ‘proper ordinal interval’ (POI) assignment, a structure more fine-grained than a simple ordering of worlds. We close the paper by noting that these operators satisfy iterated versions of many AGM era postulates, including Superexpansion, that are not sound for admissible operators in general.
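For orientation, the DP postulates and (P) can be stated as follows (textbook formulations in terms of a belief state $\Psi$, a revision operator $*$ and a belief set $Bel(\Psi)$; they are not quoted from the paper):

\[
\begin{array}{ll}
(\text{C1}) & \text{if } \alpha \models \mu, \text{ then } Bel((\Psi * \mu) * \alpha) = Bel(\Psi * \alpha) \\
(\text{C2}) & \text{if } \alpha \models \neg\mu, \text{ then } Bel((\Psi * \mu) * \alpha) = Bel(\Psi * \alpha) \\
(\text{C3}) & \text{if } \mu \in Bel(\Psi * \alpha), \text{ then } \mu \in Bel((\Psi * \mu) * \alpha) \\
(\text{C4}) & \text{if } \neg\mu \notin Bel(\Psi * \alpha), \text{ then } \neg\mu \notin Bel((\Psi * \mu) * \alpha) \\
(\text{P}) & \text{if } \neg\mu \notin Bel(\Psi * \alpha), \text{ then } \mu \in Bel((\Psi * \mu) * \alpha)
\end{array}
\]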
The field of iterated belief change has focused mainly on revision, with the other main operator of AGM belief change theory, i.e. contraction, receiving relatively little attention. In this paper we extend the Harper Identity from single-step change to define iterated contraction in terms of iterated revision. Specifically, just as the Harper Identity provides a recipe for defining the belief set resulting from contracting A in terms of (i) the initial belief set and (ii) the belief set resulting from revision by ¬A, we look at ways to define the plausibility ordering over worlds resulting from contracting A in terms of (iii) the initial plausibility ordering, and (iv) the plausibility ordering resulting from revision by ¬A. After noting that the most straightforward such extension leads to a trivialisation of the space of permissible orderings, we provide a family of operators for combining plausibility orderings that avoid such a result. These operators are characterised in our domain of interest by a pair of intuitively compelling properties, which turn out to enable the derivation of a number of iterated contraction postulates from postulates for iterated revision. We finish by observing that a salient member of this family allows for the derivation of counterparts for contraction of some well known iterated revision operators, as well as for defining new iterated contraction operators.
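Schematically, with $\div$ for contraction and $*$ for revision, the single-step Harper Identity is

\[
K \div A \ = \ K \cap (K * \neg A),
\]

and the sought extension is an operation $\oplus$ combining the orderings in (iii) and (iv), so that $\preceq_{\div A} \, = \, \preceq \oplus \preceq_{* \neg A}$ (the symbol $\oplus$ is mine, introduced purely for illustration).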
Probability theory promises to deliver an exact and unified foundation for inquiry in epistemology and philosophy of science. But philosophy of religion is also fertile ground for the application of probabilistic thinking. This volume presents original contributions from twelve contemporary researchers, both established and emerging, to offer a representative sample of the work currently being carried out in this potentially rich field of inquiry. Grouped into five parts, the chapters span a broad range of traditional issues in religious epistemology. The first three parts discuss the evidential impact of various considerations that have been brought to bear on the question of the existence of God. These include witness reports of the occurrence of miraculous events, the existence of complex biological adaptations, the apparent 'fine-tuning' for life of various physical constants and the existence of seemingly unnecessary evil. The fourth part addresses a number of issues raised by Pascal's famous pragmatic argument for theistic belief. A final part offers probabilistic perspectives on the rationality of faith and the epistemic significance of religious disagreement.
The traditional Bayesian qualitative account of evidential support (TB) takes assertions of the form 'E evidentially supports H' to affirm the existence of a two-place relation of evidential support between E and H. The analysans given for this relation is $C(H,E) =_{def} Pr(H \mid E) > Pr(H)$. Now it is well known that when a hypothesis H entails evidence E, not only is it the case that C(H,E), but it is also the case that C(H&X,E) for any arbitrary X. There is a widespread feeling that this is a problematic result for TB. Indeed, there are a number of cases in which many feel it is false to assert 'E evidentially supports H&X', despite H entailing E. This is known, by those who share that feeling, as the 'tacking problem' for Bayesian confirmation theory. After outlining a generalization of the problem, I argue that the Bayesian response has so far been unsatisfactory. I then argue the following: (i) There exists, either instead of, or in addition to, a two-place relation of confirmation, a three-place, 'contrastive' relation of confirmation, holding between an item of evidence E and two competing hypotheses H₁ and H₂. (ii) The correct analysans of the relation is a particular probabilistic inequality, abbreviated C(H₁, H₂, E). (iii) Those who take the putative counterexamples to TB just discussed to be genuine counterexamples are interpreting the relevant utterances as implicitly contrastive, contrasting the relevant hypothesis H₁ with a particular competitor H₂. (iv) The probabilistic structure of these cases is such that ∼C(H₁, H₂, E). This solves my generalization of the tacking problem. I then conclude with some thoughts about the relationship between the traditional Bayesian account of evidential support and my proposed account of the three-place relation of confirmation.
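The entailment result mentioned above follows from a short derivation (standard, assuming $0 < Pr(E) < 1$ and $Pr(H \wedge X) > 0$): since $H \models E$, also $H \wedge X \models E$, so

\[
Pr(H \wedge X \mid E) \ = \ \frac{Pr(H \wedge X)}{Pr(E)} \ > \ Pr(H \wedge X),
\]

whence $C(H \wedge X, E)$ for arbitrary X.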
Recent work has considered the problem of extending to the case of iterated belief change the so-called ‘Harper Identity’ (HI), which defines single-shot contraction in terms of single-shot revision. The present paper considers the prospects of providing a similar extension of the Levi Identity (LI), in which the direction of definition runs the other way. We restrict our attention here to the three classic iterated revision operators—natural, restrained and lexicographic—for which we provide the first collective characterisation in the literature, under the appellation of ‘elementary’ operators. We consider two prima facie plausible ways of extending (LI). The first proposal involves the use of the rational closure operator to offer a ‘reductive’ account of iterated revision in terms of iterated contraction. The second, which doesn’t commit to reductionism, was put forward some years ago by Nayak et al. We establish that, for elementary revision operators and under mild assumptions regarding contraction, Nayak’s proposal is equivalent to a new set of postulates formalising the claim that contraction by ¬A should be considered to be a kind of ‘mild’ revision by A. We then show that these, in turn, under slightly weaker assumptions, jointly amount to the conjunction of a pair of constraints on the extension of (HI) that were recently proposed in the literature. Finally, we consider the consequences of endorsing both suggestions and show that this would yield an identification of rational revision with natural revision. We close the paper by discussing the general prospects for defining iterated revision in terms of iterated contraction.
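For reference, the single-shot Levi Identity reads (standard formulation, with $+$ denoting expansion, i.e. closure of $K \cup \{A\}$ under logical consequence):

\[
K * A \ = \ (K \div \neg A) + A .
\]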
In a recent article, David Christensen casts aspersions on a restricted version of van Fraassen's Reflection principle, which he dubs ‘Self-Respect’ (SR). Rejecting two possible arguments for SR, he concludes that the principle does not constitute a requirement of rationality. In this paper we argue not only that Christensen has failed to make a case against the aforementioned arguments, but also that considerations pertaining to Moore's paradox indicate that SR, or at the very least a mild weakening thereof, is indeed a plausible normative principle.
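One natural rendering of the restricted principle (a standard statement of Reflection applied to one's current credences; Christensen's own formulation may differ in detail) is

\[
Pr(H \mid Pr(H) = x) = x,
\]

that is, conditional on one's current credence in $H$ being $x$, one's credence in $H$ should be $x$.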
According to Jamie Whyte, the proper assignment of fulfilment conditions to an agent’s set of desires proceeds in three steps. First, one identifies various desire extinction and behavioural reinforcement conditions to obtain the fulfilment conditions of a certain subset of the agent’s desires. With these fulfilment conditions in hand, one then appeals to a principle connecting desire fulfilment conditions with belief truth conditions to obtain the truth conditions of a number of the agent’s beliefs. Finally, one uses these belief truth conditions to generate, via a third principle, the fulfilment conditions for the remaining desires. There is, however, a very straightforward reason why this strategy cannot yield the required results.