Betting methods, of which de Finetti's Dutch Book is by far the best known, are uncertainty-modelling devices that accomplish a twofold aim. Whilst providing an interpretation of the relevant measure of uncertainty, they also provide a formal definition of coherence. The main purpose of this paper is to put forward a betting method for belief functions on MV-algebras of many-valued events which allows us to isolate the corresponding coherence criterion, which we term coherence in the aggregate. Our framework generalises the classical Dutch Book method.
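The Dutch Book mechanism this abstract builds on can be illustrated with a minimal sketch (toy credences and bet construction are mine, not the paper's): an agent whose credences violate finite additivity can be sold a set of bets that loses money in every possible outcome.

```python
# Three mutually exclusive, jointly exhaustive outcomes: A, B, or neither.
outcomes = ["A", "B", "neither"]

# Incoherent credences: A and B are disjoint, yet cr(A or B) != cr(A) + cr(B).
cr = {"A": 0.5, "B": 0.4, "A_or_B": 0.7}

def payoff(event, outcome):
    """A $1 bet on `event` pays 1 if the event occurs in `outcome`, else 0."""
    if event == "A_or_B":
        return 1 if outcome in ("A", "B") else 0
    return 1 if outcome == event else 0

# The bookie sells the agent $1 bets on A and on B (agent pays cr(A) + cr(B)),
# and buys from the agent a $1 bet on A-or-B (agent receives cr(A_or_B)).
def agent_net(outcome):
    stakes = -cr["A"] - cr["B"] + cr["A_or_B"]          # money changing hands now
    wins = payoff("A", outcome) + payoff("B", outcome)  # agent's winnings
    owes = payoff("A_or_B", outcome)                    # agent's liability
    return stakes + wins - owes

for o in outcomes:
    print(o, round(agent_net(o), 2))   # -0.2 in every outcome: a sure loss
```

Since the winnings and the liability cancel exactly in every outcome, the agent's loss is just the additivity gap, which is the standard sense in which such credences are incoherent.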
Given a belief function on the set of all subsets of prizes, how should its values be understood as a decision alternative? This paper presents and characterizes an induced-measure interpretation of belief functions.
The Dempster–Shafer approach to expressing belief about a parameter in a statistical model is not consistent with the likelihood principle. This inconsistency has been recognized for some time, and manifests itself as a non-commutativity, in which the order of operations (combining belief, combining likelihood) makes a difference. It is proposed here that requiring the expression of belief to be committed to the model (and to certain of its submodels) makes likelihood inference very nearly a special case of the Dempster–Shafer theory.
Using probability functions defined over a simple language as models of states of belief, my goal in this article has been to analyse contractions and revisions of beliefs. My first strategy was to formulate postulates for these processes. Close parallels between the postulates for contractions and the postulates for revisions have been established - the results in Section 5 show that contractions and revisions are interchangeable. As a second strategy, some suggestions for more or less explicit constructive definitions of the revision process (and indirectly also of the contraction process) were then presented. However, the results in Section 6 are less conclusive than those in the earlier sections. This problem area still awaits further development.
In this dissertation, I explore whether teleological, normative, and functional theories of belief each have the resources to answer three central questions about the nature and normativity of belief. These questions are: (i) what are beliefs, (ii) why do we have them, and (iii) how should we interpret doxastic correctness--the principle that it is correct to believe that p if and only if p? I argue that teleological and normative theories fail to adequately address these questions, and I develop and defend a functional alternative. In addition, I attempt to extend my functional theory of belief to account for another, related attitude: suspended belief.
This paper uses Popper's treatment of probability and an epistemic constraint on probability assignments to conditionals to extend the Bayesian representation of rational belief so that revision of previously accepted evidence is allowed for. Results of this extension include an epistemic semantics for Lewis' theory of counterfactual conditionals and a representation for one kind of conceptual change.
This paper presents an approach to the belief system based on a computational framework in three levels: first, the logic level with the definition of binary local rules; second, the arithmetic level with the definition of recursive functions; and finally the behavioural level with the definition of a recursive construction pattern. Social communication is achieved when different beliefs are expressed, modified, propagated and shared through social nets. This approach is useful for mimicking the belief system because the defined functions provide different ways to process the same incoming information as well as a means to propagate it. Our model also provides a means to cross different beliefs, so any incoming information can be processed many times by the same or different functions, as occurs in social nets.
This paper is concerned with representations of belief by means of nonadditive probabilities of the Dempster-Shafer (DS) type. After surveying some foundational issues and results in the DS theory, including Suppes's related contributions, the paper proceeds to analyze the connection of the DS theory with some of the work currently pursued in epistemic logic. A preliminary investigation of the modal logic of belief functions à la Shafer is made. There it is shown that the Alchourrón-Gärdenfors-Makinson (AGM) logic of belief change is closely related to the DS theory. The final section compares the critique of Bayesianism which underlies the present paper with some important objections raised by Suppes against this doctrine.
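The DS-style nonadditive representations discussed here can be sketched concretely (toy frame and mass assignment are mine): a mass function m over subsets of a frame Θ induces Bel(A) = Σ over B ⊆ A of m(B) and Pl(A) = Σ over B with B ∩ A ≠ ∅ of m(B).

```python
# Frame of discernment Θ and a basic mass assignment: masses on focal sets
# must sum to 1, with m(∅) = 0.
theta = frozenset({"red", "green", "blue"})
m = {
    frozenset({"red"}): 0.4,
    frozenset({"red", "green"}): 0.3,
    theta: 0.3,  # mass on the whole frame represents total ignorance
}

def bel(A):
    """Belief: total mass committed to subsets of A."""
    return sum(mass for B, mass in m.items() if B <= A)

def pl(A):
    """Plausibility: total mass not committed against A."""
    return sum(mass for B, mass in m.items() if B & A)

A = frozenset({"red", "green"})
print(round(bel(A), 2), round(pl(A), 2))   # 0.7 1.0
```

Note the nonadditivity: Bel(A) + Bel of A's complement is 0.7 + 0 here, strictly less than 1, which is exactly what separates belief functions from one-place probabilities.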
We generalize the concept of Nash equilibrium in mixed strategies for strategic form games to allow for ambiguity in the players' expectations. In contrast to other contributions, we model ambiguity by means of so-called lower probability measures or belief functions, which makes it possible to distinguish between a player's assessment of ambiguity and his attitude towards ambiguity. We also generalize the concept of trembling hand perfect equilibrium. Finally, we demonstrate that for certain attitudes towards ambiguity it is possible to explain cooperation in the one-shot Prisoner's Dilemma in a way that is in accordance with some recent experimental findings.
Although deductive reasoning is a closed system, one's beliefs about the world can influence validity judgements. To understand the associated functional neuroanatomy of this belief-bias we studied 14 volunteers using event-related fMRI, as they performed reasoning tasks under neutral, facilitatory and inhibitory belief conditions. We found evidence for the engagement of a left temporal lobe system during belief-based reasoning and a bilateral parietal lobe system during belief-neutral reasoning. Activation of right lateral prefrontal cortex was evident when subjects inhibited a prepotent response associated with belief-bias and correctly completed a logical task, a finding consistent with its putative role in cognitive monitoring. By contrast, when logical reasoning was overcome by belief-bias, there was engagement of ventral medial prefrontal cortex, a region implicated in affective processing. This latter involvement suggests that belief-bias effects in reasoning may be mediated through an influence of emotional processes on reasoning.
We investigate the discrete (finite) case of the Popper–Renyi theory of conditional probability, introducing discrete conditional probabilistic models for knowledge and conditional belief, and comparing them with the more standard plausibility models. We also consider a related notion, that of safe belief, which is a weak (non-negatively introspective) type of “knowledge”. We develop a probabilistic version of this concept (“degree of safety”) and we analyze its role in games. We completely axiomatize the logic of conditional belief, knowledge and safe belief over conditional probabilistic models. We develop a theory of probabilistic dynamic belief revision, introducing probabilistic “action models” and proposing a notion of probabilistic update product, that comes together with appropriate reduction laws.
This paper presents a uniform semantic treatment of nonmonotonic inference operations that allow for inferences from infinite sets of premises. The semantics is formulated in terms of selection functions and is a generalization of the preferential semantics of Shoham (1987), (1988), Kraus, Lehmann, and Magidor (1990) and Makinson (1989), (1993). A selection function picks out from a given set of possible states (worlds, situations, models) a subset consisting of those states that are, in some sense, the most preferred ones. A proposition α is a nonmonotonic consequence of a set of propositions Γ iff α holds in all the most preferred Γ-states. In the literature on revealed preference theory, there are a number of well-known theorems concerning the representability of selection functions, satisfying certain properties, in terms of underlying preference relations. Such theorems are utilized here to give corresponding representation theorems for nonmonotonic inference operations. At the end of the paper, the connection between nonmonotonic inference and belief revision, in the sense of Alchourrón, Gärdenfors, and Makinson, is explored. In this connection, infinitary belief revision operations that allow for the revision of a theory with a possibly infinite set of propositions are introduced and characterized axiomatically.
Human beings seem to capture time and the temporal properties of events and things in thought by having beliefs usually expressed with statements using tense, or notions such as ‘now’, ‘past’ or ‘future’. Tensed beliefs like these seem indispensable for correct reasoning and timely action. For instance, my belief that my root canal is over seems inexpressible with a statement that does not use tense or a temporal indexical. However, the dominant view on the nature of time is that it forms, with space, a four-dimensional continuum where time does not encompass private perspectives or an absolute, fixed present. This ‘tenseless’ theory of time encounters a challenge in integrating tensed belief, because it cannot easily explain what constitutes a tensed belief in a tenseless world and how such a belief works inside our cognitive network to bring about the actions it does. Providing such an account is the main goal of this dissertation. I argue here that the correct way to proceed would be to utilize philosophical theories dealing with indexicality, as the puzzling features of tensed belief are shared with beliefs expressed by first-person or spatial indexicals. In chapters II and III I expand on the dominant theories about indexicality (Lewis, Perry, Kaplan) and apply them to tensed belief. I show that each is in certain respects incomplete or inadequate. My preferred account critiques the preceding theories as mis-attributing the indexicality involved to a fully conceptual element in the way people think about time. I argue that we should instead connect tensed belief to not fully conceptual elements of thought. For support I turn to work in perceptual psychology that connects beliefs about space to perceptions of spatial features. In chapter IV I develop an analogous argument about temporal thought and discuss how mental representations involved in perceptions are constitutively related to the formation and preservation of tensed beliefs.
Combining this story with a tenseless theory of time should give us a complete, metaphysically uncontroversial, account of the way a tensed belief functions in reasoning and produces timely action.
Two justifications of backward induction (BI) in generic perfect information games are formulated using Bonanno's (1992; Theory and Decision 33, 153) belief systems. The first justification concerns the BI strategy profile and is based on selecting a set of rational belief systems from which players have to choose their belief functions. The second justification concerns the BI path of play and is based on a sequential deletion of nodes that are inconsistent with the choice of rational belief functions.
I hold that epistemic warrant consists in the normal functioning of the belief-forming process when the process has forming true beliefs reliably as an etiological function. Evolution by natural selection is the central source of etiological functions. This leads many to think that on my view warrant requires a history of natural selection. What then about learning? What then about Swampman? Though functions require history, natural selection is not the only source. Self-repair and trial-and-error learning are both sources. Warrant requires history, but not necessarily that much.
We explore ways in which purely qualitative belief change in the AGM tradition throws light on options in the treatment of conditional probability. First, by helping us see why it can be useful to go beyond the ratio rule defining conditional from one-place probability. Second, by clarifying what is at stake in different ways of doing that. Third, by suggesting novel forms of conditional probability corresponding to familiar variants of qualitative belief change, and conversely. Likewise, we explain how recent work on the qualitative part of probabilistic inference leads to a very broad class of 'proto-probability' functions.
The 1985 paper by Carlos Alchourrón (1931–1996), Peter Gärdenfors, and David Makinson (AGM), "On the Logic of Theory Change: Partial Meet Contraction and Revision Functions" was the starting-point of a large and rapidly growing literature that employs formal models in the investigation of changes in belief states and databases. In this review, the first twenty-five years of this development are summarized. The topics covered include equivalent characterizations of AGM operations, extended representations of the belief states, change operators not included in the original framework, iterated change, applications of the model, its connections with other formal frameworks, computability of AGM operations, and criticism of the model.
In this paper we compare Leitgeb’s stability theory of belief and Spohn’s ranking-theoretic account of belief. We discuss the two theories as solutions to the lottery paradox. To compare the two theories, we introduce a novel translation between ranking functions and probability functions. We draw some crucial consequences from this translation, in particular a new probabilistic belief notion. Based on this, we explore the logical relation between the two belief theories, showing that models of Leitgeb’s theory correspond to certain models of Spohn’s theory. The reverse is not true. Finally, we discuss how these results raise new questions in belief theory. In particular, we raise the question whether stability is rightly thought of as a property pertaining to belief.
In this paper I discuss the foundations of a formal theory of coherent and conservative belief change that is suitable to be used as a method for constructing iterated changes of belief, sensitive to the history of earlier belief changes, and independent of any form of dispositional coherence. I review various ways to conceive the relationship between the beliefs actually held by an agent and her belief change strategies, show the problems they suffer from, and suggest that belief states should be represented by unary revision functions that take sequences of inputs. Three concepts of coherence implicit in current theories of belief change are distinguished: synchronic, diachronic and dispositional coherence. Diachronic coherence is essentially identified with what is known as conservatism in epistemology. The present paper elaborates on the philosophical motivation of the general framework; formal details and results are provided in a companion paper.
This paper studies the idea of conservatism with respect to belief change strategies in the setting of unary, iterated belief revision functions (based on the conclusions of Rott, ‘Coherence and Conservatism in the Dynamics of Belief, Part I: Finding the Right Framework’, Erkenntnis 50, 1999, 387–412). Special attention is paid to the case of ‘basic belief change’ where neither the (weak) AGM postulates concerning conservatism with respect to beliefs nor the (strong) supplementary AGM postulates concerning dispositional coherence need to be satisfied. One‐step belief change generated by ‘basic entrenchment’ is combined with a natural conservative method of revising entrenchment relations. A logical characterization of this method is presented, and it is compared with three other methods known from the literature which I call ‘external’, ‘radical’ and ‘moderate’ belief revision. While conservative belief change turns out to be incoherent in its treatment of the recency of information, moderate belief change is more satisfactory in this respect.
A person who remembers having done something has a belief that she did it from having done it. To have a belief that one did something from having done it is to believe that one did the action on the (causal) basis of having done it, where this belief (in order for one to have it) need not be (causally) based even in part on any contributor to the belief other than doing the action. The notion of a contributor to a belief (as opposed to a mere facilitating cause of the belief) is explicated through a series of examples. The account of having a belief that one did something from having done it is then deployed in criticising Ginet's account of ‘memory connection’, in assessing Martin and Deutscher's causal theory of remembering, in indicating how diachronic justification functions in a nontraditional theory of memory, and in setting forth one type of psychological connectedness which, according to advocates of a psychological continuity theory of personal identity, may be employed (noncircularly) in formulating the theory, and which, according to opponents of the theory, provides a target for criticising the theory.
I develop a strategy for representing epistemic states and epistemic changes that seeks to be sensitive to the difference between voluntary and involuntary aspects of our epistemic life, as well as to the role of pragmatic factors in epistemology. The model relies on a particular understanding of the distinction between full belief and acceptance, which makes room for the idea that our reasoning on both practical and theoretical matters typically proceeds in a contextual way. Within this framework, I discuss how agents can rationally shift their credal probability functions so as to consciously modify some of their contextual acceptances; the present account also allows us to represent how the very set of contexts evolves. Voluntary credal shifts, in turn, might provoke changes in the agent’s beliefs, but I show that this is actually a side effect of performing multiple adjustments in the total lot of the agent’s acceptance sets. In this way we obtain a model that preserves many pre-theoretical intuitions about what counts as adequate rationality constraints on our actual practices—and hence about what counts as an adequate, normative epistemological perspective.
There have been attempts to get some logic out of belief dynamics, i.e. attempts to define the constants of propositional logic in terms of functions from sets of beliefs to sets of beliefs. It is interesting to see whether something similar can be done for ontological categories, i.e. ontological constants. The theory presented here will be a (modest) expansion of belief dynamics: it will not only incorporate beliefs, but also parts of beliefs, so called belief fragments. On the basis of this we will give a belief-dynamical account of the ontological categories of states of affairs, individuals, properties of arbitrary adicities and properties of arbitrary orders.
This paper presents the model of ‘bounded revision’ that is based on two-dimensional revision functions taking as arguments pairs consisting of an input sentence and a reference sentence. The key idea is that the input sentence is accepted as far as (and just a little further than) the reference sentence is ‘cotenable’ with it. Bounded revision satisfies the AGM axioms as well as the Same Beliefs Condition (SBC) saying that the set of beliefs accepted after the revision does not depend on the reference sentence (although the posterior belief state does depend on it). Bounded revision satisfies the Darwiche–Pearl (DP) axioms for iterated belief change. If the reference sentence is fixed to be a tautology or a contradiction, two well-known one-dimensional revision operations result. Bounded revision thus naturally fills the space between conservative revision (also known as natural revision) and moderate revision (also known as lexicographic revision). I compare this approach to the two-dimensional model of ‘revision by comparison’ investigated by Fermé and Rott (Artif Intell 157:5–47, 2004) that satisfies neither the SBC nor the DP axioms. I conclude that two-dimensional revision operations add substantially to the expressive power of qualitative approaches that do not make use of numbers as measures of degrees of belief.
Two systems of belief change based on paraconsistent logics are introduced in this article by means of AGM-like postulates. The first one, AGMp, is defined over any paraconsistent logic which extends classical logic such that the law of excluded middle holds w.r.t. the paraconsistent negation. The second one, AGMo, is specifically designed for paraconsistent logics known as Logics of Formal Inconsistency (LFIs), which have a formal consistency operator that allows one to recover all the classical inferences. Besides the three usual operations over belief sets, namely expansion, contraction and revision (which is obtained from contraction by the Levi identity), the underlying paraconsistent logic allows us to define additional operations involving (non-explosive) contradictions. Thus, external revision (which is obtained from contraction by the reverse Levi identity), consolidation and semi-revision are defined, all of them over belief sets. It is worth noting that the latter operations, introduced by S. Hansson, involve the temporary acceptance of contradictory beliefs, and so they were originally defined only for belief bases. Unlike previous proposals in the literature, which were defined only for specific paraconsistent logics, the present approach can be applied to a general class of paraconsistent logics which are supraclassical, thus preserving the spirit of AGM. Moreover, representation theorems w.r.t. constructions based on selection functions are obtained for all the operations.
We study belief change in the branching-time structures introduced in Bonanno (Artif Intell 171:144–160, 2007). First, we identify a property of branching-time frames that is equivalent (when the set of states is finite) to AGM-consistency, which is defined as follows. A frame is AGM-consistent if the partial belief revision function associated with an arbitrary state-instant pair and an arbitrary model based on that frame can be extended to a full belief revision function that satisfies the AGM postulates. Second, we provide a set of modal axioms that characterize the class of AGM-consistent frames within the modal logic introduced in Bonanno (Artif Intell 171:144–160, 2007). Third, we introduce a generalization of AGM belief revision functions that allows a clear statement of principles of iterated belief revision and discuss iterated revision both semantically and syntactically.
Most belief change operators in the AGM tradition assume an underlying plausibility ordering over the possible worlds which is transitive and complete. A unifying structure for these operators, based on supplementing the plausibility ordering with a second, guiding, relation over the worlds was presented in Booth et al. (Artif Intell 174:1339-1368, 2010). However, it is not always reasonable to assume completeness of the underlying ordering. In this paper we generalise the structure of Booth et al. (Artif Intell 174:1339-1368, 2010) to allow incomparabilities between worlds. We axiomatise the resulting class of belief removal functions, and show that it includes an important family of removal functions based on finite prioritised belief bases.
It is natural and important to have a formal representation of plain belief, according to which propositions are held true, or held false, or neither. (In the paper this is called a deterministic representation of epistemic states). And it is of great philosophical importance to have a dynamic account of plain belief. AGM belief revision theory seems to provide such an account, but it founders at the problem of iterated belief revision, since it can generally account only for one step of revision. The paper discusses and rejects two solutions within the confines of AGM theory. It then introduces ranking functions (as I prefer to call them now; in the paper they are still called ordinal conditional functions) as the proper (and, I find, still the best) solution of the problem, proves that conditional independence w.r.t. ranking functions satisfies the so-called graphoid axioms, and proposes general rules of belief change (in close analogy to Jeffrey's generalized probabilistic conditionalization) that encompass revision and contraction as conceived in AGM theory. Indeed, the parallel to probability theory is amazing. Probability theory can profit from ranking theory as well since it is also plagued by the problem of iterated belief revision even if probability measures are conceived as Popper measures (see No. 11). Finally, the theory is compared with predecessors which are numerous and impressive, but somehow failed to explain the all-important conditional ranks in the appropriate way.
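The ranking-function apparatus this abstract introduces can be sketched in a few lines (toy worlds and rank values are mine): a rank κ(w) is a degree of disbelief, κ(A) is the minimum over the worlds in A, A is believed iff κ(not-A) > 0, and conditional ranks are rank differences.

```python
# Four possible worlds; at least one world must have rank 0.
worlds = {"w1", "w2", "w3", "w4"}
kappa = {"w1": 0, "w2": 1, "w3": 1, "w4": 3}

def rank(A):
    """Rank of a proposition: disbelief in the most plausible A-world."""
    return min(kappa[w] for w in A) if A else float("inf")

def believed(A):
    """A is believed iff its complement is disbelieved to positive degree."""
    return rank(worlds - A) > 0

def cond_rank(B, A):
    """Conditional rank: kappa(B | A) = kappa(A and B) - kappa(A)."""
    return rank(A & B) - rank(A)

A = {"w1", "w2"}
print(believed(A))            # True: every non-A world has positive rank
print(cond_rank({"w2"}, A))   # 1
```

The analogy to probability the abstract stresses is visible here: min plays the role of sum, and subtraction of ranks plays the role of division in conditionalization.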
This paper discusses Jean-Yves Jaffray’s ideas on ambiguity and the views underlying his ideas. His models, developed 20 years ago, provide the most tractable separation of risk attitudes, ambiguity attitudes, and ambiguity beliefs available in the literature today.
This paper extends earlier work by its authors on formal aspects of the processes of contracting a theory to eliminate a proposition and revising a theory to introduce a proposition. In the course of the earlier work, Gärdenfors developed general postulates of a more or less equational nature for such processes, whilst Alchourrón and Makinson studied the particular case of contraction functions that are maximal, in the sense of yielding a maximal subset of the theory (or alternatively, of one of its axiomatic bases), that fails to imply the proposition being eliminated. In the present paper, the authors study a broader class, including contraction functions that may be less than maximal. Specifically, they investigate "partial meet contraction functions", which are defined to yield the intersection of some nonempty family of maximal subsets of the theory that fail to imply the proposition being eliminated. Basic properties of these functions are established: it is shown in particular that they satisfy the Gärdenfors postulates, and moreover that they are sufficiently general to provide a representation theorem for those postulates. Some special classes of partial meet contraction functions, notably those that are "relational" and "transitively relational", are studied in detail, and their connections with certain "supplementary postulates" of Gärdenfors investigated, with a further representation theorem established.
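The remainder-set machinery behind partial meet contraction can be sketched in a toy finite setting (two atoms p and q, formulas represented by their sets of satisfying worlds; this finite-base simplification is mine, not AGM's belief-set formulation): a base K implies x iff every world satisfying all of K satisfies x, the remainders K ⊥ x are the maximal subsets of K that fail to imply x, and partial meet contraction intersects a selection of them.

```python
from itertools import combinations

# Worlds encode truth values of (p, q): 0:(p,q) 1:(p,~q) 2:(~p,q) 3:(~p,~q)
WORLDS = frozenset(range(4))
P = frozenset({0, 1})   # worlds where p holds
Q = frozenset({0, 2})   # worlds where q holds

def models(subset):
    """Worlds satisfying every formula in the (finite) base `subset`."""
    ms = set(WORLDS)
    for f in subset:
        ms &= f
    return ms

def implies(subset, x):
    return models(subset) <= x

def remainders(K, x):
    """Maximal subsets of K that do not imply x (the remainder set K ⊥ x)."""
    subs = [frozenset(c) for r in range(len(K) + 1)
            for c in combinations(K, r) if not implies(c, x)]
    return [s for s in subs if not any(s < t for t in subs)]

K = frozenset({P, Q})
rems = remainders(K, P & Q)   # contract the base {p, q} by p ∧ q
# Two remainders: keep {p} or keep {q}; full meet contraction keeps neither.
print(len(rems))                          # 2
print(frozenset.intersection(*rems))      # frozenset()
```

A selection function choosing just one remainder gives maxichoice contraction; choosing all of them, as above, gives full meet, and the interesting cases lie in between.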
The present article is an elaborated and upgraded version of the Early Career Award talk that I delivered at the IAPR 2019 conference in Gdańsk, Poland. In line with the conference’s thematic focus on new trends and neglected themes in psychology of religion, I argue that psychology of religion should strive for firmer integration with evolutionary theory and its associated methodological toolkit. Employing evolutionary theory makes it possible to systematize findings from individual psychological studies within a broader framework that could resolve lingering empirical contradictions by providing an ultimate rationale for which results should be expected. The benefits of evolutionary analysis are illustrated through the study of collective rituals and, specifically, their purported function in stabilizing risky collective action. By comparing the socio-ecological pressures faced by chimpanzees, contemporary hunter-gatherers, and early Homo, I outline the selective pressures that may have led to the evolution of collective rituals in the hominin lineage, and, based on these selective pressures, I make predictions regarding the different functions and their underlying mechanisms that collective rituals should possess. While examining these functions, I echo the Early Career Award and focus mostly on my past work and the work of my collaborators, showing that collective rituals may stabilize risky collective action by increasing social bonding, enabling the assortment of cooperative individuals, and providing a platform for reliable communication of commitment to group norms. The article closes with a discussion of the role that belief in superhuman agents plays in stabilizing and enhancing the effects of collective rituals on trust-based cooperation.
Traditional Bayesianism requires that an agent’s degrees of belief be represented by a real-valued, probabilistic credence function. However, in many cases it seems that our evidence is not rich enough to warrant such precision. In light of this, some have proposed that we instead represent an agent’s degrees of belief as a set of credence functions. This way, we can respect the evidence by requiring that the set, often called the agent’s credal state, includes all credence functions that are in some sense compatible with the evidence. One known problem for this evidentially motivated imprecise view is that in certain cases, our imprecise credence in a particular proposition will remain the same no matter how much evidence we receive. In this article I argue that the problem is much more general than has been appreciated so far, and that it’s difficult to avoid it without compromising the initial evidentialist motivation. _1_ Introduction _2_ Precision and Its Problems _3_ Imprecise Bayesianism and Respecting Ambiguous Evidence _4_ Local Belief Inertia _5_ From Local to Global Belief Inertia _6_ Responding to Global Belief Inertia _7_ Conclusion.
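The belief inertia worry can be made concrete with a toy credal set (the coin-toss setup and numbers are mine, not the article's): if the credal set contains, for each bias b, a member that puts all prior mass on b, then conditioning every member on the data leaves the set-valued prediction maximally wide, while an open-minded member would learn.

```python
biases = [b / 100 for b in range(1, 100)]   # candidate coin biases 0.01..0.99

def predict(prior, heads, tails):
    """Posterior probability of heads on the next toss, given the data."""
    post = {b: p * b**heads * (1 - b)**tails for b, p in prior.items()}
    z = sum(post.values())
    return sum(b * w for b, w in post.items()) / z

# Dogmatic members: all prior mass on a single bias — the data cannot move them.
dogmatic = [{b: 1.0} for b in biases]
preds = [predict(pr, 60, 40) for pr in dogmatic]
print(round(min(preds), 2), round(max(preds), 2))   # 0.01 0.99 — no narrowing

# Contrast: a single member with a uniform prior over the biases does learn,
# converging toward the observed frequency (roughly 0.6 here).
uniform = {b: 1 / len(biases) for b in biases}
print(round(predict(uniform, 60, 40), 2))
```

The spread over the dogmatic members is the same after 100 tosses as before any evidence, which is the local inertia phenomenon the article generalizes.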
Many philosophers hold that the probability axioms constitute norms of rationality governing degrees of belief. This view, known as subjective Bayesianism, has been widely criticized for being too idealized. It is claimed that the norms on degrees of belief postulated by subjective Bayesianism cannot be followed by human agents, and hence have no normative force for beings like us. This problem is especially pressing since the standard framework of subjective Bayesianism only allows us to distinguish between two kinds of credence functions—coherent ones that obey the probability axioms perfectly, and incoherent ones that don’t. An attractive response to this problem is to extend the framework of subjective Bayesianism in such a way that we can measure differences between incoherent credence functions. This lets us explain how the Bayesian ideals can be approximated by humans. I argue that we should look for a measure that captures what I call the ‘overall degree of incoherence’ of a credence function. I then examine various incoherence measures that have been proposed in the literature, and evaluate whether they are suitable for measuring overall incoherence. The competitors are a qualitative measure that relies on finding coherent subsets of incoherent credence functions, a class of quantitative measures that measure incoherence in terms of normalized Dutch book loss, and a class of distance measures that determine the distance to the closest coherent credence function. I argue that one particular Dutch book measure and a corresponding distance measure are particularly well suited for capturing the overall degree of incoherence of a credence function.
In this paper it is argued that, in order to solve the problem of iterated belief change, both the belief state and its input should be represented as epistemic entrenchment (EE) relations. A belief revision operation is constructed that updates a given EE relation to a new one in light of an evidential EE relation. It is shown that the operation in question satisfies generalized versions of the Gärdenfors revision postulates. The account offered is motivated by Spohn's ordinal conditionalization functions, and can be seen as the Jeffrization of a proposal considered by Rott.
Ordinary usage supports both a relatively strong belief requirement on intention and a tight conceptual connection between intention and intentional action. More specifically, it speaks in favor both of the view that "S intends to A" entails "S believes that he (probably) will A" and of the thesis that "S intentionally A-ed" entails "S intended to A." So, at least, proponents of these ideas often claim or assume, and with appreciable justification. The conjunction of these two ideas, however, has some highly counterintuitive implications. This suggests that a certain skepticism about the coherence of ordinary usage of "intention" may be salutary. Fortunately, the skeptic need not abandon the quest for understanding. Much can be gleaned from a careful investigation of the functions attributed to intention in the literature. In this paper, I argue that the capacity of intention to do the work that the literature assigns it does not depend upon intentional A-ing's entailing intending to A, nor upon there being a strong belief constraint on intention, nor even a certain relatively weak belief constraint. I also develop an account of the features of intention in virtue of which it is capable of doing this work. This account provides the core of an adequate conception of intention. Toward the end of the paper, I briefly motivate acceptance of a modest belief requirement on non-functional grounds.
First, ranking functions are argued to be superior to AGM belief revision theory in two crucial respects. Second, it is shown how ranking functions are uniquely reflected in iterated belief change. More precisely, conditions on threefold contractions are specified which suffice for representing contractions by a ranking function uniquely up to multiplication by a positive integer. Thus, an important advantage AGM theory seemed to have over ranking functions proves to be spurious.