The degrees of belief of rational agents should be guided by the evidence available to them. This paper takes as a starting point the view—argued elsewhere—that the formal model best able to capture this idea is one that represents degrees of belief using Dempster–Shafer belief functions. However, degrees of belief should not only respect evidence: they also guide decision and action. Whatever formal model of degrees of belief we adopt, we need a decision theory that works with it: one that takes as input degrees of belief so represented. The task of this paper is to develop such a decision theory for the belief function model of degrees of belief. This is not the first paper to attempt that task, but compared to the existing literature it takes a more abstract route to its destination, via a consideration of the very idea of rational decision making in light of one's beliefs and desires. After presenting the new decision theory and comparing it to existing views, the paper goes on to consider diachronic decision situations.
Previous work has shown that the Complex Conceptual Spaces - Single Observation mathematical framework is a useful tool for event characterization. This mathematical framework is developed on the basis of Conceptual Spaces and uses integer linear programming to find the needed similarity values. The work of this paper is focused primarily on space event characterization, in particular on the ranking of threats for malicious space events such as a kinetic kill. To make the Conceptual Spaces framework work, the similarity values between the contents of observations on the one hand and the properties of the entities observed on the other need to be found. This paper shows how to exploit Dempster-Shafer theory to implement a statistical approach for finding these similarity values. This approach will allow a user to identify the uncertainty involved in similarity value data, which can later be propagated through the developed mathematical model in order for the user to know the overall uncertainty in the observation-to-concept mappings needed for space event characterization.
The received model of degrees of belief represents them as probabilities. Over the last half century, many philosophers have been convinced that this model fails because it cannot make room for the idea that an agent's degrees of belief should respect the available evidence. In its place they have advocated a model that represents degrees of belief using imprecise probabilities (sets of probability functions). This paper presents a model of degrees of belief based on Dempster–Shafer belief functions and then presents arguments for belief functions over imprecise probabilities as a model of evidence-respecting degrees of belief. The arguments cover three kinds of issue: theoretical virtues (simplicity, interpretability and flexibility); motivations; and problem cases (dilation and belief inertia).
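To make the model concrete, here is a minimal sketch in Python of the basic belief-function machinery: a mass function assigns weight directly to sets of possibilities, and belief and plausibility give lower and upper bounds on events. The weather frame and the numbers are invented for illustration; this is not code from the paper.

```python
def belief(mass, event):
    """Bel(A): total mass committed to subsets of A."""
    return sum(m for focal, m in mass.items() if focal <= event)

def plausibility(mass, event):
    """Pl(A): total mass on focal sets compatible with A."""
    return sum(m for focal, m in mass.items() if focal & event)

# Evidence supports "rain" to degree 0.6 and is otherwise
# uncommitted: the remaining mass sits on the whole frame.
frame = frozenset({"rain", "sun"})
mass = {frozenset({"rain"}): 0.6, frame: 0.4}

A = frozenset({"rain"})
print(belief(mass, A), plausibility(mass, A))  # 0.6 1.0
```

Note how the gap between Bel(A) = 0.6 and Pl(A) = 1.0 records unresolved evidence without committing to any single probability, which is the feature such arguments turn on.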
This paper develops a probabilistic model of belief change under interpretation shifts, in the context of a problem case from dynamic epistemic logic. Van Benthem has shown that a particular kind of belief change, typical for dynamic epistemic logic, cannot be modelled by standard Bayesian conditioning. I argue that the problems described by van Benthem come about because the belief change alters the semantics in which the change is supposed to be modelled: the new information induces a shift in the interpretation of the sentences. In this paper I show that interpretation shifts can be modelled in terms of updating by conditioning. The model derives from the knowledge structures developed by Fagin et al., and hinges on a distinction between the propositional and informational content of sentences. Finally, I show that Dempster-Shafer theory provides the appropriate probability kinematics for the model.
The paper discusses some ways in which vagueness and its phenomena may be thought to impose certain limits on our knowledge and, more specifically, may be thought to bear on the traditional philosophical idea that certain domains of facts are luminous, i.e., roughly, fully open to our view. The discussion focuses on a very influential argument to the effect that almost no such interesting domains exist. Many commentators have felt that the vagueness unavoidably inherent in the description of the facts that are best candidates for being luminous plays an illicit role in such an argument. The paper centres on the idea that vagueness brings with it the prima facie plausibility of soritical principles. Using the diagnostics of sharpenings, it is first pointed out that, despite certain considerations to the contrary, the margin-for-error principle required by the anti-luminosity argument may well derive all of its plausibility from an underlying soritical principle. The notion of confidence that is relevant to the argument is then isolated and sharply distinguished from the notion of subjective probability. Against this background, it is argued that the reasoning about confidence involved in the argument in favour of the problematic margin-for-error principle is fallacious in the same way in which sorites reasoning is. This reveals the possibility of having reliable knowledge even at the penumbral limit with falsity, a possibility for which a concrete formal model is constructed. The model in turn permits a deeper appreciation of the role played in the dialectic by the distinction between confidence and subjective probability as well as by confidence requirements on knowledge. It is concluded that careful heeding of vagueness and its phenomena, far from forcing new and surprising limits on our knowledge, actually removes one of the main barriers – unreliability – often thought to stand in its way.
Epistemology is the study of knowledge and justified belief. Belief is thus central to epistemology. It comes in a qualitative form, as when Sophia believes that Vienna is the capital of Austria, and a quantitative form, as when Sophia's degree of belief that Vienna is the capital of Austria is at least twice her degree of belief that tomorrow it will be sunny in Vienna. Formal epistemology, as opposed to mainstream epistemology (Hendricks 2006), is epistemology done in a formal way, that is, by employing tools from logic and mathematics. The goal of this entry is to give the reader an overview of the formal tools available to epistemologists for the representation of belief. A particular focus will be the relation between formal representations of qualitative belief and formal representations of quantitative degrees of belief.
In artificial intelligence (AI), a number of criticisms were raised against the use of probability for dealing with uncertainty. All these criticisms, except what in this article we call the non-adequacy claim, have eventually been refuted. The non-adequacy claim is an exception because, unlike the other criticisms, it is distinctly philosophical and, possibly for this reason, it was not discussed in the technical literature. A lack of clarity and understanding of this claim had a major impact on AI. Indeed, leaning mostly on this claim, some scientists developed an alternative research direction and, as a result, the AI community split into two schools: a probabilistic one and an alternative one. In this article, we argue that the non-adequacy claim has a strongly metaphysical character and, as such, should not be accepted as a conclusive argument against the adequacy of probability.
Combining testimonial reports from independent and partially reliable information sources is an important epistemological problem of uncertain reasoning. Within the framework of Dempster–Shafer theory, we propose a general model of partially reliable sources, which includes several previously known results as special cases. The paper reproduces these results on the basis of a comprehensive model taxonomy. This gives a number of new insights and thereby contributes to a better understanding of this important application of reasoning with uncertain and incomplete information.
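As a concrete illustration of the kind of combination at issue (a generic sketch of Dempster's rule, not the paper's specific taxonomy of source models), here two independent, partially reliable witnesses are modelled as simple support functions; the reliability figures are invented.

```python
from collections import defaultdict

def dempster_combine(m1, m2):
    """Dempster's rule: multiply masses on all pairs of focal sets,
    keep non-empty intersections, renormalize away the conflict."""
    combined = defaultdict(float)
    conflict = 0.0
    for a, w1 in m1.items():
        for b, w2 in m2.items():
            meet = a & b
            if meet:
                combined[meet] += w1 * w2
            else:
                conflict += w1 * w2
    return {focal: w / (1.0 - conflict) for focal, w in combined.items()}

# A witness of reliability r asserting h induces the simple support
# function m({h}) = r, m(frame) = 1 - r: the mass on the frame
# encodes the possibility that the witness is unreliable.
frame = frozenset({"h", "not_h"})
witness1 = {frozenset({"h"}): 0.8, frame: 0.2}
witness2 = {frozenset({"h"}): 0.6, frame: 0.4}

combined = dempster_combine(witness1, witness2)
print(combined[frozenset({"h"})])  # 0.92, i.e. 1 - (1-0.8)*(1-0.6)
```

The combined support 1 - (1 - r1)(1 - r2) is the classical result for two independent concurring testimonies, one of the special cases a general model of partially reliable sources is expected to recover.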
The Dempster–Shafer approach to expressing belief about a parameter in a statistical model is not consistent with the likelihood principle. This inconsistency has been recognized for some time, and manifests itself as a non-commutativity, in which the order of operations (combining belief, combining likelihood) makes a difference. It is proposed here that requiring the expression of belief to be committed to the model (and to certain of its submodels) makes likelihood inference very nearly a special case of the Dempster–Shafer theory.
The starting point of this work is the gap between two distinct traditions in information engineering: knowledge representation and data-driven modelling. The first tradition emphasizes logic as a tool for representing beliefs held by an agent. The second tradition claims that the main source of knowledge is observed data, and generally does not use logic as a modelling tool. However, the emergence of fuzzy logic has blurred the boundaries between these two traditions by putting forward fuzzy rules as a Janus-faced tool that may represent knowledge, as well as approximate non-linear functions representing data. This paper lays bare the logical foundations of data-driven reasoning, whereby a set of formulas is understood as a set of observed facts rather than a set of beliefs. Several representation frameworks are considered from this point of view: classical logic, possibility theory, belief functions, epistemic logic, fuzzy rule-based systems. Mamdani's fuzzy rules are recovered as belonging to the data-driven view. In possibility theory a third set-function, different from possibility and necessity, plays a key role in the data-driven view, and corresponds to a particular modality in epistemic logic. A bi-modal logic system is presented which handles both beliefs and observations, and for which a completeness theorem is given. Lastly, our results may shed new light on deontic logic and allow for a distinction between explicit and implicit permission that standard deontic modal logics do not often emphasize.
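A small sketch may help locate the third set-function mentioned here. Reading the abstract as pointing to the guaranteed-possibility measure Delta (an assumption on my part), the three functions over a possibility distribution pi can be written as follows; the distribution is invented.

```python
def possibility(pi, event):
    """Pi(A): degree to which A is consistent with what was observed."""
    return max(pi[w] for w in event)

def necessity(pi, event):
    """N(A): degree to which the observations entail A (dual of Pi)."""
    complement = set(pi) - set(event)
    return 1.0 - (max(pi[w] for w in complement) if complement else 0.0)

def guaranteed_possibility(pi, event):
    """Delta(A): degree to which every element of A is actually
    supported by the data, the data-driven counterpart of Pi and N."""
    return min(pi[w] for w in event)

# Invented possibility distribution over observed weather classes.
pi = {"cold": 0.2, "mild": 1.0, "warm": 0.7}
A = {"mild", "warm"}
print(possibility(pi, A), necessity(pi, A), guaranteed_possibility(pi, A))
# 1.0 0.8 0.7
```

Where N(A) says the agent's beliefs rule out everything outside A, Delta(A) says each case in A has in fact been observed to some degree, which is why it suits the data-driven rather than the belief-driven reading.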
A modal logic interpretation of Dempster-Shafer theory is developed in the framework of multivalued models of modal logic, i.e. models in which in any possible world an arbitrary number (possibly zero) of atomic propositions can be true. Several approaches to conditioning in multivalued models of modal logic are presented.
We are concerned with formal models of reasoning under uncertainty. Many approaches to this problem are known in the literature, e.g. Dempster-Shafer theory, Bayesian-based reasoning, belief networks, many-valued logics and fuzzy logics, non-monotonic logics, and neural network logics. We propose rough mereology, developed by the last two authors [22-25], as a foundation for approximate reasoning about complex objects. Our notion of a complex object includes, among others, proofs understood as schemes constructed in order to support, within our knowledge, assertions/hypotheses about a reality that our knowledge describes only incompletely.
Builds on classical probability theory and offers an extremely workable solution to the many problems of artificial intelligence, concentrating on the rapidly growing areas of fuzzy reasoning and neural computing. Contains a collection of previously unpublished articles by leading researchers in the field.
There are a number of reasons for being interested in uncertainty, and there are also a number of uncertainty formalisms. These formalisms are not unrelated. It is argued that they can all be reflected as special cases of the approach of taking probabilities to be determined by sets of probability functions defined on an algebra of statements. Thus, interval probabilities should be construed as maximum and minimum probabilities within a set of distributions, Glenn Shafer's belief functions should be construed as lower probabilities, etc. Updating probabilities introduces new considerations, and it is shown that the representation of belief as a set of probabilities conflicts in this regard with the updating procedures advocated by Shafer. The attempt to make subjectivistic probability plausible as a doctrine of rational belief by making it more flowery — i.e., by adding new dimensions — does not succeed. But, if one is going to represent beliefs by sets of distributions, those sets of distributions might as well be based in statistical knowledge, as they are in epistemological or evidential probability.
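The lower-probability construal described here is easy to state in code. The following minimal sketch (with an invented set of distributions) computes interval probabilities as minima and maxima of event probabilities over a set of distributions.

```python
def lower_probability(credal_set, event):
    """Minimum probability of an event over a set of distributions."""
    return min(sum(p[w] for w in event) for p in credal_set)

def upper_probability(credal_set, event):
    """Maximum probability of the same event over the same set."""
    return max(sum(p[w] for w in event) for p in credal_set)

# An invented set of distributions over three outcomes.
credal_set = [
    {"a": 0.5, "b": 0.3, "c": 0.2},
    {"a": 0.4, "b": 0.4, "c": 0.2},
    {"a": 0.6, "b": 0.1, "c": 0.3},
]
event = {"a", "b"}
print(lower_probability(credal_set, event))  # 0.7
print(upper_probability(credal_set, event))  # 0.8
```

On the construal the abstract sketches, a belief function's Bel(A) is read as such a lower envelope; the disagreement then concerns how the underlying set of distributions should be updated.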
The main ingredients of Spohn's theory of epistemic beliefs are (1) a functional representation of an epistemic state called a disbelief function and (2) a rule for revising this function in light of new information. The main contribution of this paper is as follows. First, we provide a new axiomatic definition of an epistemic state and study some of its properties. Second, we study some properties of an alternative functional representation of an epistemic state called a Spohnian belief function. Third, we state a rule for combining disbelief functions that is mathematically equivalent to Spohn's belief revision rule. Whereas Spohn's rule is defined in terms of the initial epistemic state and some features of the final epistemic state, the rule of combination is defined in terms of the initial epistemic state and the incremental epistemic state representing the information gained. Fourth, we state a rule of subtraction that allows one to recover the addendum epistemic state from the initial and final epistemic states. Fifth, we study some properties of our rule of combination. One distinct advantage of our rule of combination is that besides belief revision, it can be used to describe an initial epistemic state for many variables when this information is given as several independent epistemic states each involving few variables. Another advantage of our reformulation is that we can show that Spohn's theory of epistemic beliefs shares the essential abstract features of probability theory and the Dempster-Shafer theory of belief functions. One implication of this is that we have a ready-made algorithm for propagating disbelief functions using only local computation.
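For readers unfamiliar with the formalism, here is a minimal sketch (with invented ranks) of the kind of combination described: disbelief functions combine by pointwise addition of ranks followed by normalization, with addition playing the role that multiplication of masses plays in Dempster's rule. The details follow Shenoy's presentation as I understand it, so treat the code as illustrative rather than as the paper's definition.

```python
def combine_disbelief(k1, k2):
    """Combine two disbelief (ranking) functions over the same frame:
    add ranks pointwise, then shift so the minimum disbelief is 0."""
    raw = {w: k1[w] + k2[w] for w in k1}
    shift = min(raw.values())
    return {w: r - shift for w, r in raw.items()}

# Rank 0 = not disbelieved; higher ranks = more firmly disbelieved.
k1 = {"h1": 0, "h2": 1, "h3": 3}   # initial epistemic state
k2 = {"h1": 2, "h2": 0, "h3": 1}   # incremental evidence
print(combine_disbelief(k1, k2))   # {'h1': 1, 'h2': 0, 'h3': 3}
```

The corresponding subtraction rule would recover the incremental state k2 (up to normalization) from the initial and final states, which is the recovery property the abstract highlights.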
Degrees of belief; Dempster's rule of combination; Simple and separable support functions; The weights of evidence; Compatible frames of discernment; Support functions; The discernment of evidence; Quasi support functions; Consonance; Statistical evidence; The dual nature of probable reasoning.