This paper calls for a re-appraisal of McGee's analysis of the semantics, logic and probabilities of indicative conditionals presented in his 1989 paper 'Conditional probabilities and compounds of conditionals'. The probabilistic measures introduced by McGee are given a new axiomatisation built on the principle that the antecedent of a conditional is probabilistically independent of the conditional, and a more transparent method of constructing such measures is provided. McGee's Dutch book argument is restructured to reveal more clearly that it makes a novel contribution to the epistemology of semantic indeterminacy, and to show that its more controversial implications are unavoidable if we want to maintain the Ramsey Test along with the standard laws of probability. Importantly, it is shown that the counterexamples that have been levelled at McGee's analysis (generating a rather wide consensus that it yields 'unintuitive' or 'wrong' probabilities for compounds) fail to strike at their intended target; for to honour the intuitions of the counterexamples one must give up either the Ramsey Test or the standard laws of probability. It is argued that we need to give up neither if we take the counterexamples as further evidence that the indicative conditional sometimes allows for a non-epistemic 'causal' interpretation alongside its usual epistemic interpretation.
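The independence principle mentioned above delivers the familiar conditional-probability equation in two steps (a sketch; treating $(A \to B) \wedge A$ as equivalent to $A \wedge B$ is an assumption made here for illustration):

```latex
% Independence: the antecedent is probabilistically independent of the conditional.
P\bigl((A \to B) \wedge A\bigr) = P(A \to B)\,P(A)
% Assuming (A \to B) \wedge A is equivalent to A \wedge B:
P\bigl((A \to B) \wedge A\bigr) = P(A \wedge B)
% Combining the two, provided P(A) > 0:
P(A \to B) = \frac{P(A \wedge B)}{P(A)} = P(B \mid A)
```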
A globally expressivist analysis of the indicative conditional based on the Ramsey Test is presented. The analysis is a form of ‘global’ expressivism in that it supplies acceptance and rejection conditions for all the sentence forming connectives of propositional logic and so allows the conditional to embed in arbitrarily complex sentences. The expressivist framework is semantically characterized in a restrictor semantics due to Vann McGee, and is completely axiomatized in a logic dubbed ICL. The expressivist framework extends the AGM framework for belief revision and so provides a categorical epistemology for conditionals that complements McGee’s probabilistic framework while drawing on the same semantics. The result is an account of the semantics and acceptability conditions of the indicative conditional that fits well with the linguistic data while integrating both expressivist and semanticist perspectives.
It is argued that indicative conditionals are best viewed as having truth conditions (and so they are in part factual) but that these truth conditions are ‘gappy’ which leaves an explanatory gap that can only be filled by epistemic considerations (and so indicative conditionals are in part epistemic). This dual nature of indicative conditionals gives reason to rethink the relationship between logic viewed as a descriptive discipline (focusing on semantics) and logic viewed as a discipline with a normative import (focusing on epistemic notions such as ‘reasoning’, ‘beliefs’ and ‘assumptions’). In particular, it is argued that the development of formal models for epistemic states can serve as a starting point for exploring logic when viewed as a normative discipline.
An alleged counterexample to causal decision theory, put forward by Andy Egan, is studied in some detail. It is argued that Egan rejects the evaluation of causal decision theory on the basis of a description of the decision situation that is different from—indeed inconsistent with—the description on which causal decision theory makes its evaluation. So the example is not a counterexample to causal decision theory. Nevertheless, the example shows that causal decision theory can recommend unratifiable acts, which presents a problem in the dynamics of intentions. It is argued that we can defuse this problem if we hold that decision theory is a theory of rational decision making rather than a theory of rational acts. It is shown how decisions can have epistemic side-effects that are not mediated by the act and that there are cases where one can only bring oneself to perform the best act by updating by imaging rather than by conditioning. This provides a pragmatic argument for updating by imaging rather than by conditioning in these cases.
Conditionals that contain a modality in the consequent give rise to a particular semantic phenomenon whereby the antecedent of the conditional blocks possibilities when interpreting the modality in the consequent. This explains the puzzling logical behaviour of constructions like "If you don't buy a lottery ticket, you can't win", "If you eat that poison, it is unlikely that you will survive the day" and "If you kill Harry, you ought to kill him gently". In this paper it is argued that a semantic version of the Ramsey Test provides a key in the analysis of such constructions. The logic for this semantics is axiomatized and some examples are studied, among them a well-known puzzle for contrary-to-duty obligations.
It is argued that the "inner" negation $\mathord{\sim}$ familiar from 3-valued logic can be interpreted as a form of "conditional" negation: $\mathord{\sim} A$ is read '$A$ is false if it has a truth value'. It is argued that this reading squares well with a particular 3-valued interpretation of a conditional that in the literature has been seen as a serious candidate for capturing the truth conditions of the natural language indicative conditional (e.g., "If Jim went to the party he had a good time"). It is shown that the logic induced by the semantics shares many familiar properties with classical negation, but is orthogonal to both intuitionistic and classical negation: it differs from both in validating the inference from $A \rightarrow \mathord{\sim} B$ to $\mathord{\sim}(A\rightarrow B)$.
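A small brute-force check illustrates the contrast. The conditional below uses a de Finetti-style table (it takes the consequent's value when the antecedent is true, and no truth value otherwise); this is an assumption of the sketch, chosen as a standard 3-valued candidate, and need not match the paper's own table in every detail:

```python
# Three truth values: 'T' (true), 'F' (false), 'N' (no truth value).
VALS = ['T', 'F', 'N']

def inner_neg(a):
    # "Conditional" negation: A is false if it has a truth value,
    # otherwise it too lacks a truth value.
    return {'T': 'F', 'F': 'T', 'N': 'N'}[a]

def cond(a, b):
    # De Finetti-style conditional (an assumption of this sketch):
    # the consequent's value when the antecedent is true, else no value.
    return b if a == 'T' else 'N'

# Check that A -> ~B entails ~(A -> B): whenever the premise is true,
# so is the conclusion.
for a in VALS:
    for b in VALS:
        premise = cond(a, inner_neg(b))
        conclusion = inner_neg(cond(a, b))
        if premise == 'T':
            assert conclusion == 'T'

# Classically the inference fails: with A false, A -> not-B is true
# while not-(A -> B) is false.
```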
This paper explores the possibility that causal decision theory can be formulated in terms of probabilities of conditionals. It is argued that a generalized Stalnaker semantics in combination with an underlying branching time structure not only provides the basis for a plausible account of the semantics of indicative conditionals, but also that the resulting conditionals have properties that make them well-suited as a basis for formulating causal decision theory. Decision theory (at least if we omit the frills) is not an esoteric science, however unfamiliar it may seem to an outsider. Rather it is a systematic exposition of the consequences of certain well-chosen platitudes about belief, desire, preference and choice. It is the very core of our common-sense theory of persons, dissected out and elegantly systematized. (David Lewis, Synthese 23:331–344, 1974, p. 337). A small distortion in the analysis of the conditional may create spurious problems with the analysis of other concepts. So if the facts about usage favor one among a number of subtly different theories, it may be important to determine which one it is. (Robert Stalnaker, A Defense of Conditional Excluded Middle, pp. 87–104, 1980, p. 87).
Five types of constructions are introduced for non-prioritized belief revision, i.e., belief revision in which the input sentence is not always accepted. These constructions include generalizations of entrenchment-based and sphere-based revision. Axiomatic characterizations are provided, and close interconnections are shown to hold between the different constructions.
Non-bivalent languages (languages containing sentences that can be true, false or neither) are given a probabilistic interpretation in terms of betting quotients. Necessary and sufficient conditions for avoiding Dutch books—the laws of non-bivalent probability—in such a setting are provided.
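One common way to make betting quotients sensitive to truth-value gaps is to treat a bet on a gappy sentence as called off, in the style of conditional bets. A toy sketch under that assumption (the numbers and the called-off convention are illustrative, not taken from the paper):

```python
# A toy model: probabilities of the three semantic statuses of a sentence.
p_true, p_false, p_gap = 0.5, 0.3, 0.2

def expected_payoff(q):
    # Conditional-bet convention (an assumption of this sketch): a unit bet
    # at quotient q pays (1 - q) if the sentence is true, costs q if it is
    # false, and is called off (payoff 0) if the sentence is neither.
    return p_true * (1 - q) + p_false * (-q) + p_gap * 0.0

# The fair quotient renormalises over the bivalent cases.
fair_q = p_true / (p_true + p_false)

assert abs(expected_payoff(fair_q)) < 1e-12
```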
The problems that surround iterated contractions and expansions of beliefs are approached by studying hypertheories, a generalisation of Adam Grove's notion of systems of spheres. By using a language with dynamic and doxastic operators different ideas about the basic nature of belief change are axiomatised. It is shown that by imposing quite natural constraints on how hypertheories may change, the basic logics for belief change can be strengthened considerably to bring one closer to a theory of iterated belief change. It is then argued that the logic of expansion, in particular, cannot without loss of generality be strengthened any further to allow for a full logic of iterated belief change. To remedy this situation a notion of directed expansion is introduced that allows for a full logic of iterated belief change. The new operation is given an axiomatisation that is complete for linear hypertheories.
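For reference, the sphere systems that hypertheories generalise support a simple revision recipe: select the input-worlds in the smallest sphere the input intersects. A minimal sketch with hypothetical worlds and propositions (hypertheories drop the requirement that the spheres be nested):

```python
# A system of spheres as a list of sets of worlds, innermost first.
spheres = [{'w1'}, {'w1', 'w2'}, {'w1', 'w2', 'w3', 'w4'}]

def revise(spheres, proposition):
    # Revision by a proposition selects the proposition-worlds in the
    # smallest sphere that intersects the proposition.
    for sphere in spheres:
        hit = sphere & proposition
        if hit:
            return hit
    return set()

assert revise(spheres, {'w2', 'w3'}) == {'w2'}
```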
The paper presents a non-monotonic inference relation on a language containing a conditional that satisfies the Ramsey Test. The logic is a weakening of classical logic and preserves many of the ‘paradoxes of implication’ associated with the material implication. It is argued, however, that once one makes the proper distinction between supposing that something is the case and accepting that it is the case, these ‘paradoxes’ cease to be counterintuitive. A representation theorem is provided where conditionals are given a non-bivalent semantics and epistemic states are represented via preferential models.
This paper provides finitary jointly necessary and sufficient acceptance and rejection conditions for the logical constants of a first order quantificational language. By introducing the notion of making an assignment as a distinct object level practice—something you do with a sentence—(as opposed to a meta-level semantic notion) and combining this with the practice of (hypothetical and categorical) acceptance and rejection and the practice of making suppositions one gains a structure that is sufficiently rich to fully characterize the class of classical first order theories. The analysis thus provides a way of characterizing classical first order quantification by expressivist means.
A semantics is presented for belief revision in the face of common announcements to a group of agents that have beliefs about each other's beliefs. The semantics is based on the idea that possible worlds can be viewed as having an internal structure, representing the belief independent features of the world, and the respective belief states of the agents in a modular fashion. Modularity guarantees that changing one aspect of the world (a belief independent feature or a belief state) has no effect on any other aspect of the world. This allows us to employ an AGM-style selection function to represent revision. The semantics is given a complete axiomatisation (identical to the axiomatisation found by Gerbrandy and Groeneveld for a semantics based on non-wellfounded set theory) for the special case of expansion.
Information received from different sources can be inconsistent. Even when the sources of information can be ordered on the basis of their trustworthiness, it turns out that extracting an acceptable notion of support for information is a non-trivial matter, as is the question of what information a rational agent should accept. Here it is shown how a support ordering on the information can be generated and how it can be used to decide what information to accept and what not to accept. This ordering, it turns out, is closely related to notions such as Epistemic Entrenchment and Grove spheres studied in belief revision.
It is argued that expressivists can solve their problems in accounting for the unity and autonomy of logic – logic is topic independent and does not derive from a general ‘logic’ of mental states – by adopting an analysis of the logical connectives that takes logically complex sentences to express complex combinations of simple attitudes like belief and disapproval and dispositions to form such simple attitudes upon performing suppositional acts, and taking acceptance and rejection of sentences to be the common mental denominator in descriptive and evaluative discourse, and structural requirements governing these to be the basis for logic. Such an account requires that attitudes like belief, intention and disapproval can come in hypothetical mode – plausibly linked to the capacity to mentally simulate or emulate one's own attitudes – and, if correct, suggests that these form the basic building blocks for our capacity to understand logically complex sentences.
Safety factor rules are used for drawing putatively reasonable conclusions from incomplete datasets. The paper attempts to provide answers to four questions: “How are safety factors used?”, “When are safety factors used?”, “Why are safety factors used?” and “How do safety factor rules relate to decision theory?”. The authors conclude that safety factor rules should be regarded as decision methods rather than as criteria of rightness and that they can be used in both practical and theoretical reasoning. Simplicity of application and inability or unwillingness to defer judgment appear to be important factors in explaining why the rules are used.
The typical function of goals is to regulate action in a way that furthers goal achievement. Goals are typically set on the assumption that they will help bring the agent closer to the desired state of affairs. However, sometimes endorsement of a goal, or the processes by which the goal is set, can obstruct its achievement. When this happens, the goal is self-defeating. Self-defeating goals are common in both private and social decision-making but have not received much attention from decision theorists. In this paper, we investigate different variants of three major types of self-defeating mechanisms: the goal can be an obstacle to its own fulfilment, goal-setting activities can impede goal achievement, and disclosure of the goal can interfere with its attainment. Different strategies against self-defeasance are tentatively explored, and their efficiency against different types of self-defeasance is investigated.
The connectives of classical propositional logic are given an analysis in terms of necessary and sufficient conditions of acceptance and rejection, i.e. the connectives are analyzed within an expressivist bilateral meaning-is-use framework. It is explained how such a framework differs from standard inferentialist frameworks and it is argued that it is better suited to address the particular issues raised by the expressivist thesis that the meaning of a sentence is determined by the mental state that it is conventionally used to express. Furthermore, it is shown that the classical requirements governing the connectives completely characterize classical logic, are conservative and separable, are in bilateral harmony, are structurally preservative with respect to the classical coordination requirements and resolve the categoricity problem. These results are taken to show that one can give an expressivist bilateral meaning-is-use analysis of the connectives that confers on them a determinate coherent classical interpretation.
The logic of dominance arguments is analyzed using two different kinds of conditionals: indicative (epistemic) and subjunctive (counter-factual). It is shown that on the indicative interpretation an assumption of independence is needed for a dominance argument to go through. It is also shown that on the subjunctive interpretation no assumption of independence is needed once the standard premises of the dominance argument are true, but that independence plays an important role in arguing for the truth of the premises of the dominance argument. A key feature of the analysis is the interpretation of the doubly conditional comparative "I will get a better outcome if A than if B" which is taken to have the structure "(the outcome if A) is better than (the outcome if B)".
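The role of independence can be seen in a toy expected-utility calculation (hypothetical payoffs and probabilities): one act dominates state-by-state, yet when the states are evidentially correlated with the acts the dominated act maximises expected utility, so the dominance argument needs an independence assumption to go through.

```python
# Payoffs: act A strictly dominates act B in every state.
u = {('A', 'S1'): 10, ('A', 'S2'): 1,
     ('B', 'S1'): 9,  ('B', 'S2'): 0}

# But the states are evidentially correlated with the acts
# (hypothetical numbers): choosing A makes the bad state S2 likely.
p_state_given_act = {('S1', 'A'): 0.1, ('S2', 'A'): 0.9,
                     ('S1', 'B'): 0.9, ('S2', 'B'): 0.1}

def eu(act):
    # Expected utility computed with act-dependent state probabilities.
    return sum(p_state_given_act[(s, act)] * u[(act, s)]
               for s in ('S1', 'S2'))

# A dominates state-by-state, yet B has the higher expected utility.
assert all(u[('A', s)] > u[('B', s)] for s in ('S1', 'S2'))
assert eu('B') > eu('A')
```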
When a belief set is contracted only some beliefs are eligible for removal. By introducing eligibility for removal as a new semantic primitive for contraction and combining it with epistemic entrenchment we get a contraction operator with a number of interesting properties. By placing some minimal constraint upon eligibility we get an explicit contraction recipe that exactly characterises the so-called interpolation thesis, a thesis that states upper and lower bounds for the amount of information to be given up in contraction. As a result we drop the controversial property of recovery. By placing additional constraints on eligibility we get representation theorems for a number of contraction operators of varying strength. In addition it is shown that recovery contraction is a special case that we get if eligibility is explicitly constructed in terms of logical relevance.
This paper is about the statics and dynamics of belief states that are represented by pairs consisting of an agent's credences (represented by a subjective probability measure) and her categorical beliefs (represented by a set of possible worlds). Regarding the static side, we argue that the latter proposition should be coherent with respect to the probability measure and that its probability should reach a certain threshold value. On the dynamic side, we advocate Jeffrey conditionalisation as the principal mode of changing one's belief state. This updating method fits the idea of the Lockean Thesis better than plain Bayesian conditionalisation, and it affords a flexible method for adding and withdrawing categorical beliefs. We show that it fails to satisfy the traditional principles of Inclusion and Preservation for belief revision and the principle of Recovery for belief withdrawals, as well as the Levi and Harper identities. We take this to be a problem for the latter principles rather than for the idea of coherent belief change.
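Jeffrey conditionalisation and a threshold-style belief set can be sketched as follows (the numbers, the partition, and the crude "strongest high-probability proposition" recipe are illustrative assumptions, not the paper's own coherence condition):

```python
# Worlds and a prior credence function (hypothetical numbers).
prior = {'w1': 0.5, 'w2': 0.3, 'w3': 0.2}

def jeffrey(p, partition_weights):
    # Jeffrey conditionalisation: each cell E of the partition receives
    # its new weight q_E, keeping the proportions within each cell.
    new = {}
    for cell, q in partition_weights:
        p_cell = sum(p[w] for w in cell)
        for w in cell:
            new[w] = q * p[w] / p_cell
    return new

# Experience shifts the weights of {w1} vs {w2, w3} to 0.2 / 0.8.
post = jeffrey(prior, [({'w1'}, 0.2), ({'w2', 'w3'}, 0.8)])

def lockean_belief(p, threshold):
    # A crude Lockean belief set: collect the most probable worlds until
    # their total probability reaches the threshold.
    belief, mass = set(), 0.0
    for w in sorted(p, key=p.get, reverse=True):
        if mass >= threshold:
            break
        belief.add(w)
        mass += p[w]
    return belief
```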
It is argued that the truth value of a sentence containing free variables in a context of use, just like the reference of the free variables concerned, depends on the assumptions and posits given by the context. However, context may under-determine the reference of a free variable and the truth value of sentences in which it occurs. It is argued that in such cases a free variable has indeterminate reference and a sentence in which it occurs may have indeterminate truth value. On letting, say, x be such that \, the sentence ‘Either \ or \’ is true but the sentence ‘\’ has an indeterminate truth value: it is determinate that the variable x refers to either 2 or \, but it is indeterminate which of the two it refers to; as a result ‘\’ has a truth value but its truth value is indeterminate. The semantic indeterminacy is analysed in a ‘radically’ supervaluational semantic framework closely analogous to the treatment of vagueness in McGee and McLaughlin and Smith, which saves bivalence, the T-schema and the truth-functional analysis of the boolean connectives. It is shown that on such an analysis the modality ‘determinately’ is quite clearly not an epistemic modality, avoiding a potential objection raised by Williamson against such ‘radically’ supervaluational treatments of vagueness, and that determinate truth is the semantic value preserved in classically valid arguments. The analysis is contrasted with the epistemicist proposal of Breckenridge and Magidor which implies that ‘\’ has a determinate but unknowable truth value.
Vann McGee has proposed a counterexample to the Ramsey Test. In the counterexample, a seemingly trustworthy source has testified that p and that if not-p, then q. If one subsequently learns not-p, then one has reason to doubt the trustworthiness of the source and so, the argument goes, one has reason to doubt the conditional asserted by the source. Since what one learns is that the antecedent of the conditional holds, these doubts are contrary to the Ramsey Test. We argue that the counterexample fails. It rests on a principle of testimonial dependence that is not applicable when a source hedges his or her claims.
A formal model for updates—the result of learning that the world has changed—in a multi-agent setting is presented and completely axiomatized. The model allows that several agents simultaneously are informed of an event in the world in such a way that it becomes common knowledge among the agents that the event has occurred. The model shares many features with the model for common announcements—an announcement about the state of the world in which it becomes common knowledge among the audience that the announcement has been made—presented in Cantwell (2005), but exploits the difference between learning that a state of the world obtains and learning that the state of the world has changed.
The view that decision methods can only be justified by appeal to pragmatic considerations is defended. Pragmatic considerations are viewed as providing the underlying subject matter (“semantics”) of decision theories. It is argued that other approaches (e.g. justifying principles by appeal to obviousness, common usage, etc.) fail to provide grounds for a normative decision theory. It is argued that preferences that can lead to pragmatically adverse outcomes in a relevantly similar possible decision situation are pragmatically unsound, even if the decision situation never arises. This rebuts several standard objections to money-pump and Dutch book arguments. However, because one can only appeal to relevantly similar decision situations in pragmatic arguments, these will have a less general scope than is often imagined. A conclusion is that pragmatic arguments for strong unconditional principles such as ‘always maximise expected utility!’ do not work. Pragmatic considerations can however be used to argue for conditional principles of the form ‘if conditions X, Y and Z are satisfied, then one ought to satisfy W’, where W need not follow logically from X, Y and Z. The notion of a sound pragmatic argument is defined in terms of a particular notion of coherence; it is shown how this can be applied and how it handles problematic cases such as van Fraassen’s Dutch book for the principle of Reflection.
Ever since [4], systems of spheres have been considered an intuitive and elegant way to give a semantics for logics of theory- or belief-change. Several authors [5, 11] have considered giving up the rather strong assumption that systems of spheres be linearly ordered by inclusion. These more general structures are called hypertheories after [8]. It is shown that none of the proposed logics induced by these weaker structures are compact and thus cannot be given a strongly complete axiomatization in a finitary logic. Complete infinitary axiomatizations are given for several intuitive logics based on hypertheories that are not linearly ordered by inclusion.