"A Survey of Ranking Theory": The paper gives an up-to-date survey of ranking theory. It carefully explains the basics. It elaborates on the ranking theoretic explication of reasons and their balance. It explains the dynamics of belief statable in ranking terms and indicates how the ranks can thereby be measured. It suggests how the theory of Bayesian nets can be carried over to ranking theory. It indicates what it might mean to objectify ranks. It discusses the formal and the philosophical (...) aspects of the tight relation and the complementarity of ranks and probabilities. It closes with comparative remarks on predecessors and other philosophical proposals as well as formal models developed in AI. (shrink)
The paper argues that the objects of belief should not be conceived as sets of possible worlds or propositions, nor as sets of centered possible worlds or egocentric propositions (this is the propositional conception), but rather as sets of pairs consisting of a centered world and a sequence of objects (this is the intentional conception of the objects of belief). The paper explains the deep significance of this thesis for the framework of two-dimensional semantics, indeed for any framework trying to adequately relate semantics and epistemology (a relation here construed as what I call the Congruence Principle). I give three arguments for this thesis: two preliminary, indecisive ones by way of examples, and a third, theoretical one alluding to a deep principle of philosophical psychology (which I call the Invariance Principle). This paper is an improved and updated version of my paper No. 25. It will appear in Causation, Coherence, and Concepts: A Collection of Essays.
The paper identifies two major strands of truth theories, ontological and epistemological ones, and argues that both are of equal primacy and find their home within two-dimensional semantics. Contrary to received views, it argues further that epistemological truth theories operate on Lewisian possible worlds and ontological truth theories on Wittgensteinian possible worlds, and that both are mediated by the so-called epistemic-ontic map, the further specification of which is of utmost philosophical importance.
Ranking theory delivers an account of iterated contraction; each ranking function induces a specific iterated contraction behavior. The paper shows how to reconstruct a ranking function from its iterated contraction behavior, uniquely up to a multiplicative constant, and thus how to measure ranks on a ratio scale. Thereby, it also shows how to completely axiomatize that behavior. The complete set of laws of iterated contraction it specifies amends the laws hitherto discussed in the literature.
This paper introduces a new equilibrium concept for normal-form games called dependency equilibrium; it is defined, exemplified, and compared with Nash and correlated equilibria in Sections 2–4. Its philosophical motive is to rationalize cooperation in the one-shot prisoners' dilemma. A brief discussion of its meaningfulness in Section 5 concludes the paper.
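As a rough sketch of the defining condition (my reconstruction from the literature on dependency equilibria; the paper's Section 2 gives the official definition, including the treatment of zero-probability actions):

```latex
% Let p be a joint probability over the players' action profiles;
% unlike in Nash equilibrium, the players' actions may be correlated.
% p is a dependency equilibrium iff every action played with positive
% probability is optimal given itself, i.e., for each player i and all
% actions a_i, b_i of i with p(a_i) > 0 and p(b_i) > 0:
\sum_{a_{-i}} p(a_{-i} \mid a_i)\, u_i(a_i, a_{-i})
\;\ge\;
\sum_{a_{-i}} p(a_{-i} \mid b_i)\, u_i(b_i, a_{-i})
% Since the conditional distribution over the others' actions may vary
% with one's own action, a distribution favoring joint cooperation in
% the prisoners' dilemma can satisfy this condition.
```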
The paper builds on the basically Humean idea that A is a cause of B iff A and B both occur, A precedes B, and A raises the metaphysical or epistemic status of B given the obtaining circumstances. It argues that in pursuit of a theory of deterministic causation this 'status raising' is best explicated not in regularity or counterfactual terms, but in terms of ranking functions. On this basis, it constructs a rigorous theory of deterministic causation that successfully deals with cases of overdetermination and pre-emption. It finally indicates how the account's profound epistemic relativization induced by ranking theory can be undone. Contents: Introduction; Variables, propositions, time; Induction first; Causation; Redundant causation; Objectivization.
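A schematic, hedged rendering of the 'status raising' clause in ranking terms (the book's official definitions are more careful about the circumstances and about direct versus indirect causation):

```latex
% Let tau be the two-sided rank, tau(B | A) = kappa(not-B | A) - kappa(B | A),
% measuring degree of belief in B given A. A is then a reason for B
% given circumstances C iff conditioning on A raises B's status:
\tau(B \mid A \cap C) \;>\; \tau(B \mid \overline{A} \cap C)
% Schematically: A is a (direct) cause of B iff A and B obtain, A
% precedes B, and A is a reason for B given the obtaining circumstances
% (roughly, the rest of B's past).
```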
The paper proposes to amend structuralism in mathematics by saying what places in a structure, and thus mathematical objects, are. They are the objects of the canonical system realizing a categorical structure, where that canonical system is a minimal system in a specific essentialistic sense. It would thus be a basic ontological axiom that such a canonical system always exists. This way of conceiving mathematical objects is underscored by a defense of an essentialistic version of Leibniz's principle, according to which each object is uniquely characterized by its proper and possibly relational essence (where "proper" means: not referring to identity).
This paper compares the epistemological conception of Isaac Levi with mine. We agree in giving a constructive answer to the question of how belief and probability are related, without reducing one to the other. However, our constructions differ in at least nine more or less important ways, all discussed in the paper. In particular, the paper explains the similarities and differences between Shackle's functions of potential surprise, as used by Levi, and my ranking functions, in formal as well as in philosophical respects. The appendix explains how ranking and probability theory can be combined in the notion of a ranked probability measure (or probabilified ranking function).
The paper is based on ranking theory, a theory of degrees of disbelief (and hence of belief). On this basis, it explains enumerative induction, the confirmation of a law by its positive instances, which may indeed follow various schemes. It gives a ranking-theoretic explication of a possible law or a nomological hypothesis. It then proves that such schemes of enumerative induction uniquely correspond to mixtures of such nomological hypotheses. Thus, it shows that de Finetti's probabilistic representation theorems may be transformed into an account of confirmation of possible laws and that enumerative induction is equivalent to such an account. The paper concludes with some remarks about the apriority of lawfulness or the uniformity of nature.
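For comparison, the classical probabilistic theorem being transposed (a standard fact, stated here for 0-1 sequences; the ranking-theoretic analogue replaces the statistical hypotheses by nomological ones):

```latex
% de Finetti: if X_1, X_2, ... are exchangeable 0-1 random variables,
% there is a unique mixing measure mu on [0,1] such that, for any
% outcome sequence with k ones among the first n trials,
P(X_1 = x_1, \ldots, X_n = x_n) \;=\; \int_0^1 p^{\,k} (1-p)^{\,n-k}\, d\mu(p)
% i.e., exchangeable credences are mixtures of i.i.d. Bernoulli
% hypotheses. The paper's result is the deterministic counterpart:
% schemes of enumerative induction correspond to mixtures of
% nomological hypotheses.
```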
The paper pleads for compatibilism by distinguishing the first person's normative perspective and the observer's empirical perspective. In the normative perspective one's own actions are uncaused and free; in the empirical perspective they are caused and may be predetermined. Still, there is only one notion of causation, which is able to account for the relation between the causal conceptions within the two perspectives. The other main idea for explicating free will, namely explaining free actions or intentions as appropriately caused in a specified way, is acknowledged, but not discussed. The paper finally argues that the normative and the empirical perspective are on a par; neither is prior; even from within the empirical perspective the normative perspective is ineliminable.
"Five Questions on Formal Philosophy": Like the other authors in the volume, I was asked for my reflections on the character of (formal) philosophy by answering the following five questions: 1. Why were you initially drawn to formal methods? 2. What example(s) from your work illustrates the role formal methods can play in philosophy? 3. What is the proper role of philosophy in relation to other disciplines? 4. What do you consider the most neglected topics and/or contributions in (...) late 20th century philosophy? 5. What are the most important problems in philosophy and which are the prospects for progress? (shrink)
"From Humean Supervenience to Humean Projection": This paper attempts to develop a projectivistic understanding of chance or objective probability or partial determination. It does so by critically examining David Lewis's philosophy of probability and his defense of Humean Supervenience, building thereupon the constructive projectivistic alternative, which is basically a suitable reinterpretation of de Finetti's position. Any treatment of the topic must show how it extends, in perfect parallel, to natural necessity or deterministic laws or full determination. The paper indicates at the end how this demand can be met.
The characteristic difference between laws and accidental generalizations lies in our epistemic or inductive attitude towards them. This idea has taken various forms and has dominated the discussion about lawlikeness in recent decades. Hence, ranking theory, with its resources for formalizing defeasible reasoning or inductive schemes, seems ideally suited to explicate the idea in a formal way. This is what the paper attempts to do. It will thus turn out that a law is simply the deterministic analogue of a sequence of independent, identically distributed random variables. This entails that de Finetti's representation theorems can be directly transformed into an account of confirmation of laws thus conceived.
This paper arose from a comment on Tara Smith's talk at the Pittsburgh-Konstanz Colloquium in October 2002. It is now a self-contained text precisely about what the title indicates. It is a somewhat mixed bag, but a nice read.
In his influential paper "Epistemology Naturalized", Quine argues that Carnap's failure to define disposition predicates and his subsequent preference for reduction sentences naturally lead to an entirely naturalized epistemology. This conclusion is too hasty, I object. Applying the account of dispositional predicates developed in No. 26, I defend Carnap's aprioristic epistemology against Quine's attacks.
The paper attempts to rationalize cooperation in the one-shot prisoners' dilemma (PD). It starts by introducing (and preliminarily investigating) a new kind of equilibrium (differing from Aumann's correlated equilibria) according to which the players' actions may be correlated (sect. 2). In PD the Pareto-optimal among these equilibria is joint cooperation. Since these equilibria seem to contradict causal preconceptions, the paper continues with a standard analysis of the causal structure of decision situations (sect. 3). The analysis then rises to a reflexive point of view according to which the agent integrates his own present and future decision situations into the causal picture of his situation (sect. 4). This reflexive structure is first applied to the toxin puzzle and then to Newcomb's problem, showing a way to rationalize drinking the toxin and taking only one box without assuming causal mystery (sect. 5). The latter result is finally extended to a rationalization of cooperation in PD (sect. 6).
In this paper two theories of defeasible reasoning, Pollock's account and my theory of ranking functions, are compared on a strategic level, since a strictly formal comparison would have been infeasible. A brief summary of the accounts shows their basic difference: Pollock's is a strictly computational one, whereas ranking functions provide a regulative theory. Consequently, I argue that Pollock's theory is normatively defective, unable to provide a theoretical justification for its basic inference rules and thus an independent notion of admissible rules. Conversely, I explain how quite a number of achievements of Pollock's account can be adequately duplicated within ranking theory. The main purpose of the paper, though, is not to settle a dispute within formal epistemology, but rather to emphasize the importance of formal methods for the whole of epistemology.
The characteristic difference between laws and accidental generalizations lies in our epistemic or inductive attitude towards them. This idea has taken various forms and has dominated the discussion about lawlikeness in recent decades. Likewise, the issue about ceteris paribus conditions is essentially about how we epistemically deal with exceptions. Hence, ranking theory, with its resources for defeasible reasoning, seems ideally suited to explicate these points in a formal way. This is what the paper attempts to do. It will thus turn out that a law is simply the deterministic analogue of a sequence of independent, identically distributed random variables. This entails that de Finetti's representation theorems can be directly transformed into an account of confirmation of laws thus conceived.
Meets what? Ranking theory is, as far as I know, the only existing theory suited for underpinning Keith Lehrer's account of knowledge and justification. If this is true, it's high time to bring the two together. This is what I shall do in this paper. However, the result of defining Lehrer's primitive notions in terms of ranking theory will be disappointing: justified acceptance will, depending on the interpretation, either have an unintelligible structure or reduce to mere acceptance, and in the latter interpretation knowledge will reduce to true belief. Of course, this result will require a discussion of who should be disappointed. So, the plan of the paper is simple: In section 1 I shall briefly state what is required for underpinning Lehrer's account and why most of the familiar theories fail to do so. In section 2 I shall briefly motivate and introduce ranking theory. Basing Lehrer's account on it will be entirely straightforward. Section 3 proves the above-mentioned results. Section 4, finally, discusses the possible conclusions.
Modern theory of rationality has truly grown into a science of its own. Still, the general topic has remained a genuinely philosophical one. This essay gives a brief overview. Section 2 explains the fundamental scheme of all rationality assessments. With its help, a schematic order of the main questions concerning the theory of rationality can be given; the questions turn out to be quite unevenly addressed in the literature. Section 3 discusses the fundamental issue that the theory of rationality seems to be both a normative and an empirical theory. Section 4, finally, shows how the unity of the theory of rationality can nevertheless be maintained.
Putnam (1975) and Burge (1979) have made a convincing case that neither meanings nor beliefs are in the head. Most philosophers, it seems, have accepted their argument. Putnam explained that a subject.
This paper is the most complete presentation of my views on deterministic causation. It develops the deterministic theory in perfect parallel to my theory of probabilistic causation and thus unites the two aspects. It also argues that the theory presented is superior to all regularity and all counterfactual theories of causation.
The paper is essentially a short version of Spohn's "Strategic Rationality", emphasizing in particular how the ideas developed there may be used to shed new light on the iterated prisoner's dilemma (and on iterated Newcomb's problem).
First, ranking functions are argued to be superior to AGM belief revision theory in two crucial respects. Second, it is shown how ranking functions are uniquely reflected in iterated belief change. More precisely, conditions on threefold contractions are specified which suffice for representing contractions by a ranking function uniquely up to multiplication by a positive integer. Thus, an important advantage AGM theory seemed to have over ranking functions proves to be spurious.
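Stated schematically, the representation result reads as follows (the paper spells out the exact conditions on threefold contractions):

```latex
% If two ranking functions kappa and kappa' induce the same iterated
% contraction behavior, then for some positive integer n,
\kappa' \;=\; n \cdot \kappa
% so iterated contractions determine ranks uniquely up to the unit of
% measurement, i.e., on a ratio scale.
```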
The paper argues that the standard decision-theoretic account of strategies and their rationality or optimality is much too narrow; that strategies should rather condition future action on future decision situations (a point of view already developed in my Grundlagen der Entscheidungstheorie, sect. 4.4); that practical deliberation must therefore essentially rely on a relation of superiority and inferiority between possible future decision situations; that all this allows us to substantially broaden the theory of practical rationality; that a long list of points attended to in the literature can be subsumed under the broadened perspective (including a novel view on the iterated prisoner's dilemma and on iterated Newcomb's problem, which, however, is revised in Spohn (2003), "Dependency Equilibria and the Causal Structure of Decision and Game Situations"); and that the task of completing and systematizing this list indeed forms a fruitful research programme.
The paper proposes two principles of coherence (thus taking up work started in Spohn (1991) "A Reason for Explanation: Explanations Provide Stable Reasons"). The latter indeed serves as a weak, but precise explication of the notion of coherence as it is used in the current epistemological discussion. After discussing their epistemological setting, the paper considers four ways of establishing these principles. They may be inferred neither from enumerative induction, nor from the nature of propositions as objects of belief, nor in a Kantian way from self-consciousness. Rather, I propose a fairly rigorous way to infer them from an even more fundamental rationality principle of non-dogmatism and an elementary theory of perception.
The central claim of the paper is, roughly, that the fact that it looks to somebody as if p is a defeasibly a priori reason for assuming that p (and vice versa), for any person, even for the perceiver himself. As a preparation, it outlines a doxastic conception suitable to explicate this claim and explains how to analyse dispositions within this conception. Since an observable p has the disposition to look as if p, this analysis generalizes to the central claim which is then argued to be at the bottom of coherentism. Thus, the defense of the claim supports coherentism as opposed to foundationalism and at the same time provides an answer to skepticism about the external world.
where x stands for a visible object and y for a perceiving subject (the reference to a time may be neglected). I take "character" here in the sense of Kaplan (1977), as substantiated by Haas-Spohn (1995, and Chapter 14 in this book). The point of using Kaplan's framework is simple, but of utmost importance: It provides a scheme for clearly separating epistemological and metaphysical issues, for specifying how the two domains are related, and for connecting them to questions concerning meaning, where confusions are often only duplicated. All this is achieved by it better than by any alternative I know of.
The paper analyzes the meaning of color terms within the framework of Kaplan's character theory (which, when generalized to a treatment of hidden indexicality or dependence on the context world, can perfectly accommodate Kripke's notions of apriority and of (metaphysical) necessity). It explains this framework and why it might be fruitfully applied to color terms. Then it defends six theses: that (1) the predicate "is red" and (2) even the relation "appears red to" are hidden indexicals (i.e., have, as used in English, different extensions in different context worlds), that (3) the phenomenal, the comparative, and the epistemic reading of "appears red to" are not three different readings, but reflect the context world dependence of this term, that (4) the statement "x is red iff x would appear red to most English-speaking people under normal conditions" is a priori in English, but analytic only in one reading and not in another, and that these observations account well for the epistemology of color terms and allow us to be metaphysically conservative by claiming that our context world is presumably such that (5) the statement "x appears red to y iff x (appropriately) causes y to be in a certain (disjunctive) neural state N" is necessarily true and (6) the statement "x is red iff the reflectance spectrum of the surface of x is of some (disjunctive) kind R" is necessarily true as well.
When I talk about the objects of belief I do not mean, e.g., the sun to which my thought that the sun will rise tomorrow refers; I do not mean the objects we think about. I take objects rather in a general philosophical sense; they simply are the bearers of properties and the relata of relations. I am thus concerned with the objects that are related by the belief relation "a believes that p". In this scheme "a" represents a person or an epistemic subject; but I am not going to discuss what a person is. "p" or "that p" represents an object, namely the object of belief; and I am going to discuss what this is. In other words, I am interested in belief contents – to use a less neutral, narrower and equally unclear term.
This paper deals with Hans Reichenbach's common cause principle. It was propounded by him in (1956, ch. 19) and has been developed and widely applied by Wesley Salmon, e.g. in (1978) and (1984, ch. 8). Thus, it has become one of the focal points of the continuing discussion of causation. The paper addresses five questions. Section 1 asks: What does the principle say? And section 2 asks: What is its philosophical significance? The most important question, of course, is this: Is the principle true? To answer that question, however, one must first consider how one might argue about it at all. One can do so by way of examples, the subject of section 3, or more theoretically, which is the goal of section 4. Based on an explication of probabilistic causation proposed by me in (1980), (1983), and (1990), section 4 shows that a variant of the principle is provable within a classical framework. The question naturally arises whether the proved variant is adequate or too weak. This is pursued in section 5. My main conclusion will be that some version of Reichenbach's principle is provably true, and others may be. This may seem overly ambitious, but it is not. The paper does not make any progress on essential worries about the common cause principle arising in the quantum domain; it only establishes more rigorously what has been thought to be plausible at least within a classical framework.
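For reference, the principle in its standard probabilistic form (a textbook statement, not a quotation from the paper): if A and B are correlated and neither causes the other, there is a common cause C satisfying Reichenbach's four conditions, which jointly entail the correlation:

```latex
% The correlation to be explained:
P(A \cap B) \;>\; P(A)\,P(B)
% C screens A off from B, and so does its complement:
P(A \cap B \mid C) = P(A \mid C)\,P(B \mid C), \qquad
P(A \cap B \mid \overline{C}) = P(A \mid \overline{C})\,P(B \mid \overline{C})
% C is positively relevant to each effect:
P(A \mid C) > P(A \mid \overline{C}), \qquad P(B \mid C) > P(B \mid \overline{C})
```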
As the paper explains, it is crucial to epistemology in general and to the theory of causation in particular to investigate the properties of conditional independence as completely as possible. The paper summarizes the most important results concerning conditional independence with respect to two important representations of epistemic states, namely (strictly positive) probability measures and natural conditional (or disbelief or ranking) functions. It finally adds some new observations.
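The core properties in question are the (semi-)graphoid conditions; writing X ⊥ Y | Z for "X is conditionally independent of Y given Z" and juxtaposition for union of sets of variables, they are (a textbook formulation):

```latex
% Symmetry:
X \perp Y \mid Z \;\Rightarrow\; Y \perp X \mid Z
% Decomposition:
X \perp YW \mid Z \;\Rightarrow\; X \perp Y \mid Z
% Weak union:
X \perp YW \mid Z \;\Rightarrow\; X \perp Y \mid ZW
% Contraction:
X \perp Y \mid Z \;\wedge\; X \perp W \mid ZY \;\Rightarrow\; X \perp YW \mid Z
% Intersection (valid for strictly positive probability measures, and
% for ranking functions):
X \perp Y \mid ZW \;\wedge\; X \perp W \mid ZY \;\Rightarrow\; X \perp YW \mid Z
```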
And this paper is an attempt to say precisely how, thus addressing a philosophical problem which is commonly taken to be a serious one. It does so, however, in quite an idiosyncratic way. It is based on the account of inductive schemes I have given in (1988) and (1990a) and on the conception of causation I have presented in (1980), (1983), and (1990b), and it intends to fill one of the many gaps left by these papers. Still, I have tried to make this paper self-contained. Section 1 explains the philosophical question this paper is about; in more general terms, it asks what might be meant by objectifying epistemic states or features of them and to what extent epistemic states can be objectified. The next sections introduce the basis I rely on with formal precision and some explanation; section 2 deals with induction and section 3 with causation. Within these confines, section 4 attempts to give an explication of the relevant sense of objectification and section 5 investigates the extent to which various features of epistemic states are objectifiable. The two most salient results are, roughly, that the relation "A is a reason for B" cannot be objectified at all and that the relation "A is a cause of B" can be objectified only under substantial, though reasonable, restrictions. What has all of this to do with probability? A lot. The paper trades on a pervasive duality between probabilistic and deterministic epistemology: between a probabilistic representation of epistemic states together with a theory of probabilistic causation, and another representation of epistemic states which I call deterministic because it lends itself, in a perfectly parallel fashion, to a theory of deterministic causation. Here I explicitly deal only with the deterministic side, but the duality should pave the way for further conclusions concerning objective probabilities and statistical laws. This outlook is briefly expanded in the final section 6.
Probability theory, epistemically interpreted, provides an excellent, if not the best available, account of inductive reasoning. This is so because there are general and definite rules for the change of subjective probabilities through information or experience; induction and belief change are one and the same topic, after all. The most basic of these rules is simply to conditionalize with respect to the information received; and there are similar and more general rules. Hence, a fundamental reason for the epistemological success of probability theory is that there exists a well-behaved concept of conditional probability at all. Still, people have, and have reasons for, various concerns over probability theory. One of these is my starting point: Intuitively, we have the notion of plain belief; we believe propositions to be true (or to be false or neither). Probability theory, however, offers no formal counterpart to this notion. Believing A is not the same as having probability 1 for A, because probability 1 is incorrigible, but plain belief is clearly corrigible. And believing A is not the same as giving A a probability larger than some 1 − c, because believing A and believing B is usually taken to be equivalent to believing A & B. Thus, it seems that the formal representation of plain belief has to take a non-probabilistic route. Indeed, representing plain belief seems easy enough: simply represent an epistemic state by the set of all propositions believed true in it or, since I make the common assumption that plain belief is deductively closed, by the conjunction of all propositions believed true in it. But this does not yet provide a theory of induction, i.e. an answer to the question how epistemic states so represented are changed through information or experience. There is a convincing partial answer: if the new information is compatible with the old epistemic state, then the new epistemic state is simply represented by the conjunction of the new information and the old beliefs. This answer is partial because it does not cover the quite common case where the new information is incompatible with the old beliefs. It is, however, important to complete the answer and to cover this case, too; otherwise, we would not represent plain belief as corrigible. The crucial problem is that there is no good completion. When epistemic states are represented simply by the conjunction of all propositions believed true in them, the answer cannot be completed; and though there is a lot of fruitful work, no other representation of epistemic states has been proposed, as far as I know, which provides a complete solution to this problem. In this paper, I want to suggest such a solution. Elsewhere, I have more fully argued that this is the only solution, if certain plausible desiderata are to be satisfied. Here, in section 2, I will be content with formally defining and intuitively explaining my proposal. I will compare my proposal with probability theory in section 3. It will turn out that the theory I am proposing is structurally homomorphic to probability theory in important respects and that it is thus equally easily implementable, but moreover computationally simpler. Section 4 contains a very brief comparison with various kinds of logics, in particular conditional logic, with Shackle's functions of potential surprise and related theories, and with the Dempster-Shafer theory of belief functions.
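The threshold problem alluded to can be made vivid with a two-line calculation (my illustration, with an assumed threshold, not the paper's example):

```latex
% Take the belief threshold 1 - c = 0.9 and three independent
% propositions, each just passing it:
P(A) = P(B) = P(C) = 0.9 \;\Rightarrow\; P(A \cap B \cap C) = 0.9^3 = 0.729 < 0.9
% So A, B, and C are each believed while their conjunction is not,
% contradicting the closure of belief under conjunction; and raising
% the threshold to 1 removes corrigibility.
```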
It is natural and important to have a formal representation of plain belief, according to which propositions are held true, or held false, or neither. (In the paper this is called a deterministic representation of epistemic states.) And it is of great philosophical importance to have a dynamic account of plain belief. AGM belief revision theory seems to provide such an account, but it founders on the problem of iterated belief revision, since it can generally account only for one step of revision. The paper discusses and rejects two solutions within the confines of AGM theory. It then introduces ranking functions (as I prefer to call them now; in the paper they are still called ordinal conditional functions) as the proper (and, I find, still the best) solution of the problem, proves that conditional independence w.r.t. ranking functions satisfies the so-called graphoid axioms, and proposes general rules of belief change (in close analogy to Jeffrey's generalized probabilistic conditionalization) that encompass revision and contraction as conceived in AGM theory. Indeed, the parallel to probability theory is amazing. Probability theory can profit from ranking theory as well, since it is also plagued by the problem of iterated belief revision, even if probability measures are conceived as Popper measures (see No. 11). Finally, the theory is compared with predecessors, which are numerous and impressive, but somehow failed to explain the all-important conditional ranks in the appropriate way.
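As I understand the generalized rule in question, it has the following shape (a sketch; the paper's official formulation should be consulted):

```latex
% A -> n conditionalization, the ranking analogue of Jeffrey
% conditionalization: keep the ranks within A and within its
% complement fixed, but set the posterior rank of the complement to n:
\kappa_{A \to n}(w) \;=\;
\begin{cases}
\kappa(w \mid A) & \text{if } w \in A,\\[2pt]
\kappa(w \mid \overline{A}) + n & \text{if } w \notin A.
\end{cases}
% n = infinity yields strict conditionalization on A; suitable finite n
% cover AGM-style revision and contraction as special cases.
```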
The aim of the paper is to explicate the concept of causal independence between sets of factors and Reichenbach's screening-off relation in probabilistic terms, along the lines of Suppes' probabilistic theory of causality (1970). The probabilistic concept central to this task is that of conditional stochastic independence. The adequacy of the explication is supported by proving some theorems about the explicata which correspond to our intuitions about the explicanda.
The paper displays the similarity between the theory of probabilistic causation developed by Glymour et al. since 1983 and mine, developed since 1976: the core of both is that causal graphs are Bayesian nets. The similarity extends to the treatment of actions or interventions in the two theories. But there is also a crucial difference. Glymour et al. take causal dependencies as primitive and argue that they behave like Bayesian nets under wide circumstances. By contrast, I argue that the behavior of Bayesian nets is ultimately the defining characteristic of causal dependence.
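The shared core, in its textbook probabilistic form (the ranking-theoretic version, as I understand it, replaces P by κ and products of probabilities by sums of ranks):

```latex
% A directed acyclic graph G over variables X_1, ..., X_n is a
% Bayesian net for P iff P factorizes according to G, i.e., for every
% assignment (x_1, ..., x_n), with pa_i the induced assignment to the
% parents of X_i in G:
P(x_1, \ldots, x_n) \;=\; \prod_{i=1}^{n} P(x_i \mid pa_i)
% Equivalently, each variable is independent of its non-descendants
% given its parents; read causally: independent of its non-effects
% given its direct causes.
```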