Truth is one of the most debated topics in philosophy; Wolfgang Künne presents a comprehensive critical examination of all major theories, from Aristotle to the present day. He argues that it is possible to give a satisfactory 'modest' account of truth without invoking problematic notions like correspondence, fact, or meaning. The clarity of exposition and the wealth of examples will make Conceptions of Truth an invaluable and stimulating guide for advanced students and scholars.
This is one of a pair of discussion notes comparing some features of the account of causation in Wolfgang Spohn’s Laws of Belief with the “interventionist” account in James Woodward’s Making Things Happen. This note locates the core difference of the accounts in the fact that Woodward’s account follows an epistemological order, while Spohn’s follows a conceptual order. This unfolds in five further differences: type- versus token-level causation, reference to time, actual/counterfactual intervention versus epistemic/suppositional wiggling, a circular versus a circle-free conception of the circumstances of a direct causal relation, and absolute versus model-relative causation.
Magnetism in meta-semantics is the view that the meaning of our words is determined in part by their use and in part by the objective naturalness of candidate meanings. This hypothesis is commonly attributed to David Lewis, and has been put to philosophical work by Brian Weatherson, Ted Sider and others. I argue that there is no evidence that Lewis ever endorsed the view, and that his actual account of language reveals good reasons against it.
It is natural and important to have a formal representation of plain belief, according to which propositions are held true, or held false, or neither. (In the paper this is called a deterministic representation of epistemic states). And it is of great philosophical importance to have a dynamic account of plain belief. AGM belief revision theory seems to provide such an account, but it founders at the problem of iterated belief revision, since it can generally account only for one step of revision. The paper discusses and rejects two solutions within the confines of AGM theory. It then introduces ranking functions (as I prefer to call them now; in the paper they are still called ordinal conditional functions) as the proper (and, I find, still the best) solution of the problem, proves that conditional independence w.r.t. ranking functions satisfies the so-called graphoid axioms, and proposes general rules of belief change (in close analogy to Jeffrey's generalized probabilistic conditionalization) that encompass revision and contraction as conceived in AGM theory. Indeed, the parallel to probability theory is amazing. Probability theory can profit from ranking theory as well since it is also plagued by the problem of iterated belief revision even if probability measures are conceived as Popper measures (see No. 11). Finally, the theory is compared with predecessors which are numerous and impressive, but somehow failed to explain the all-important conditional ranks in the appropriate way.
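The basic machinery this abstract refers to — ranks of disbelief, belief as positive rank of the complement, conditional ranks, and a Jeffrey-style conditionalization rule — can be sketched in a few lines of code. The following is a minimal toy implementation; the class and method names and the numbers are illustrative, not from the paper.

```python
# Toy sketch of a (negative) ranking function over a finite set of worlds.
# Illustrative only; names are my own, not the paper's.

class RankingFunction:
    def __init__(self, world_ranks):
        # world_ranks maps each world to a non-negative integer rank of disbelief;
        # normalization requires at least one world of rank 0.
        assert min(world_ranks.values()) == 0
        self.kappa = dict(world_ranks)

    def rank(self, prop):
        # Rank of a proposition (a set of worlds) = minimum rank of its worlds.
        ranks = [r for w, r in self.kappa.items() if w in prop]
        return min(ranks) if ranks else float("inf")

    def believes(self, prop):
        # A proposition is believed iff its complement has positive rank.
        complement = {w for w in self.kappa if w not in prop}
        return self.rank(complement) > 0

    def conditional_rank(self, prop, given):
        # The all-important conditional rank: kappa(B | A) = kappa(A & B) - kappa(A).
        return self.rank(set(prop) & set(given)) - self.rank(given)

    def conditionalize(self, evidence, strength=1):
        # A -> n conditionalization (the analogue of Jeffrey conditionalization):
        # shift ranks so evidence gets rank 0 and its complement rank `strength`.
        k_e = self.rank(evidence)
        k_not_e = self.rank({w for w in self.kappa if w not in evidence})
        new_ranks = {w: (r - k_e if w in evidence else r - k_not_e + strength)
                     for w, r in self.kappa.items()}
        return RankingFunction(new_ranks)
```

For instance, with ranks `{"w1": 0, "w2": 1, "w3": 2}`, the proposition `{"w1", "w2"}` is believed (its complement has rank 2), and conditionalizing on `{"w3"}` reverses that: the revised function believes `{"w3"}`. Unlike one-shot AGM revision, the result is again a full ranking function, so the operation iterates without loss of information.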
This collection includes twenty original philosophical essays in honour of Wolfgang Spohn. The contributions mirror the scope of Wolfgang Spohn’s work. They address topics from epistemology (e.g., the theory of ranking functions, belief revision, and the nature of knowledge and belief), philosophy of science (e.g., causation, induction, and laws of nature), the philosophy of language (e.g., the theory of meaning and the semantics of counterfactuals), and the philosophy of mind (e.g., intentionality and free will), as well as problems of ontology, logic, the theory of practical rationality, and meta-philosophy. ― Contributors: Ansgar Beckermann, Wolfgang Benkewitz, Bernd Buldt, Ralf Busse, Christoph Fehige, Wolfgang Freitag, Gordian Haas, Volker Halbach, Franz Huber, Andreas Kemmerling, Manfred Kupffer, Hannes Leitgeb, Godehard Link, Arthur Merin, Thomas Müller, Julian Nida-Rümelin, Martine Nida-Rümelin, Hans Rott, Holger Sturm, Thomas Ede Zimmermann, Alexandra Zinke.
The paper builds on the basically Humean idea that A is a cause of B iff A and B both occur, A precedes B, and A raises the metaphysical or epistemic status of B given the obtaining circumstances. It argues that in pursuit of a theory of deterministic causation this ‘status raising’ is best explicated not in regularity or counterfactual terms, but in terms of ranking functions. On this basis, it constructs a rigorous theory of deterministic causation that successfully deals with cases of overdetermination and pre-emption. It finally indicates how the account's profound epistemic relativization induced by ranking theory can be undone. Sections: Introduction; Variables, propositions, time; Induction first; Causation; Redundant causation; Objectivization.
Conditionals somehow express conditional beliefs. However, conditional belief is a bi-propositional attitude that is generally not truth-evaluable, in contrast to unconditional belief. Therefore, this article opts for an expressivistic semantics for conditionals, grounds this semantics in the arguably most adequate account of conditional belief, that is, ranking theory, and dismisses probability theory for that purpose, because probabilities cannot represent belief. Various expressive options are then explained in terms of ranking theory, with the intention to set out a general interpretive scheme that is able to account for the most variegated usage of conditionals. The Ramsey test is only the first option. Relevance is another, familiar, but little understood item, which comes in several versions. This article adds a further family of expressive options, which is able to subsume also counterfactuals and causal conditionals, and indicates at the end how this family allows for partial recovery of truth conditions for conditionals.
The paper takes an expressivistic perspective, i.e., it takes conditionals of all sorts to primarily express conditional beliefs. Therefore it is based on what it takes to be the best account of conditional belief, namely ranking theory. It proposes not to start looking at the bewildering linguistic phenomenology, but first to systematically study the various options of expressing features of conditional belief. Those options by far transcend the Ramsey test and include relevancies of various kinds and in particular the so-called “circumstances are such that” reading, under which also all conditionals representing causal relations can be subsumed. In this way a unifying perspective on the many kinds of conditionals is offered. The final section explains the considerable extent to which truth conditions for conditionals, which may seem lost in the expressivistic or epistemic perspective, may be recovered.
I defend a general rule for updating beliefs that takes into account both the impact of new evidence and changes in the subject’s location. The rule combines standard conditioning with a shifting operation that moves the center of each doxastic possibility forward to the next point where information arrives. I show that well-known arguments for conditioning lead to this combination when centered information is taken into account. I also discuss how my proposal relates to other recent proposals, what results it delivers for puzzles like the Sleeping Beauty problem, and whether there are diachronic constraints on rational belief at all.
Concentrating upon applications that are most relevant to modern physics, this valuable book surveys variational principles and examines their relationship to dynamics and quantum theory. Stressing the history and theory of these mathematical concepts rather than the mechanics, the authors provide many insights into the development of quantum mechanics and present much hard-to-find material in a remarkably lucid, compact form. After summarizing the historical background from Pythagoras to Francis Bacon, Professors Yourgrau and Mandelstam cover Fermat's principle of least time, the principle of least action of Maupertuis, the development of this principle by Euler and Lagrange, and the equations of Lagrange and Hamilton. Equipped by this thorough preparation to treat variational principles in general, they proceed to derive Hamilton's principle, the Hamilton-Jacobi equation, and Hamilton's canonical equations. An investigation of electrodynamics in Hamiltonian form follows, together with a résumé of variational principles in classical dynamics. The authors then launch into an analysis of their most significant topics: the relation between variational principles and wave mechanics, and the principles of Feynman and Schwinger in quantum mechanics. Two concluding chapters extend the discussion to hydrodynamics and natural philosophy. Professional physicists, mathematicians, and advanced students with a strong mathematical background will find this stimulating volume invaluable reading. Extremely popular in its hardcover edition, this volume will find even wider appreciation in its first inexpensive paperbound edition.
Wolfgang Welsch examines global aestheticization phenomena, probes the relationship of aesthetics and ethics, and considers the broad relevance of aesthetics for contemporary thinking. He argues that modes of thought familiar from the aesthetic realm comprise fundamental paradigms for understanding today's reality. The implications for specific and everyday issues are demonstrated in studies of architecture, advertising, the Internet, and our perception of the life world. Surgically precise, innovative, and, above all, relevant, this book is an essential resource, providing the analysis of contemporary culture with philosophical bite. Aesthetics is to transcend itself, address the whole realm of aisthesis, and hence enable orientation in the contemporary condition. Undoing aesthetics means releasing it from old cultural binds and giving it new ties with the future.
"A Survey of Ranking Theory": The paper gives an up-to-date survey of ranking theory. It carefully explains the basics. It elaborates on the ranking theoretic explication of reasons and their balance. It explains the dynamics of belief statable in ranking terms and indicates how the ranks can thereby be measured. It suggests how the theory of Bayesian nets can be carried over to ranking theory. It indicates what it might mean to objectify ranks. It discusses the formal and the philosophical aspects of the tight relation and the complementarity of ranks and probabilities. It closes with comparative remarks on predecessors and other philosophical proposals as well as formal models developed in AI.
Ranking theory delivers an account of iterated contraction; each ranking function induces a specific iterated contraction behavior. The paper shows how to reconstruct a ranking function from its iterated contraction behavior uniquely up to a multiplicative constant, and thus how to measure ranks on a ratio scale. Thereby, it also shows how to completely axiomatize that behavior. The complete set of laws of iterated contraction it specifies amends the laws hitherto discussed in the literature.
The characteristic difference between laws and accidental generalizations lies in our epistemic or inductive attitude towards them. This idea has taken various forms and dominated the discussion about lawlikeness in the last decades. Likewise, the issue about ceteris paribus conditions is essentially about how we epistemically deal with exceptions. Hence, ranking theory with its resources of defeasible reasoning seems ideally suited to explicate these points in a formal way. This is what the paper attempts to do. Thus it will turn out that a law is simply the deterministic analogue of a sequence of independent, identically distributed random variables. This entails that de Finetti's representation theorems can be directly transformed into an account of confirmation of laws thus conceived.
This paper investigates different readings of plural and reciprocal sentences and how they can be derived from syntactic surface structures in a systematic way. The main thesis is that these readings result from different ways of inserting logical operators at the level of Logical Form. The basic operator considered here is a cumulative mapping from predicates that apply to singularities onto the corresponding predicates that apply to pluralities. Given a theory which allows for free insertion of such operators, it can then be shown that the lexical semantics of the reciprocal expressions each other/one another consists of exactly two components, namely an anaphoric variable and a non-identity statement. This receives further support from the observation that it is exactly these two components that can be focused by only; all that remains to be done is to correctly manipulate these components at the level of LF.
Probability theory, epistemically interpreted, provides an excellent, if not the best available account of inductive reasoning. This is so because there are general and definite rules for the change of subjective probabilities through information or experience; induction and belief change are one and the same topic, after all. The most basic of these rules is simply to conditionalize with respect to the information received; and there are similar and more general rules. Hence, a fundamental reason for the epistemological success of probability theory is that there at all exists a well-behaved concept of conditional probability. Still, people have, and have reasons for, various concerns over probability theory. One of these is my starting point: Intuitively, we have the notion of plain belief; we believe propositions to be true (or to be false or neither). Probability theory, however, offers no formal counterpart to this notion. Believing A is not the same as having probability 1 for A, because probability 1 is incorrigible; but plain belief is clearly corrigible. And believing A is not the same as giving A a probability larger than some 1 - c, because believing A and believing B is usually taken to be equivalent to believing A & B. Thus, it seems that the formal representation of plain belief has to take a non-probabilistic route. Indeed, representing plain belief seems easy enough: simply represent an epistemic state by the set of all propositions believed true in it or, since I make the common assumption that plain belief is deductively closed, by the conjunction of all propositions believed true in it. But this does not yet provide a theory of induction, i.e. an answer to the question how epistemic states so represented are changed through information or experience.
There is a convincing partial answer: if the new information is compatible with the old epistemic state, then the new epistemic state is simply represented by the conjunction of the new information and the old beliefs. This answer is partial because it does not cover the quite common case where the new information is incompatible with the old beliefs. It is, however, important to complete the answer and to cover this case, too; otherwise, we would not represent plain belief as corrigible. The crucial problem is that there is no good completion. When epistemic states are represented simply by the conjunction of all propositions believed true in them, the answer cannot be completed; and though there is a lot of fruitful work, no other representation of epistemic states has been proposed, as far as I know, which provides a complete solution to this problem. In this paper, I want to suggest such a solution. In , I have more fully argued that this is the only solution, if certain plausible desiderata are to be satisfied. Here, in section 2, I will be content with formally defining and intuitively explaining my proposal. I will compare my proposal with probability theory in section 3. It will turn out that the theory I am proposing is structurally homomorphic to probability theory in important respects and that it is thus equally easily implementable, but moreover computationally simpler. Section 4 contains a very brief comparison with various kinds of logics, in particular conditional logic, with Shackle's functions of potential surprise and related theories, and with the Dempster - Shafer theory of belief functions.
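The abstract's second objection to a probabilistic surrogate for plain belief — that "P(A) above a threshold 1 - c" conflicts with closure of belief under conjunction — can be seen with two lines of arithmetic. The numbers below are a toy example of mine, not from the paper.

```python
# Toy illustration (my numbers, not the paper's) of why "believing A" cannot be
# "P(A) >= 1 - c": belief is usually taken to be closed under conjunction,
# but high probability is not.
threshold = 0.95            # i.e. 1 - c with c = 0.05
p_a, p_b = 0.96, 0.96       # two propositions, each above the threshold
p_a_and_b = p_a * p_b       # = 0.9216, assuming independence

def believed(p):
    return p >= threshold

print(believed(p_a), believed(p_b), believed(p_a_and_b))  # True True False
```

Both conjuncts clear the threshold while the conjunction falls below it, so threshold-belief fails exactly where the abstract says it does; this is, of course, just the lottery-style point in miniature.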
When an agent undergoes fission, how should the beliefs of the fission results relate to the pre-fission beliefs? This question is important for the Everett interpretation of quantum mechanics, but it is of independent philosophical interest. Among other things, fission scenarios demonstrate that ‘self-locating’ information can affect the probability of uncentred propositions even if an agent has no essentially self-locating uncertainty. I present a general update rule for centred beliefs that gives sensible verdicts in cases of fission, without relying on controversial metaphysical or linguistic assumptions. The rule is supported by the same considerations that support standard conditioning in the traditional framework of uncentred propositions.
There seem to be two ways of supposing a proposition: supposing “indicatively” that Shakespeare didn’t write Hamlet, it is likely that someone else did; supposing “subjunctively” that Shakespeare hadn’t written Hamlet, it is likely that nobody would have written the play. Let P_A(B) be the probability of B on the subjunctive supposition that A. Is P_A(B) equal to the probability of the corresponding counterfactual, P(A □→ B)? I review recent triviality arguments against this hypothesis and argue that they do not succeed. On the other hand, I argue that even if we can equate P_A(B) with P(A □→ B), we still need an account of how subjunctive conditional probabilities are related to unconditional probabilities. The triviality arguments reveal that the connection is not as straightforward as one might have hoped.
The paper will show how one may rationalize one-boxing in Newcomb's problem and drinking the toxin in the Toxin puzzle within the confines of causal decision theory by ascending to so-called reflexive decision models which reflect how actions are caused by decision situations (beliefs, desires, and intentions) represented by ordinary unreflexive decision models.
One of the world's pre-eminent Max Weber scholars here presents a comprehensive analysis of Weber's ambiguous stance toward modernity considered from a normative, theoretical, and historical point of view. The book is in two parts. Part I scrutinises Weber's world view. On the basis of his thinking about the meaning and inter-relationships of science, politics, and ethics in the modern era, Weber is seen as the embodiment of a social scientist and political thinker who exposes himself to intellectual risks and existential tensions while resisting final solutions. It includes a masterly analysis of Weber's two famous speeches 'Science as a Vocation' and 'Politics as a Vocation'. Part II considers Weber's unfinished project on the sociology of religion. Piecing together planned and partially completed works on Islam and Western Christianity, Schluchter locates them in the history and theory of Weber's overall work. This reconstruction of Weber's work on religion emphasises its interplay between religion, economy, politics, and law.
The paper displays the similarity between the theory of probabilistic causation developed by Glymour et al. since 1983 and mine developed since 1976: the core of both is that causal graphs are Bayesian nets. The similarity extends to the treatment of actions or interventions in the two theories. But there is also a crucial difference. Glymour et al. take causal dependencies as primitive and argue them to behave like Bayesian nets under wide circumstances. By contrast, I argue the behavior of Bayesian nets to be ultimately the defining characteristic of causal dependence.
The aim of the paper is to explicate the concept of causal independence between sets of factors and Reichenbach's screening-off relation in probabilistic terms along the lines of Suppes' probabilistic theory of causality (1970). The probabilistic concept central to this task is that of conditional stochastic independence. The adequacy of the explication is supported by proving some theorems about the explicata which correspond to our intuitions about the explicanda.
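The central notion here, conditional stochastic independence, and the screening-off relation it explicates can be checked numerically on a toy common-cause structure. The distribution below is my own illustrative example, not from the paper: a common cause C with two effects A and B that are independent given C.

```python
from itertools import product

# Toy numerical check (my numbers, not the paper's) of screening off:
# a common cause C renders its effects A and B conditionally independent,
# although they are unconditionally correlated.
p_c = 0.5
p_a_given_c = {True: 0.9, False: 0.1}
p_b_given_c = {True: 0.8, False: 0.2}

# Build the joint distribution, with A and B independent given each value of C.
joint = {}
for a, b, c in product([True, False], repeat=3):
    pc = p_c if c else 1 - p_c
    pa = p_a_given_c[c] if a else 1 - p_a_given_c[c]
    pb = p_b_given_c[c] if b else 1 - p_b_given_c[c]
    joint[(a, b, c)] = pc * pa * pb

def prob(pred):
    # Probability of the set of worlds satisfying pred(a, b, c).
    return sum(p for world, p in joint.items() if pred(*world))

# Conditional stochastic independence: P(A & B | C) = P(A | C) * P(B | C).
p_ab_given_c = prob(lambda a, b, c: a and b and c) / prob(lambda a, b, c: c)
p_a_cond = prob(lambda a, b, c: a and c) / prob(lambda a, b, c: c)
p_b_cond = prob(lambda a, b, c: b and c) / prob(lambda a, b, c: c)
assert abs(p_ab_given_c - p_a_cond * p_b_cond) < 1e-9  # C screens off A from B

# Yet A and B are unconditionally dependent: P(A & B) = 0.37 > 0.25 = P(A) * P(B).
assert prob(lambda a, b, c: a and b) > prob(lambda a, b, c: a) * prob(lambda a, b, c: b)
```

Conditioning on the common cause makes the correlation between the effects vanish, which is precisely the intuition about the explicandum that the paper's theorems are meant to capture formally.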
The modalities come into the world by being projections or objectivizations of our epistemic constitution. Thus this paper is a statement of Humean projectivism. In fact, it goes beyond Simon Blackburn’s version. It is also designed as a comprehensive counter-program to David Lewis’ program of Humean supervenience. In detail, the paper explains: Already the basic fact that the world is a world of states of affairs is due to the nature of our epistemic states. Objects, which figure in states of affairs and which embody metaphysical modality, are constitutable by their essential properties and in fact constituted by us according to our ontological policies. What the facts are, to which the correspondence notion of truth refers, is determined by applying an epistemic or pragmatic notion of truth to the world. Causation is a specific objectivization of our conditional beliefs. Nomicity is a ‘habit of belief’, a specific way of generalizing epistemic attitudes. This covers the basic metaphysical and natural modalities. The paper attempts to convey that talking of projection or objectivization is not just imagery, but a constructively realizable program.
Counterpart theory is often advertised by its track record at solving metaphysical puzzles. Here I focus on puzzles of occasional identity, wherein distinct individuals at one world or time appear to be identical at another world or time. To solve these puzzles, the usual interpretation rules of counterpart theory must be extended beyond the simple language of quantified modal logic. I present a more comprehensive semantics that allows talking about specific times and worlds, that takes into account the multiplicity and sortal-dependence of counterpart relations, and that does not require names to denote actual or present individuals. In addition, the semantics I defend does not identify ordinary individuals with world-bound or time-bound stages and thereby avoids the most controversial aspect of counterpart theory. Humphrey’s counterpart at other worlds or times is none other than Humphrey himself.
The paper focuses on interpreting ceteris paribus conditions as normal conditions. After discussing six basic problems for the explication of normal conditions and seven interpretations that do not well solve those problems, I turn to what I call the epistemic account. According to it the normal is, roughly, the not unexpected. This is developed into a rigorous constructive account of normal conditions, which makes essential use of ranking theory and in particular allows one to explain the phenomenon of multiply exceptional conditions. Finally, this static account is extended to a schematic dynamic model of how we may learn about those normal and exceptional conditions.