Analogical arguments -- Philosophical theories -- Computational theories -- The articulation model -- Analogies in mathematics -- Similarity and patterns of generalization -- Analogy and epistemic values -- Analogy and symmetry -- A wider role for analogies.
By Parallel Reasoning is the first comprehensive philosophical examination of analogical reasoning in more than forty years, designed to formulate and justify standards for the critical evaluation of analogical arguments. It proposes a normative theory with special focus on the use of analogies in mathematics and science. In recent decades, research on analogy has been dominated by computational theories whose objective has been to model analogical reasoning as a psychological process. These theories have devoted little attention to normative questions. In this book Bartha proposes that a good analogical argument must articulate a clear relationship that is capable of generalization. This idea leads to a set of distinct models for the critical analysis of prominent forms of analogical argument. The same core principle makes it possible to relate analogical reasoning to norms and values of scientific practice. Reasoning by analogy is justified because it strikes an optimal balance between conservative values, such as simplicity and coherence, and progressive values, such as fruitfulness and theoretical unification. Analogical arguments are also justified by appeal to symmetry--like cases are to be treated alike. In elaborating the connection between analogy and these broad epistemic principles, By Parallel Reasoning offers a novel contribution to explaining how analogies can play an important role in the confirmation of scientific hypotheses.
Among recent objections to Pascal's Wager, two are especially compelling. The first is that decision theory, and specifically the requirement of maximizing expected utility, is incompatible with infinite utility values. The second is that even if infinite utility values are admitted, the argument of the Wager is invalid provided that we allow mixed strategies. Furthermore, Hájek has shown that reformulations of Pascal's Wager that address these criticisms inevitably lead to arguments that are philosophically unsatisfying and historically unfaithful. Both the objections and Hájek's philosophical worries disappear, however, if we represent our preferences using relative utilities rather than a one-place utility function. Relative utilities provide a conservative way to make sense of infinite value that preserves the familiar equation of rationality with the maximization of expected utility. They also provide a means of investigating a broader class of problems related to the Wager.
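The mixed-strategy objection can be illustrated with a toy expected-utility calculation. The 0.5 credence and the finite payoff of 1.0 below are our own illustrative assumptions, not Pascal's figures; the sketch only shows why infinite utility blocks discrimination among strategies.

```python
import math

# Toy payoff model for the Wager; the 0.5 credence and the finite payoff
# of 1.0 are illustrative assumptions, not Pascal's own figures.
p_god = 0.5
u_salvation = math.inf   # infinite reward for belief, if God exists
u_finite = 1.0           # stand-in for all worldly payoffs

def expected_utility(p_believe):
    """Expected utility of a mixed strategy that believes with
    probability p_believe."""
    # Any branch reached with positive probability that carries infinite
    # utility makes the whole expectation infinite.
    if p_believe * p_god > 0:
        return math.inf
    return u_finite

# Pure belief and a coin-flip both have infinite expected utility, so
# standard maximization cannot rank them: the mixed-strategy objection.
print(expected_utility(1.0))   # inf
print(expected_utility(0.5))   # inf
print(expected_utility(0.0))   # 1.0
```

Relative utilities, as the abstract indicates, are meant to restore the comparison that the one-place infinite utility function cannot make.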
The Pasadena game invented by Nover and Hájek raises a number of challenges for decision theory. The basic problem is how the game should be evaluated: it has no expectation and hence no well-defined value. Easwaran has shown that the Pasadena game does have a weak expectation, raising the possibility that we can eliminate the value gap by requiring agents to value gambles at their weak expectations. In this paper, I first prove a negative result: there are gambles like the Pasadena game that do not even have a weak expectation. Hence, problematic value gaps remain even if decision theory is extended to take weak expectations into account. There is a further challenge: the existence of a ‘value gap’ in the Pasadena game seems to make decision theory inapplicable in a number of cases where the right choice is obvious. The positive contribution of the paper is a theory of ‘relative utilities’, an extension of decision theory that lets us make comparative judgements even among gambles that have no well-defined value.
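The value gap can be seen directly from the game's payoff schedule: a fair coin is tossed until it lands heads, and a first head on toss n pays (-1)^(n-1) 2^n/n, with probability 2^(-n). The expectation series is conditionally convergent, so its value depends on the order of summation. A minimal numerical sketch:

```python
# Pasadena game: toss a fair coin until it lands heads; if the first head
# is on toss n, the payoff is (-1)**(n-1) * 2**n / n, with probability
# 2**-n.  Each outcome therefore contributes (-1)**(n-1) / n to the
# expectation, giving a conditionally convergent series.

def term(n):
    """Probability-weighted payoff of outcome n."""
    return (-1) ** (n - 1) / n

# Natural ordering: partial sums approach ln 2 (about 0.693).
natural = sum(term(n) for n in range(1, 100001))

# Rearranged ordering (two positive terms, then one negative): the same
# terms now approach (3/2) * ln 2 (about 1.040).  By Riemann's
# rearrangement theorem the value depends on the ordering, so the game
# has no well-defined expectation.
odd = iter(range(1, 10**7, 2))    # indices of positive terms
even = iter(range(2, 10**7, 2))   # indices of negative terms
rearranged = 0.0
for _ in range(30000):
    rearranged += term(next(odd)) + term(next(odd)) + term(next(even))

print(round(natural, 3))      # 0.693
print(round(rearranged, 3))   # 1.04
```

Since there is no privileged ordering of the outcomes, no one of these sums can claim to be *the* value of the game.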
De Finetti would claim that we can make sense of a draw in which each positive integer has equal probability of winning. This requires a uniform probability distribution over the natural numbers, violating countable additivity. Countable additivity thus appears not to be a fundamental constraint on subjective probability. It does, however, seem mandated by Dutch Book arguments similar to those that support the other axioms of the probability calculus as compulsory for subjective interpretations. These two lines of reasoning can be reconciled through a slight generalization of the Dutch Book framework. Countable additivity may indeed be abandoned for de Finetti's lottery, but this poses no serious threat to its adoption in most applications of subjective probability.

1 Introduction
2 The de Finetti lottery
3 Two objections to equiprobability
  3.1 The ‘No random mechanism’ argument
  3.2 The Dutch Book argument
4 Equiprobability and relative betting quotients
5 The re-labelling paradox
  5.1 The paradox
  5.2 Resolution: from symmetry to relative probability
6 Beyond the de Finetti lottery
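The tension can be sketched numerically: no countably additive uniform distribution over the tickets exists (each ticket would need probability 0, yet the probabilities must sum to 1), while relative comparisons between sets of tickets remain well behaved. The use of natural density below is our own finite proxy for relative betting quotients, not the paper's formal construction.

```python
def density_ratio(A, B, N):
    """Ratio of counts of A- and B-members among 1..N -- a finite proxy
    for a relative betting quotient of A versus B in the lottery."""
    a = sum(1 for n in range(1, N + 1) if A(n))
    b = sum(1 for n in range(1, N + 1) if B(n))
    return a / b

even = lambda n: n % 2 == 0
mult4 = lambda n: n % 4 == 0
anything = lambda n: True

# No countably additive uniform distribution exists: each ticket would
# need probability 0, yet the probabilities must sum to 1.  Relative
# quotients still make sense: an even winner is 'half as likely' as an
# arbitrary winner, and twice as likely as a multiple of 4.
print(density_ratio(even, anything, 10**6))  # 0.5
print(density_ratio(even, mult4, 10**6))     # 2.0
```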
We examine a distinctive kind of problem for decision theory, involving what we call discontinuity at infinity. Roughly, it arises when an infinite sequence of choices, each apparently sanctioned by plausible principles, converges to a ‘limit choice’ whose utility is much lower than the limit approached by the utilities of the choices in the sequence. We give examples of this phenomenon, focusing on Arntzenius et al.’s Satan’s apple, and give a general characterization of it. In these examples, repeated dominance reasoning (a paradigm of rationality) apparently gives rise to a situation closely analogous to having intransitive preferences (a paradigm of irrationality). Indeed, the agents in these examples are vulnerable to a money pump set-up despite having preferences that exhibit no obvious defect of rationality. We explore several putative solutions to such problems, particularly those that appeal to binding and to deliberative dynamics. We consider the prospects for these solutions, concluding that if they fail, the examples show that money pump arguments are invalid.
We provide a solution to the well-known “Shooting-Room” paradox, developed by John Leslie in connection with his Doomsday Argument. In the “Shooting-Room” paradox, the death of an individual is contingent upon an event that has a 1/36 chance of occurring, yet the relative frequency of death in the relevant population is 0.9. There are two intuitively plausible arguments, one concluding that the appropriate subjective probability of death is 1/36, the other that this probability is 0.9. How are these two values to be reconciled? We show that only the first argument is valid for a standard, countably additive probability distribution. However, both lines of reasoning are legitimate if probabilities are non-standard. The subjective probability of death rises from 1/36 to 0.9 by conditionalizing on an event that is not measurable, or whose probability is zero. Thus we can sometimes meaningfully ascribe conditional probabilities even when the event conditionalized upon is not of positive finite (or even infinitesimal) measure.
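Both figures in the paradox can be computed directly; the batch sizes below are the standard choice in presentations of the Shooting-Room, assumed here for illustration.

```python
# Leslie's Shooting-Room with the standard batch sizes 1, 9, 90, 900,
# ...: each new batch equals nine times everyone who has entered before,
# so if the game ends after round 1 the current batch is 90% of all
# entrants.  The game ends (and the room's occupants are shot) when
# double sixes are rolled, with probability 1/36 per round.

p = 1 / 36

def frac_dead(k):
    """Fraction of all entrants shot if the game ends in round k."""
    return 1.0 if k == 1 else 0.9

# A given entrant dies only if double sixes come up during their round:
prob_individual_dies = p

# Yet the expected fraction of entrants who die is about 0.9:
expected_fraction = sum((1 - p) ** (k - 1) * p * frac_dead(k)
                        for k in range(1, 2000))

print(round(prob_individual_dies, 4))   # 0.0278
print(round(expected_fraction, 3))      # 0.903
```

The computation shows how both numbers are correct answers to different questions; the paper's contribution is explaining how conditionalization can carry an agent from the first to the second.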
Some environmental ethicists and economists argue that attributing infinite value to the environment is a good way to represent an absolute obligation to protect it. Others argue against modelling the value of the environment in this way: the assignment of infinite value leads to immense technical and philosophical difficulties that undermine the environmentalist project. First, there is a problem of discrimination: saving a large region of habitat is better than saving a small region; yet if both outcomes have infinite value, then decision theory prescribes indifference. Second, there is a problem of swamping probabilities: an act with a small but positive probability of saving an endangered species appears to be on a par with an act that has a high probability of achieving this outcome, since both have infinite expected value. Our paper shows that a relative concept of infinite value can be meaningfully defined, and provides a good model for securing the priority of the natural environment while avoiding the failures noted by sceptics about infinite value. Our claim is not that the relative infinity utility model gets every detail correct, but rather that it provides a rigorous philosophical framework for thinking about decisions affecting the environment.
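The two failure modes, and how relative infinite value avoids them, can be sketched with pairs (a, b) read as a·INF + b and compared lexicographically. This is a simplified toy model of the idea, with made-up weights, not the paper's full construction.

```python
# 'Relative infinity' sketched as pairs (a, b) read as a*INF + b and
# compared lexicographically.  A simplified toy model with made-up
# weights, not the paper's full construction.

def expected(lottery):
    """Componentwise expectation over (probability, (a, b)) outcomes."""
    return (sum(p * u[0] for p, u in lottery),
            sum(p * u[1] for p, u in lottery))

# Discrimination restored: saving a large habitat (weight 10 on the
# infinite scale) beats saving a small one (weight 1).
save_large, save_small = (10, 0), (1, 0)
assert save_large > save_small

# Swamping avoided: a 0.9 chance of saving an endangered species beats a
# 0.01 chance, because expected infinite-scale weight tracks probability.
likely = expected([(0.9, (1, 0)), (0.1, (0, 0))])
longshot = expected([(0.01, (1, 0)), (0.99, (0, 0))])
print(likely > longshot)   # True
```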
The Principle of Indifference, which dictates that we ought to assign two outcomes equal probability in the absence of known reasons to do otherwise, is vulnerable to well-known objections. Nevertheless, the appeal of the principle, and of symmetry-based assignments of equal probability, persists. We show that, relative to a given class of symmetries satisfying certain properties, we are justified in calling certain outcomes equally probable, and more generally, in defining what we call relative probabilities. Relative probabilities are useful in providing a generalized approach to conditionalization. The technique is illustrated by application to simple examples.
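A toy version of symmetry-based equiprobability: model the relevant symmetries as functions on a finite outcome space, and assign equal probability to outcomes lying in the same orbit. This is our own schematic reconstruction for illustration, not the paper's formal apparatus.

```python
# Symmetries modelled as functions on a finite outcome space; outcomes
# in the same orbit of the symmetry class are assigned equal
# probability.  A schematic reconstruction, not the paper's apparatus.

outcomes = [0, 1, 2, 3, 4, 5]          # six faces of a die

def rotate(x):
    """A generating symmetry: cyclically relabel the faces."""
    return (x + 1) % 6

def orbit(x, symmetries):
    """All outcomes reachable from x by repeatedly applying symmetries."""
    seen, frontier = {x}, [x]
    while frontier:
        y = frontier.pop()
        for s in symmetries:
            z = s(y)
            if z not in seen:
                seen.add(z)
                frontier.append(z)
    return seen

# The cyclic symmetry relates all six faces, so each face gets 1/6.
orb = orbit(0, [rotate])
prob = {x: 1 / len(orb) for x in orb}
print(round(prob[3], 4))   # 0.1667
```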
Carter and Leslie (1996) have argued, using Bayes's theorem, that our being alive now supports the hypothesis of an early 'Doomsday'. Unlike some critics (Eckhardt 1997), we accept their argument in part: given that we exist, our existence now indeed favors 'Doom sooner' over 'Doom later'. The very fact of our existence, however, favors 'Doom later'. In simple cases, a hypothetical approach to the problem of 'old evidence' shows that these two effects cancel out: our existence now yields no information about the coming of Doom. More complex cases suggest a move from countably additive to non-standard probability measures.
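The cancellation described in the abstract can be verified in the simplest self-sampling form of the update. The priors, population totals, and birth rank below are illustrative assumptions.

```python
# Carter-Leslie update in its simplest self-sampling form.  The priors,
# population totals, and birth rank are illustrative assumptions.

N_soon, N_later = 1e11, 1e14     # total humans ever, on each hypothesis
prior_soon = prior_later = 0.5
# Given a total of N people, the likelihood of having any particular
# birth rank (compatible with both totals) is 1/N.
like_soon, like_later = 1 / N_soon, 1 / N_later

post_soon = (prior_soon * like_soon
             / (prior_soon * like_soon + prior_later * like_later))
print(round(post_soon, 4))   # 0.999 -- rank favors 'Doom sooner'

# The counterweight: that you exist at all is more probable the more
# people there ever are (likelihood proportional to N), which favors
# 'Doom later' by the inverse factor.  In simple cases the two effects
# cancel exactly:
combined = (prior_soon * N_soon * like_soon
            / (prior_soon * N_soon * like_soon
               + prior_later * N_later * like_later))
print(round(combined, 6))    # 0.5
```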
In his famous Wager, Blaise Pascal offers the reader an argument that it is rational to strive to believe in God. Philosophical debates about this classic argument have continued until our own times. This volume provides a comprehensive examination of Pascal's Wager, including its theological framework, its place in the history of philosophy, and its importance to contemporary decision theory. The volume starts with a valuable primer on infinity and decision theory for students and non-specialists. A sequence of chapters then examines topics including the Wager's underlying theology, its influence on later philosophical figures, and contemporary analyses of the Wager including Alan Hájek's challenge to its validity, the many gods objection, and the ethics of belief. The final five chapters explore various ways in which the Wager has inspired contemporary decision theory, including questions related to infinite utility, imprecise probabilities, and infinitesimals.
Confronted with the possibility of severe environmental harms, such as catastrophic climate change, some researchers have suggested that we should abandon the principle at the heart of standard decision theory—the injunction to maximize expected utility—and embrace a different one: the Precautionary Principle (PP). Arguably, the most sophisticated philosophical treatment of the Precautionary Principle is due to Steel. Steel interprets PP as a qualitative decision rule and appears to conclude that a quantitative decision-theoretic statement of PP is both impossible and unnecessary. In this article, we propose a decision-theoretic formulation of PP in terms of lexical utilities. We show that this lexical model is largely faithful to Steel’s approach, but also that it corrects three problems with Steel’s account and clarifies the relationship between PP and standard decision theory. Using a range of examples, we illustrate how the lexical model can be used to explore a variety of issues related to precautionary reasoning.
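A lexical-utility formulation can be sketched with pairs (catastrophe_scale, ordinary_value) compared lexicographically, acts being ranked by componentwise expected utility. The numbers below are our own toy assumptions, not Steel's or the article's examples.

```python
# Lexical-utility sketch of the Precautionary Principle: utilities are
# pairs (catastrophe_scale, ordinary_value), compared lexicographically,
# and acts are ranked by componentwise expected utility.  The numbers
# are toy assumptions.

def expected(act):
    """Componentwise expectation over (probability, (a, b)) outcomes."""
    return (sum(p * u[0] for p, u in act),
            sum(p * u[1] for p, u in act))

# 'risky' offers more ordinary value but a 1% chance of catastrophe
# (-1 on the lexically prior scale); 'precaution' forgoes some ordinary
# value but runs no catastrophe risk.
risky = [(0.99, (0, 100)), (0.01, (-1, -50))]
precaution = [(1.0, (0, 60))]

# Lexical comparison: any expected catastrophe risk outweighs ordinary
# value, so the precautionary act wins despite its lower second
# component.
print(expected(precaution) > expected(risky))   # True
```

On this rendering the rule stays within decision theory—expected utilities are still maximized—while giving catastrophe avoidance strict priority.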
How can self-locating propositions be integrated into normal patterns of belief revision? Puzzles such as Sleeping Beauty seem to show that such propositions lead to violation of ordinary principles for reasoning with subjective probability, such as Conditionalization and Reflection. I show that sophisticated forms of Conditionalization and Reflection are not only compatible with self-locating propositions, but also indispensable in understanding how they can function as evidence in Sleeping Beauty and similar cases.
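For readers unfamiliar with the case, the awakening structure of Sleeping Beauty can be simulated. Weighting each awakening equally is one standard modelling choice (the 'thirder' count), adopted here purely for illustration; the paper's own analysis concerns how Conditionalization and Reflection apply, not this simulation.

```python
import random

# Sleeping Beauty's protocol: heads -> one awakening (Monday); tails ->
# two awakenings (Monday and Tuesday), indistinguishable to her.
# Weighting each awakening equally is ONE standard modelling choice
# (the 'thirder' count), used here only to exhibit the structure.
random.seed(0)
awakenings = []
for _ in range(100000):
    coin = random.choice(['H', 'T'])
    awakenings += [coin] if coin == 'H' else [coin, coin]

# Fraction of awakenings that are heads-awakenings:
p_heads_per_awakening = awakenings.count('H') / len(awakenings)
print(round(p_heads_per_awakening, 3))   # close to 1/3
```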
What explains the unity of a Leibnizian substance, over and above the attributes contained in its complete individual notion? This is a common question in the literature on Leibniz's notion of substance. This article argues that the question admits no consistent answer within the Leibnizian system. First, I discuss the way in which Leibniz tried to answer it by 'rehabilitating' the substantial forms of the scholastics. I then seek to show that this led him to a composite conception of the nature of substance, a position similar in some respects to that of Duns Scotus. This composite conception, however, is untenable, given that Leibniz rejected Scotus's 'formal distinction'.