Taking Joyce’s (1998; 2009) recent argument(s) for probabilism as our point of departure, we propose a new way of grounding formal, synchronic, epistemic coherence requirements for (opinionated) full belief. Our approach yields principled alternatives to deductive consistency, sheds new light on the preface and lottery paradoxes, and reveals novel conceptual connections between alethic and evidential epistemic norms.
Many philosophers have argued that "degree of belief" or "credence" is a more fundamental state grounding belief. Many other philosophers have been skeptical about the notion of "degree of belief", and take belief to be the only meaningful notion in the vicinity. This paper shows that one can take belief to be fundamental, and ground a notion of "degree of belief" in the patterns of belief, assuming that an agent has a collection of beliefs that isn't dominated by some other collection in terms of the overall balance of truth and falsity that it could contain.
Many philosophers have become worried about the use of standard real numbers for the probability function that represents an agent's credences. They point out that real numbers can't capture the distinction between certain extremely unlikely events and genuinely impossible ones—they are both represented by credence 0, which violates a principle known as “regularity.” Following Skyrms (1980) and Lewis (1980), they recommend that we should instead use a much richer set of numbers, called the “hyperreals.” This essay argues that this popular view is the result of two mistakes. The first mistake, which this essay calls the “numerical fallacy,” is to assume that a distinction that isn't represented by different numbers isn't represented at all in a mathematical representation. In this case, the essay claims that although the real numbers do not make all relevant distinctions, the full mathematical structure of a probability function does. The second mistake is that the hyperreals make too many distinctions. They have a much more complex structure than credences in ordinary propositions can have, so they make distinctions that don't exist among credences. While they might be useful for generating certain mathematical models, they will not appear in a faithful mathematical representation of credences of ordinary propositions.
Expected accuracy arguments have been used by several authors (Leitgeb and Pettigrew, and Greaves and Wallace) to support the diachronic principle of conditionalization, in updates where there are only finitely many possible propositions to learn. I show that these arguments can be extended to infinite cases, giving an argument not just for conditionalization but also for principles known as ‘conglomerability’ and ‘reflection’. This shows that the expected accuracy approach is stronger than has been realized. I also argue that we should be careful to distinguish diachronic update principles from related synchronic principles for conditional probability.
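A minimal finite-case sketch of the kind of expected-accuracy argument at issue, under assumptions of my own choosing (the four-world prior, the two-cell partition, and the Brier score are illustrative, not taken from the paper): among all ways of responding to the cells of a finite partition, conditionalization minimizes expected Brier inaccuracy.

```python
# Sketch: conditionalization minimizes expected Brier inaccuracy among
# all responses to a finite partition. Prior and partition are hypothetical.
import random

worlds = [0, 1, 2, 3]
prior = [0.4, 0.3, 0.2, 0.1]       # hypothetical prior over four worlds
partition = [(0, 1), (2, 3)]       # the agent will learn which cell is true

def brier(cred, actual):
    # Brier inaccuracy of a credence assignment over worlds, at a world.
    return sum((cred[w] - (1.0 if w == actual else 0.0)) ** 2 for w in worlds)

def expected_inaccuracy(rule):
    # rule maps each partition cell to the credences adopted upon learning it.
    return sum(prior[w] * brier(rule[cell], w)
               for cell in partition for w in cell)

def conditionalize(cell):
    p_cell = sum(prior[w] for w in cell)
    return [prior[w] / p_cell if w in cell else 0.0 for w in worlds]

cond_rule = {cell: conditionalize(cell) for cell in partition}
best = expected_inaccuracy(cond_rule)

# No randomly chosen alternative response to the partition does better.
random.seed(0)
for _ in range(10000):
    alt = {}
    for cell in partition:
        weights = [random.random() for _ in worlds]
        total = sum(weights)
        alt[cell] = [x / total for x in weights]
    assert expected_inaccuracy(alt) >= best - 1e-12

print("expected Brier inaccuracy of conditionalizing:", best)
```

The assertion never fails because the Brier score is strictly proper; that structural fact is what the finite-case arguments exploit, and the paper's question is how far it extends to infinite cases.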
To the extent that we have reasons to avoid these “bad B-properties”, these arguments provide reasons not to have an incoherent credence function b — and perhaps even reasons to have a coherent one. But note that these two traditional arguments for probabilism involve what might be called “pragmatic” reasons (not) to be (in)coherent. In the case of the Dutch Book argument, the “bad” property is pragmatically bad (to the extent that one values money). But it is not clear whether the DBA pinpoints any epistemic defect of incoherent agents. The same can be said for Representation Theorem arguments, since they involve the structure of an agent’s preferences.
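A toy rendering of the Dutch Book argument's “bad” property, with hypothetical numbers not drawn from the paper: an agent whose credences in A and not-A sum to less than 1 prices a pair of bets so that accepting both guarantees a loss.

```python
# Toy Dutch Book: incoherent credences b(A) + b(not-A) < 1 yield a sure loss.
b_A, b_notA = 0.4, 0.4   # hypothetical incoherent credences

for A_is_true in (True, False):
    # The agent sells a $1 bet on A for b_A and a $1 bet on not-A for b_notA,
    # each priced as "fair" by her own credences.
    income = b_A + b_notA
    payout = (1.0 if A_is_true else 0.0) + (0.0 if A_is_true else 1.0)
    net = income - payout
    print(f"A={A_is_true}: agent's net = {net:+.2f}")
# Both cases print -0.20: a guaranteed loss, whatever the truth about A.
```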
We introduce a family of rules for adjusting one's credences in response to learning the credences of others. These rules have a number of desirable features. 1. They yield the posterior credences that would result from updating by standard Bayesian conditionalization on one's peers' reported credences if one's likelihood function takes a particular simple form. 2. In the simplest form, they are symmetric among the agents in the group. 3. They map neatly onto the familiar Condorcet voting results. 4. They preserve shared agreement about independence in a wide range of cases. 5. They commute with conditionalization and with multiple peer updates. Importantly, these rules have a surprising property that we call synergy: peer testimony of credences can provide mutually supporting evidence, raising an individual's credence above any peer's initial report. At first, this may seem to be a strike against them. We argue, however, that synergy is actually a desirable feature, and that the failure of other updating rules to yield synergy is a strike against them.
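A sketch of one simple rule of the kind the abstract describes, under the assumption that peers' reports are treated as independent evidence; the pool-by-multiplying-odds form and the numbers are my illustration, not necessarily the paper's official rule.

```python
# Sketch of multiplicative odds pooling: treat each peer's report as
# independent evidence by multiplying odds, then convert back to a credence.
def odds(p):
    return p / (1 - p)

def pool(credences):
    """Pool independent peers' credences by multiplying their odds."""
    o = 1.0
    for p in credences:
        o *= odds(p)
    return o / (1 + o)

peers = [0.8, 0.8]
print(pool(peers))  # ~0.941, higher than either peer's 0.8
```

Here two peers who each report credence 0.8 pool to roughly 0.941, exhibiting exactly the synergy property discussed above: each report supplies evidence the other lacked.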
Bayesianism is a collection of positions in several related fields, centered on the interpretation of probability as something like degree of belief, as contrasted with relative frequency or objective chance. However, Bayesianism is far from a unified movement. Bayesians are divided about the nature of the probability functions they discuss; about the normative force of this probability function for ordinary and scientific reasoning and decision making; and about what relation (if any) holds between Bayesian and non-Bayesian concepts.
Naive versions of decision theory take probabilities and utilities as primitive and use expected value to give norms on rational decision. However, standard decision theory takes rational preference as primitive and uses it to construct probability and utility. This paper shows how to justify a version of the naive theory, by taking dominance as the most basic normatively required preference relation, and then extending it by various conditions under which agents should be indifferent between acts. The resulting theory can make all the decisions of classical expected utility theory, plus more in cases where expected utilities are infinite or undefined. Although the theory requires assumptions about as strong as those of classical expected utility theory, versions of the theory can be developed with slightly weaker assumptions, without having to prove a new representation theorem for the weaker theory. This alternate foundation is particularly useful if probability is prior to preference, as suggested by the recent program to base probabilism on accuracy and alethic considerations rather than pragmatic ones.
In the first paper, I discussed the basic claims of Bayesianism (that degrees of belief are important, that they obey the axioms of probability theory, and that they are rationally updated by either standard or Jeffrey conditionalization) and the arguments that are often used to support them. In this paper, I will discuss some applications these ideas have had in confirmation theory, epistemology, and statistics, and criticisms of these applications.
In a series of papers, Don Fallis points out that although mathematicians are generally unwilling to accept merely probabilistic proofs, they do accept proofs that are incomplete, long and complicated, or partly carried out by computers. He argues that there are no epistemic grounds on which probabilistic proofs can be rejected while these other proofs are accepted. I defend the practice by presenting a property I call ‘transferability’, which probabilistic proofs lack and acceptable proofs have. I also consider what this says about the similarities between mathematics and, on the one hand, the natural sciences, and, on the other, philosophy.
Fine has shown that assigning any value to the Pasadena game is consistent with a certain standard set of axioms for decision theory. However, I suggest that it might be reasonable to believe that the value of an individual game is constrained by the long-run payout of repeated plays of the game. Although there is no value that repeated plays of the Pasadena game converge to in the standard strong sense, I show that there is a weaker sort of convergence it exhibits, and use this to define a notion of ‘weak expectation’ that can give values to the Pasadena game and many others, though not to all games that fail to have a strong expectation in the standard sense.
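A simulation sketch of the weak-convergence idea, with parameters of my own choosing: the Pasadena game pays (-1)^(n-1) · 2^n / n when the first head appears on flip n, so its expected-value series is the conditionally convergent sum of (-1)^(n-1)/n = ln 2, and long-run averages concentrate (slowly, given the heavy tails) around that value.

```python
# Monte Carlo sketch: batch averages of Pasadena plays cluster near ln 2,
# even though the game has no strong (absolutely convergent) expectation.
import math
import random

random.seed(1)

def pasadena_payoff():
    # n is geometric with parameter 1/2: flip until the first head.
    n = 1
    while random.random() < 0.5:
        n += 1
    return (-1) ** (n - 1) * 2 ** n / n

def batch_mean(plays):
    return sum(pasadena_payoff() for _ in range(plays)) / plays

# Any single batch mean is noisy, but the median over many batches gives a
# stable picture of where the averages concentrate.
means = sorted(batch_mean(10000) for _ in range(101))
print("median batch mean:", means[50], "  ln 2 =", math.log(2))
```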
Newcomb-like problems are classified by the payoff table of their act-state pairs and by the causal structure that gives rise to the act-state correlation. Decision theories are classified by the points of intervention whose causal role they take to be relevant to rationality. Some decision theories suggest an inherent conflict between different notions of rationality that are all relevant. Some issues with causal modeling raise problems for decision theories in the contexts where Newcomb problems arise.
I defend a causal reductionist account of the nature of rates of change like velocity and acceleration. This account identifies velocity with the past derivative of position and acceleration with the future derivative of velocity. Unlike most reductionist accounts, it can preserve the role of velocity as a cause of future positions and acceleration as the effect of current forces. I show that this is possible only if all the fundamental laws are expressed by differential equations of the same order. Consideration of the continuity of time explains why the differential equations are all second order. This explanation is not available on non-causal or non-reductionist accounts of rates of change. Finally, I argue that alleged counterexamples to the reductionist account involving physically impossible worlds are irrelevant to an analysis of the properties that play a causal role in the actual world. Contents: 1 Background; 2 Grounding; 3 Causation; 4 The Proposal; 5 Why No Third Derivatives?; 6 Why Any Derivatives?; 7 Counterexamples?
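A schematic rendering of the account's two one-sided derivatives, in my notation rather than the paper's: velocity is the derivative of position taken from the past, and acceleration is the derivative of velocity taken toward the future:

\[
v(t) = \lim_{h \to 0^+} \frac{x(t) - x(t-h)}{h}, \qquad
a(t) = \lim_{h \to 0^+} \frac{v(t+h) - v(t)}{h}.
\]

On this picture the state at \(t\) includes a past-directed velocity, which can then figure as a cause of the future-directed change that the laws dictate.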
It is sometimes alleged that arguments that probability functions should be countably additive show too much, and that they motivate uncountable additivity as well. I show this is false by giving two naturally motivated arguments for countable additivity that do not motivate uncountable additivity.
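For reference, the principle at issue, stated in standard notation (mine, not the paper's): countable additivity requires that for any countable family of pairwise disjoint propositions \(A_1, A_2, \dots\),

\[
P\Big(\bigcup_{n=1}^{\infty} A_n\Big) = \sum_{n=1}^{\infty} P(A_n).
\]

The uncountable analogue is untenable in familiar cases: a uniform distribution on \([0,1]\) assigns each singleton probability 0, yet the union of all the singletons has probability 1, so no sum of the point values can recover the total.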
To answer the question of whether mathematics needs new axioms, it seems necessary to say what role axioms actually play in mathematics. A first guess is that they are inherently obvious statements that are used to guarantee the truth of theorems proved from them. However, this may be neither possible nor necessary, and it doesn’t seem to fit the historical facts. Instead, I argue that the role of axioms is to systematize uncontroversial facts that mathematicians can accept from a wide variety of philosophical positions. Once the axioms are generally accepted, mathematicians can expend their energies on proving theorems instead of arguing philosophy. Given this account of the role of axioms, I give four criteria that axioms must meet in order to be accepted. Penelope Maddy has proposed a similar view in Naturalism in Mathematics, but she suggests that the philosophical questions bracketed by adopting the axioms can in fact be ignored forever. I contend that these philosophical arguments are in fact important, and should ideally be resolved at some point, but I concede that their resolution is unlikely to affect the ordinary practice of mathematics. However, they may have effects in the margins of mathematics, including with regard to the controversial “large cardinal axioms” Maddy would like to support.
The central aim of this paper is to argue that there is a meaningful sense in which a concept of rationality can apply to a city. The idea will be that a city is rational to the extent that the collective practices of its people enable diverse inhabitants to simultaneously live the kinds of life they are each trying to live. This has significant implications for the varieties of social practices that constitute being more or less rational. Some of these implications may be welcome to a theorist who wants to identify collective rationality with a notion of justice, while others are unwelcome. There are some significant challenges to this use of the concept of rationality, but I claim that these challenges at the city level have parallels at the individual level, and may thus help deepen our understanding of rationality at all levels.
The application of formal tools to questions related to epistemology is of course not at all new. However, there has been a surge of interest in the field now known as “formal epistemology” over the past decade, with two annual conference series and an annual summer school at Carnegie Mellon University, in addition to many one-off events devoted to the field. A glance at the programs of these series illustrates the wide-ranging set of topics that have been grouped under this name, ranging from rational choice theory and the foundations of statistics, to logics of knowledge and formal measures of coherence, with much more besides. In this paper I will ignore most of these topics, and just trace some parts of the history of two ideas about belief whose current interaction may lead to future progress. One is the idea of belief, disbelief, and suspension of judgment as a meaningful tripartite distinction.
Pascal’s Wager holds that one has pragmatic reason to believe in God, since that course of action has infinite expected utility. The mixed strategy objection holds that one could just as well follow a course of action that has infinite expected utility but is unlikely to end with one believing in God. Monton (2011, ‘Mixed strategies can’t evade Pascal’s Wager’, Analysis 71: 642–45) has argued that mixed strategies can’t evade Pascal’s Wager, while Robertson (2012, ‘Some mixed strategies can evade Pascal’s Wager: a reply to Monton’, Analysis 72: 295–98) has argued that Monton is mistaken. We show that Monton is correct, highlight the crucial assumptions that he relies on, and shed some light on the role of mixed strategies in decision theory.
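For reference, the simple calculation that generates the objection, in my notation: if cultivating belief has infinite expected utility and every alternative outcome has finite utility \(u\), then any mixed strategy that leads to belief with probability \(p > 0\) also has

\[
EU(\text{mixed}) = p \cdot \infty + (1-p) \cdot u = \infty,
\]

so expected utility alone seems unable to favor the pure strategy of believing over the mixture. The dispute between Monton and Robertson concerns whether this appearance survives closer scrutiny.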
Belief and credence are often characterized in three different ways—they ought to govern our actions, they ought to be governed by our evidence, and they ought to aim at the truth. If one of these roles is to be central, we need to explain why the others should be features of the same mental state rather than separate ones. If multiple roles are equally central, then this may cause problems for some traditional arguments about what belief and credence must be like. I read the history of formal and traditional epistemology through the lens of these functional roles, and suggest that considerations from one literature might have a role in the other. The similarities and differences between these literatures may suggest some more general ideas about the nature of epistemology in abstraction from the details of credence and belief in particular.
There is general agreement in mathematics about what continuity is. In this paper we examine how well the mathematical definition lines up with common sense notions. We use a recent paper by Hud Hudson as a point of departure. Hudson argues that two objects moving continuously can coincide for all but the last moment of their histories and yet be separated in space at the end of this last moment. It turns out that Hudson’s construction does not deliver mathematically continuous motion, but the natural question then is whether there is any merit in the alternative definition of continuity that he implicitly invokes.
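For comparison, the standard mathematical definition the opening sentence alludes to, in my notation: a motion \(x : [0,T] \to \mathbb{R}^3\) is continuous at \(t\) just in case

\[
\forall \varepsilon > 0\ \exists \delta > 0\ \forall s:\ |s - t| < \delta \implies |x(s) - x(t)| < \varepsilon.
\]

The paper's claim is that Hudson's construction fails this condition, and the open question is whether his implicit alternative has independent merit.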
Bill D'Alessandro talks to Kenny Easwaran about fractal music, Zoom conferences, being a good referee, teaching in math and philosophy, the rationalist community and its relationship to academia, decision-theoretic pluralism, and the city of Manhattan, Kansas.
As is clear from the other articles in this volume, logic has applications in a broad range of areas of philosophy. If logic is taken to include the mathematical disciplines of set theory, model theory, proof theory, and recursion theory (as well as first-order logic, second-order logic, and modal logic), then the only other area of mathematics with such wide-ranging applications in philosophy is probability theory.
Probability and logic are two branches of mathematics that have important philosophical applications. This article discusses several areas of intersection between them. Several concern the role of probability in giving semantics for logic, or the role of logic in governing assignments of probability. Others concern probability over non-classical logics or for self-referential sentences.
In my () I argued that a central component of mathematical practice is that published proofs must be “transferable” — that is, they must be such that the author's reasons for believing the conclusion are shared directly with the reader, rather than requiring the reader to essentially rely on testimony. The goal of this paper is to explain this requirement of transferability in terms of a more general norm on defeat in mathematical reasoning that I will call “convertibility”. I begin by discussing two types of epistemic defeat: “rebutting” and “undercutting”. I give examples of both of these kinds of defeat from the history of mathematics. I then argue that an important requirement in mathematics is that published proofs be detailed enough to allow the conversion of rebutting defeat into undercutting defeat. Finally, I show how this sort of convertibility explains the requirement of transferability, and contributes to the way mathematics develops by the pattern referred to by Lakatos () as “lemma incorporation”.
Arguments for probabilism aim to undergird or motivate a synchronic probabilistic coherence norm for partial beliefs. Standard arguments for probabilism all have the form: an agent S has a non-probabilistic partial belief function b if and only if S has some “bad” property B, in virtue of the fact that b has a certain kind of formal property F. These arguments rest on a theorem and a converse theorem: b is non-probabilistic if and only if b has formal property F.
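Schematically, in my notation: the theorem and converse theorem jointly establish

\[
b \text{ is non-probabilistic} \;\Longleftrightarrow\; b \text{ has formal property } F,
\]

and the bridge premise that having \(F\) amounts to the “bad” property \(B\) is what converts this mathematical fact into a normative argument.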