This article discusses how the concept of a fair finite lottery can best be extended to denumerably infinite lotteries. Techniques and ideas from non-standard analysis are brought to bear on the problem.
Non-Archimedean probability functions allow us to combine regularity with perfect additivity. We discuss the philosophical motivation for a particular choice of axioms for a non-Archimedean probability theory and answer some philosophical objections that have been raised against infinitesimal probabilities in general.
1 Introduction
2 The Limits of Classical Probability Theory
2.1 Classical probability functions
2.2 Limitations
2.3 Infinitesimals to the rescue?
3 NAP Theory
3.1 First four axioms of NAP
3.2 Continuity and conditional probability
3.3 The final axiom of NAP
3.4 Infinite sums
3.5 Definition of NAP functions via infinite sums
3.6 Relation to numerosity theory
4 Objections and Replies
4.1 Cantor and the Archimedean property
4.2 Ticket missing from an infinite lottery
4.3 Williamson’s infinite sequence of coin tosses
4.4 Point sets on a circle
4.5 Easwaran and Pruss
5 Dividends
5.1 Measure and utility
5.2 Regularity and uniformity
5.3 Credence and chance
5.4 Conditional probability
6 General Considerations
6.1 Non-uniqueness
6.2 Invariance
Appendix
Hartry Field distinguished two concepts of type‐free truth: scientific truth and disquotational truth. We argue that scientific type‐free truth cannot do justificatory work in the foundations of mathematics. We also present an argument, based on Crispin Wright's theory of cognitive projects and entitlement, that disquotational truth can do justificatory work in the foundations of mathematics. The price to pay for this is that the concept of disquotational truth requires non‐classical logical treatment.
We investigate axiomatizations of Kripke's theory of truth based on the Strong Kleene evaluation scheme for treating sentences lacking a truth value. Feferman's axiomatization KF formulated in classical logic is an indirect approach, because it is not sound with respect to Kripke's semantics in the straightforward sense: only the sentences that can be proved to be true in KF are valid in Kripke's partial models. Reinhardt proposed to focus just on the sentences that can be proved to be true in KF and conjectured that the detour through classical logic in KF is dispensable. We refute Reinhardt's Conjecture, and provide a direct axiomatization PKF of Kripke's theory in partial logic. We argue that any natural axiomatization of Kripke's theory in Strong Kleene logic has the same proof-theoretic strength as PKF, namely the strength of the system RA<ω^ω of ramified analysis, or of a system of Tarskian ramified truth, up to ω^ω. Thus any such axiomatization is much weaker than Feferman's axiomatization KF in classical logic, which is equivalent to the system RA<ε₀ of ramified analysis up to ε₀.
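The proof-theoretic calibration reported in this abstract can be summarized in two lines (the display and the |·|-notation for proof-theoretic strength are mine; the systems and ordinals are from the abstract):

```latex
% Proof-theoretic strengths reported above:
|\mathrm{PKF}| \;=\; |\mathrm{RA}_{<\omega^\omega}| \qquad \text{(ramified analysis up to } \omega^\omega\text{)}
|\mathrm{KF}|  \;=\; |\mathrm{RA}_{<\varepsilon_0}| \qquad \text{(ramified analysis up to } \varepsilon_0\text{)}
```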
We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned probability zero (in other words: the probability functions are regular). We use a non-Archimedean field as the range of the probability function. As a result, the property of countable additivity in Kolmogorov’s axiomatization of probability is replaced by a different type of infinite additivity.
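As a rough illustration of the regularity property described above (the notation is mine, not the authors'): in a fair lottery over the natural numbers, a NAP function P takes values in a non-Archimedean field, so each ticket can receive one and the same nonzero infinitesimal probability:

```latex
% Regularity: only the impossible event gets probability zero
P(A) = 0 \;\Longleftrightarrow\; A = \emptyset
% Fair lottery on \mathbb{N}: each singleton receives the same
% infinitesimal probability \epsilon, positive but smaller than
% 1/k for every standard natural number k
P(\{n\}) = \epsilon, \qquad 0 < \epsilon < \tfrac{1}{k} \ \text{ for all } k \in \mathbb{N}
```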
Even though disquotationalism is not correct as it is usually formulated, a deep insight lies behind it. Specifically, it can be argued that, modulo implicit commitment to reflection principles, all there is to the notion of truth is given by a simple, natural collection of truth-biconditionals.
Psillos has recently argued that van Fraassen’s arguments against abduction fail. Moreover, he claimed that, if successful, these arguments would equally undermine van Fraassen’s own constructive empiricism, for, Psillos thinks, it is only by appeal to abduction that constructive empiricism can be saved from issuing in a bald scepticism. We show that Psillos’ criticisms are misguided, and that they are mostly based on misinterpretations of van Fraassen’s arguments. Furthermore, we argue that Psillos’ arguments for his claim that constructive empiricism itself needs abduction point up his failure to recognize the importance of van Fraassen’s broader epistemology for constructive empiricism. Towards the end of our paper we discuss the suspected relationship between constructive empiricism and scepticism in the light of this broader epistemology, and from a somewhat more general perspective.
This article gives an epistemological analysis of the reflection process by means of which you can come to know the consistency of a mathematical theory that you already accept. It is argued that this process can result in warranted belief in new mathematical principles without justifying them.
In this article, the prospects of deflationism about the concept of truth are investigated. A new version of deflationism, called inferential deflationism, is articulated and defended. It is argued that it avoids the pitfalls of earlier deflationist views such as Horwich’s minimalist theory of truth and Field’s version of deflationism.
The logician Kurt Gödel in 1951 established a disjunctive thesis about the scope and limits of mathematical knowledge: either the mathematical mind is equivalent to a Turing machine (i.e., a computer), or there are absolutely undecidable mathematical problems. In the second half of the twentieth century, attempts have been made to arrive at a stronger conclusion. In particular, arguments have been produced by the philosopher J.R. Lucas and by the physicist and mathematician Roger Penrose that intend to show that the mathematical mind is more powerful than any computer. These arguments, and counterarguments to them, have not convinced the logical and philosophical community. The reason for this is an insufficiency of rigour in the debate. The contributions in this volume move the debate forward by formulating rigorous frameworks and formally spelling out and evaluating arguments that bear on Gödel's disjunction in these frameworks. The contributions in this volume have been written by world leading experts in the field.
In this paper, a general perspective on criteria of identity of kinds of objects is developed. The question of the admissibility of impredicative or circular identity criteria is investigated in the light of the view that is articulated. It is argued that in and of itself impredicativity does not constitute sufficient grounds for rejecting a putative identity criterion. The view that is presented is applied to Davidson’s criterion of identity for events and to the structuralist criterion of identity of places in a structure.
Building on the seminal work of Kit Fine in the 1980s, Leon Horsten here develops a new theory of arbitrary entities. He connects this theory to issues and debates in metaphysics, logic, and contemporary philosophy of mathematics, investigating the relation between specific and arbitrary objects and between specific and arbitrary systems of objects. His book shows how this innovative theory is highly applicable to problems in the philosophy of arithmetic, and explores in particular how arbitrary objects can engage with the nineteenth-century concept of variable mathematical quantities, how they are relevant for debates around mathematical structuralism, and how they can help our understanding of the concept of random variables in statistics. This fully worked through theory will open up new avenues within philosophy of mathematics, bringing in the work of other philosophers such as Saul Kripke, and providing new insights into the development of the foundations of mathematics from the eighteenth century to the present day.
Justified belief is a core concept in epistemology, and there has been increasing interest in its logic in recent years. While many logical investigations consider justified belief as an operator, in this paper we propose a logic for justified belief in which the relevant notion is treated as a predicate instead. Although this gives rise to the possibility of liar-like paradoxes, a predicate treatment allows for a rich and highly expressive framework, which lives up to the universal ambitions of investigating epistemological concepts. We start with a base theory for justified belief, and then systematically present putative additional axioms for justified belief. We provide an overview of consistency results when the additional principles are added to the base theory, and discuss their philosophical plausibility.
If mathematics is regarded as a science, then the philosophy of mathematics can be regarded as a branch of the philosophy of science, next to disciplines such as the philosophy of physics and the philosophy of biology. However, because of its subject matter, the philosophy of mathematics occupies a special place in the philosophy of science. Whereas the natural sciences investigate entities that are located in space and time, it is not at all obvious that this is also the case with respect to the objects that are studied in mathematics. In addition to that, the methods of investigation of mathematics differ markedly from the methods of investigation in the natural sciences. Whereas the latter acquire general knowledge using inductive methods, mathematical knowledge appears to be acquired in a different way, namely, by deduction from basic principles. The status of mathematical knowledge also appears to differ from the status of knowledge in the natural sciences. The theories of the natural sciences appear to be less certain and more open to revision than mathematical theories. For these reasons mathematics poses problems of a quite distinctive kind for philosophy. Therefore philosophers have accorded special attention to ontological and epistemological questions concerning mathematics.
We formulate and explore two basic axiomatic systems of type-free subjective probability. One of them explicates a notion of finitely additive probability. The other explicates a concept of infinitely additive probability. It is argued that the first of these systems is a suitable background theory for formally investigating controversial principles about type-free subjective probability.
Williamson has forcefully argued that Fitch's argument shows that the domain of the unknowable is non-empty. And he exhorts us to make more inroads into the land of the unknowable. Concluding his discussion of Fitch's argument, he writes: "Once we acknowledge that [the domain of the unknowable] is non-empty, we can explore more effectively its extent. … We are only beginning to understand the deeper limits of our knowledge." I shall formulate and evaluate a new argument concerning the domain of the unknowable. It is an argument about knowability. More specifically, it is an argument about what we can know about the natural numbers. Since the domain of discourse will be the natural numbers structure, the notion of knowability can for the purposes of the argument be identified with a priori knowability or – which amounts to the same thing – absolute provability. Suppose, for a reductio, that there exists a property θ of natural numbers such that it is provable that for some natural number n, θ is true but unprovable. …
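The supposition with which the abstract breaks off can be displayed schematically as follows, with Prov for absolute provability and corner quotes for coding (a reconstruction of the abstract's wording, not a quotation from the paper):

```latex
% For reductio: some property \theta of natural numbers is provably
% instantiated by a number of which \theta is unprovable
\mathrm{Prov}\Big(\ulcorner \exists n \,\big(\theta(n) \wedge \neg\mathrm{Prov}(\ulcorner \theta(\dot{n}) \urcorner)\big) \urcorner\Big)
```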
This article is concerned with reflection principles in the context of Cantor’s conception of the set-theoretic universe. We argue that within such a conception reflection principles can be formulated that confer intrinsic plausibility to strong axioms of infinity.
According to structuralism in philosophy of mathematics, arithmetic is about a single structure. First-order theories are satisfied by models that do not instantiate this structure. Proponents of structuralism have put forward various accounts of how we succeed in fixing one single structure as the intended interpretation of our arithmetical language. We shall look at a proposal that involves Tennenbaum's theorem, which says that any model with addition and multiplication as recursive operations is isomorphic to the standard model of arithmetic. On this account, the intended models of arithmetic are the notation systems with recursive operations on them satisfying the Peano axioms. "[A]m Anfang […] ist das Zeichen" ("[In] the beginning […] is the sign").
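Tennenbaum's theorem, as invoked in the abstract, is standardly stated as follows (countability of the model is part of the standard formulation):

```latex
% Tennenbaum's theorem: a countable model of PA whose addition and
% multiplication are recursive is, up to isomorphism, the standard model
\mathcal{M} \models \mathrm{PA} \ \text{ countable},\quad \oplus_{\mathcal{M}},\, \otimes_{\mathcal{M}} \ \text{recursive}
\;\Longrightarrow\; \mathcal{M} \cong (\mathbb{N}, +, \times)
```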
We investigate how to assign probabilities to sentences that contain a type-free truth predicate. These probability values track how often a sentence is satisfied in transfinite revision sequences, following Gupta and Belnap’s revision theory of truth. This answers an open problem by Leitgeb which asks how one might describe transfinite stages of the revision sequence using such probability functions. We offer a general construction, and explore additional constraints that lead to desirable properties of the resulting probability function. One such property is Leitgeb’s Probabilistic Convention T, which says that the probability of φ equals the probability that φ is true.
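Leitgeb's Probabilistic Convention T, mentioned at the end of the abstract, can be written in one line (T is the type-free truth predicate; corner quotes denote coding):

```latex
% Probabilistic Convention T: the probability of \varphi equals
% the probability that \varphi is true
P(\varphi) \;=\; P\big(\mathrm{T}\ulcorner \varphi \urcorner\big)
```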
This article explores ways in which the Revision Theory of Truth can be expressed in the object language. In particular, we investigate the extent to which semantic deficiency, stable truth, and nearly stable truth can be so expressed, and we study different axiomatic systems for the Revision Theory of Truth.
On the one hand, the concept of truth is a major research subject in analytic philosophy. On the other hand, mathematical logicians have developed sophisticated logical theories of truth and the paradoxes. Recent developments in logical theories of the semantical paradoxes are highly relevant for philosophical research on the notion of truth. And conversely, philosophical guidance is necessary for the development of logical theories of truth and the paradoxes. From this perspective, this volume intends to reflect and promote deeper interaction and collaboration between philosophers and logicians investigating the concept of truth than has existed so far. Aside from an extended introductory overview of recent work in the theory of truth, the volume consists of articles by leading philosophers and logicians on subjects and debates that are situated on the interface between logical and philosophical theories of truth. The volume is intended for graduate students in philosophy and in logic who want an introduction to contemporary research in this area, as well as for professional philosophers and logicians.
We relate Popper functions to regular and perfectly additive non-Archimedean probability functions by means of a representation theorem: every such non-Archimedean probability function is infinitesimally close to some Popper function, and vice versa. We also show that regular and perfectly additive non-Archimedean probability functions can be given a lexicographic representation. Thus Popper functions, a specific kind of non-Archimedean probability functions, and lexicographic probability functions triangulate to the same place: they are in a good sense interchangeable.
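A schematic way to read the representation theorem (my notation; ≈ abbreviates "differs by at most an infinitesimal"):

```latex
% Each regular, perfectly additive non-Archimedean probability
% function P is matched by a Popper function C, and vice versa,
% with conditional probabilities agreeing up to an infinitesimal.
% Regularity guarantees P(B) > 0 for non-empty B, so the ratio is defined:
C(A \mid B) \;\approx\; \frac{P(A \cap B)}{P(B)} \qquad \text{whenever } B \neq \emptyset
```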
Earman (1993) distinguishes three notions of empirical indistinguishability and offers a rigorous framework to investigate how each of these notions relates to the problem of underdetermination of theory choice. He uses some of the results obtained in this framework to argue for a version of scientific anti-realism. In the present paper we first criticize Earman's arguments for that position. Secondly, we propose and motivate a modification of Earman's framework and establish several results concerning some of the notions of indistinguishability in this modified framework. Finally, we interpret these results in the light of the realism/anti-realism debate.
In this article, we reflect on the use of formal methods in the philosophy of science. These are taken to comprise not just methods from logic broadly conceived, but also from other formal disciplines such as probability theory, game theory, and graph theory. We explain how formal modelling in the philosophy of science can shed light on difficult problems in this domain.
We investigate and classify the notion of final derivability of two basic inconsistency-adaptive logics. Specifically, the maximal complexity of the set of final consequences of decidable sets of premises formulated in the language of propositional logic is described. Our results show that taking the consequences of a decidable propositional theory is a complicated operation. The set of final consequences according to either the Reliability Calculus or the Minimal Abnormality Calculus of a decidable propositional premise set is in general undecidable, and can be Π¹₁-complete. These classifications are exact. For first-order theories, even finite sets of premises can generate such consequence sets in either calculus.
This paper presents a defense of Epistemic Arithmetic as used for a formalization of intuitionistic arithmetic and of certain informal mathematical principles. First, objections by Allen Hazen and Craig Smorynski against Epistemic Arithmetic are discussed and found wanting. Second, positive support is given for the research program by showing that Epistemic Arithmetic can give interesting formulations of Church's Thesis.
Criteria of identity should mirror the identity relation in being reflexive, symmetrical, and transitive. However, this logical requirement is only rarely met by the criteria that we are most inclined to propose as candidates. The present paper addresses the question of how such obvious candidates are best approximated by means of relations that have all of the aforementioned features, i.e., which are equivalence relations. This question divides into two more basic questions. First, what is to be considered a ‘best’ approximation? And second, how can these best approximations be found? In answering these questions, we both rely on and constructively criticize ground-breaking work done by Timothy Williamson. Guiding ideas of our approach are that we allow approximations by means of overlapping equivalence relations, and that closeness of approximation is measured in terms of the number of mistakes made by the approximation when compared to the obvious candidate criterion.
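One natural way to make "number of mistakes" precise, in the spirit of the abstract (the definition below is an illustration in my notation, not necessarily the paper's official measure): count the pairs on which the approximating equivalence relation E disagrees with the candidate criterion R.

```latex
% Closeness of an equivalence relation E to a candidate relation R
% on a domain D, measured by counting disagreements between them:
d(E, R) \;=\; \big|\{\, (x,y) \in D \times D \;:\; E(x,y) \not\leftrightarrow R(x,y) \,\}\big|
```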
There are two perspectives from which formal theories can be viewed. On the one hand, one can take a theory to be about some privileged models. On the other hand, one can take all models of a theory to be on a par. In contrast with what is usually done in philosophical debates, we adopt the latter viewpoint. Suppose that from this perspective we want to add an adequate truth predicate to a background theory. Then on the one hand the truth theory ought to be semantically conservative over the background theory. At the same time, it is generally recognised that the central function of a truth predicate is an expressive one. A truth predicate ought to allow us to express propositions that we could not express before. In this article we argue that there are indeed natural truth theories which satisfy both the demand of semantical conservativeness and the demand of adequately extending the expressive power of our language.
In this article ideas from Kit Fine’s theory of arbitrary objects are applied to questions regarding mathematical structuralism. I discuss how sui generis mathematical structures can be viewed as generic systems of mathematical objects, where mathematical objects are conceived of as arbitrary objects in Fine’s sense.
A series of unnoticeably small changes in an observable property may add up to a noticeable change. Crispin Wright has used this fact to prove that perceptual indiscriminability is a non-transitive relation. Delia Graff has recently argued that there is a 'tension' between Wright's assumptions. But Graff has misunderstood one of these, that 'phenomenal continua' are possible; and the other, that our powers of discrimination are finite, is sound. If the first assumption is properly understood, it is not in tension with but is actually implied by the second, given a plausible physical assumption.
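The structure of the non-transitivity claim can be displayed with ∼ for perceptual indiscriminability along a phenomenal continuum a₁, …, aₙ:

```latex
% Adjacent items are pairwise indiscriminable ...
a_1 \sim a_2,\quad a_2 \sim a_3,\quad \ldots,\quad a_{n-1} \sim a_n
% ... yet the endpoints are discriminable, so \sim is non-transitive
a_1 \not\sim a_n
```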
Proof-theoretic reflection principles have been discussed in proof theory ever since Gödel’s discovery of the incompleteness theorems. But these reflection principles have not received much attention in the philosophical community. The present chapter aims to survey some of the principal meta-mathematical results on the iteration of proof-theoretic reflection principles and investigate these results from a logico-philosophical perspective; we will concentrate on the epistemological significance of these technical results and on the epistemic notions involved in the proofs. In particular, we will focus on the notions of commitment to and acceptance of a theory. Special attention is given to the connection between proof-theoretic reflection and axiomatic truth theories. After distinguishing between different types of proof-theoretic reflection principles, we review some proof-theoretic results concerning extensions of formal theories by (iterated) reflection principles. As basis theories, we concentrate on standard arithmetical and elementary axiomatic truth theories. We then go on to explore the epistemological significance of these results. In this investigation, we aim to show that the epistemic notion of acceptance of (or commitment to) a theory plays a crucial role in the philosophical argumentation for reflection principles and their iteration.
The difficulties with formalizing the intensional notions necessity, knowability and omniscience, and rational belief are well-known. If these notions are formalized as predicates applying to (codes of) sentences, then from apparently weak and uncontroversial logical principles governing these notions, outright contradictions can be derived. Tense logic is one of the best understood and most extensively developed branches of intensional logic. In tense logic, the temporal notions future and past are formalized as sentential operators rather than as predicates. The question therefore arises whether the notions that are investigated in tense logic can be consistently formalized as predicates. In this paper it is shown that the answer to this question is negative. The logical treatment of the notions of future and past as predicates gives rise to paradoxes due to the specific interplay between both notions. For this reason, the tense paradoxes that will be presented are not identical to the paradoxes referred to above.
Jonathan Lowe has argued that a particular variation on C.I. Lewis' notion of strict implication avoids the paradoxes of strict implication. We show that Lowe's notion of implication does not achieve this aim, and offer a general argument to demonstrate that no other variation on Lewis' notion of constantly strict implication describes the logical behaviour of natural-language conditionals in a satisfactory way.
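The paradoxes of strict implication at issue are the familiar ones: a strict conditional is automatically true when its antecedent is impossible or its consequent necessary, which natural-language conditionals do not tolerate.

```latex
% Paradoxes of strict implication:
\neg\Diamond p \;\Rightarrow\; \Box(p \rightarrow q)   % impossible antecedent
\Box q \;\Rightarrow\; \Box(p \rightarrow q)           % necessary consequent
```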
Both Lowe and Tsai have presented their own versions of the theory that both indicative and subjunctive conditionals are strict conditionals. We critically discuss both versions and we find each version wanting.
An epistemic formalization of arithmetic is constructed in which certain non-trivial metatheoretical inferences about the system itself can be made. These inferences involve the notion of provability in principle, and cannot be made in any consistent extension of Stewart Shapiro's system of epistemic arithmetic. The system constructed in the paper can be given a modal-structural interpretation.
New epistemic principles are formulated in the language of Shapiro's system of Epistemic Arithmetic. It is argued that some plausibility can be attributed to these principles. The relations between these principles and variants of controversial constructivistic principles are investigated. Special attention is given to variants of the intuitionistic version of Church's thesis and to variants of Markov's principle.
Inspired by Kit Fine’s theory of arbitrary objects, we explore some ways in which the generic structure of the natural numbers can be presented. Following a suggestion of Saul Kripke’s, we discuss how basic facts and questions about this generic structure can be expressed in the framework of Carnapian quantified modal logic.
It is often alleged that Cantor’s views about how the set theoretic universe as a whole should be considered are fundamentally unclear. In this article we argue that Cantor’s views on this subject, at least up until around 1896, are relatively clear, coherent, and interesting. We then go on to argue that Cantor’s views about the set theoretic universe as a whole have implications for theology that have hitherto not been sufficiently recognised. However, the theological implications in question, at least as articulated here, would not have satisfied Cantor himself.
This paper outlines a framework for the abstract investigation of the concept of canonicity of names and of naming systems. Degrees of canonicity of names and of naming systems are distinguished. The structure of the degrees is investigated, and a notion of relative canonicity is defined. The notions of canonicity are formally expressed within a Carnapian system of second-order modal logic.
Halbach has argued that Tarski biconditionals are not ontologically conservative over classical logic, but his argument is undermined by the fact that he cannot include a theory of arithmetic, which functions as a theory of syntax. This article is an improvement on Halbach's argument. By adding the Tarski biconditionals to inclusive negative free logic and the universal closure of minimal arithmetic, which is by itself an ontologically neutral combination, one can prove that at least one thing exists. The result can then be strengthened to the conclusion that infinitely many things exist. Those things are not just all Gödel codes of sentences but rather all natural numbers. Against this background inclusive negative free logic collapses into noninclusive free logic, which collapses into classical logic. The consequences for ontological deflationism with respect to truth are discussed.
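The Tarski biconditionals in question are the instances of the disquotation scheme, one for each sentence φ of the language (T is the truth predicate; corner quotes denote coding):

```latex
% Disquotation scheme (Tarski biconditionals):
\mathrm{T}(\ulcorner \varphi \urcorner) \;\leftrightarrow\; \varphi
```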