Do numbers, sets, and so forth, exist? What do mathematical statements mean? Are they literally true or false, or do they lack truth values altogether? Addressing questions that have attracted lively debate in recent years, Stewart Shapiro contends that standard realist and antirealist accounts of mathematics are both problematic. As Benacerraf first noted, we are confronted with the following powerful dilemma. The desired continuity between mathematical and, say, scientific language suggests realism, but realism in this context suggests seemingly intractable epistemic problems. As a way out of this dilemma, Shapiro articulates a structuralist approach. On this view, the subject matter of arithmetic, for example, is not a fixed domain of numbers independent of each other, but rather is the natural number structure, the pattern common to any system of objects that has an initial object and successor relation satisfying the induction principle. Using this framework, realism in mathematics can be preserved without troublesome epistemic consequences. Shapiro concludes by showing how a structuralist approach can be applied to wider philosophical questions such as the nature of an "object" and the Quinean nature of ontological commitment. Clear, compelling, and tautly argued, Shapiro's work, noteworthy both in its attempt to develop a full-length structuralist approach to mathematics and to trace its emergence in the history of mathematics, will be of deep interest to both philosophers and mathematicians.
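The "natural number structure" described here is standardly pinned down by the Dedekind-Peano axioms. A textbook rendering (not a quotation from the book) of the conditions the abstract mentions, for an initial object 0 and successor function s:

```latex
\forall x\, \neg(s(x) = 0)
\qquad
\forall x\,\forall y\,\bigl(s(x) = s(y) \rightarrow x = y\bigr)
\qquad
\forall P\,\bigl[\bigl(P0 \land \forall x\,(Px \rightarrow P s(x))\bigr) \rightarrow \forall x\, Px\bigr]
```

Any system of objects satisfying these conditions exemplifies the structure, and Dedekind's categoricity theorem guarantees that all such systems are isomorphic, which is what lets the structuralist treat the shared structure itself as the subject matter of arithmetic.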
The central contention of this book is that second-order logic has a central role to play in laying the foundations of mathematics. In order to develop the argument fully, the author presents a detailed description of higher-order logic, including a comprehensive discussion of its semantics. He goes on to demonstrate the prevalence of second-order concepts in mathematics and the extent to which mathematical ideas can be formulated in higher-order logic. He also shows how first-order languages are often insufficient to codify many concepts in contemporary mathematics, and thus that both first- and higher-order logics are needed to fully reflect current work. Throughout, the emphasis is on discussing the associated philosophical and historical issues and the implications they have for foundational studies. For the most part, the author assumes little more than a familiarity with logic comparable to that provided in a beginning graduate course which includes the incompleteness of arithmetic and the Löwenheim-Skolem theorems. All those concerned with the foundations of mathematics will find this a thought-provoking discussion of some of the central issues in the field today.
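Induction illustrates the expressive gap the book turns on. A first-order language can state induction only as a schema, one instance for each formula of the language, whereas second-order logic states it as a single axiom quantifying over all properties; the second-order axiom, unlike the schema, yields categoricity. A standard side-by-side comparison (not drawn from the text):

```latex
% first-order: one axiom per formula \varphi
\bigl(\varphi(0) \land \forall x\,(\varphi(x) \rightarrow \varphi(s(x)))\bigr) \rightarrow \forall x\,\varphi(x)

% second-order: a single axiom
\forall P\,\bigl[\bigl(P0 \land \forall x\,(Px \rightarrow P s(x))\bigr) \rightarrow \forall x\,Px\bigr]
```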
Stewart Shapiro's ambition in Vagueness in Context is to develop a comprehensive account of the meaning, function, and logic of vague terms in an idealized version of a natural language like English. It is a commonplace that the extensions of vague terms vary according to their context: a person can be tall with respect to male accountants and not tall (even short) with respect to professional basketball players. The key feature of Shapiro's account is that the extensions of vague terms also vary in the course of conversations and that, in some cases, a competent speaker can go either way without sinning against the meaning of the words or the non-linguistic facts. As Shapiro sees it, vagueness is a linguistic phenomenon, due to the kinds of languages that humans speak; but vagueness is also due to the world we find ourselves in, as we try to communicate features of it to each other.
This paper discusses the neo-logicist approach to the foundations of mathematics by highlighting an issue that arises from looking at the Bad Company objection from an epistemological perspective. For the most part, our issue is independent of the details of any resolution of the Bad Company objection and, as we will show, it concerns other foundational approaches in the philosophy of mathematics. In the first two sections, we give a brief overview of the "Scottish" neo-logicist school, present a generic form of the Bad Company objection and introduce an epistemic issue connected to this general problem that will be the focus of the rest of the paper. In the third section, we present an alternative approach within philosophy of mathematics, a view that emerges from Hilbert's Grundlagen der Geometrie (1899, Leipzig: Teubner; Foundations of geometry (trans.: Townsend, E.). La Salle, Illinois: Open Court, 1959.). We will argue that Bad Company-style worries, and our concomitant epistemic issue, also affect this conception and other foundationalist approaches. In the following sections, we then offer various ways to address our epistemic concern, arguing, in the end, that none resolves the issue. The final section offers our own resolution which, however, runs against the foundationalist spirit of the Scottish neo-logicist program.
Some authors have claimed that ante rem structuralism has problems with structures that have indiscernible places. In response, I argue that there is no requirement that mathematical objects be individuated in a non-trivial way. Metaphysical principles and intuitions to the contrary do not stand up to ordinary mathematical practice, which presupposes an identity relation that, in a sense, cannot be defined. In complex analysis, the two square roots of –1 are indiscernible: anything true of one of them is true of the other. I suggest that i functions like a parameter in natural deduction systems. I gave an early version of this paper at a workshop on structuralism in mathematics and science, held in the Autumn of 2006, at Bristol University. Thanks to the organizers, particularly Hannes Leitgeb, James Ladyman, and Øystein Linnebo, to my commentator Richard Pettigrew, and to the audience there. The paper also benefited considerably from a preliminary session at the Arché Research Centre at the University of St Andrews. I am indebted to my colleagues Craige Roberts, for help with the linguistics literature, and Ben Caplan and Gabriel Uzquiano, for help with the metaphysics. Thanks also to Hannes Leitgeb and Jeffrey Ketland for reading an earlier version of the manuscript and making helpful suggestions. I also benefited from conversations with Richard Heck, John Mayberry, Kevin Scharp, and Jason Stanley.
This unique book by Stewart Shapiro looks at a range of philosophical issues and positions concerning mathematics in four comprehensive sections. Part I describes questions and issues about mathematics that have motivated philosophers since the beginning of intellectual history. Part II is an historical survey, discussing the role of mathematics in the thought of such philosophers as Plato, Aristotle, Kant, and Mill. Part III covers the three major positions held throughout the twentieth century: the idea that mathematics is logic (logicism), the view that the essence of mathematics is the rule-governed manipulation of characters (formalism), and a revisionist philosophy that focuses on the mental activity of mathematics (intuitionism). Finally, Part IV brings the reader up-to-date with a look at contemporary developments within the discipline. This sweeping introductory guide to the philosophy of mathematics makes these fascinating concepts accessible to those with little background in either mathematics or philosophy.
There is a parallel between the debate between Gottlob Frege and David Hilbert at the turn of the twentieth century and at least some aspects of the current controversy over whether category theory provides the proper framework for structuralism in the philosophy of mathematics. The main issue, I think, concerns the place and interpretation of meta-mathematics in an algebraic or structuralist approach to mathematics. Can meta-mathematics itself be understood in algebraic or structural terms? Or is it an exception to the slogan that mathematics is the science of structure?
The notion of potential infinity dominated in mathematical thinking about infinity from Aristotle until Cantor. The coherence and philosophical importance of the notion are defended. Particular attention is paid to the question of whether potential infinity is compatible with classical logic or requires a weaker logic, perhaps intuitionistic.
Mathematics and logic have been central topics of concern since the dawn of philosophy. Since logic is the study of correct reasoning, it is a fundamental branch of epistemology and a priority in any philosophical system. Philosophers have focused on mathematics as a case study for general philosophical issues and for its role in overall knowledge-gathering. Today, philosophy of mathematics and logic remain central disciplines in contemporary philosophy, as evidenced by the regular appearance of articles on these topics in the best mainstream philosophical journals; in fact, the last decade has seen an explosion of scholarly work in these areas. This volume covers these disciplines in a comprehensive and accessible manner, giving the reader an overview of the major problems, positions, and battle lines. The 26 contributed chapters are by established experts in the field, and their articles contain both exposition and criticism as well as substantial development of their own positions. The essays, which are substantially self-contained, serve both to introduce the reader to the subject and to engage in it at its frontiers. Certain major positions are represented by two chapters--one supportive and one critical. The Oxford Handbook of Philosophy of Mathematics and Logic is a ground-breaking reference like no other in its field. It is a central resource to those wishing to learn about the philosophy of mathematics and the philosophy of logic, or some aspect thereof, and to those who actively engage in the discipline, from advanced undergraduates to professional philosophers, mathematicians, and historians.
The purpose of this paper is to assess the prospects for a neo-logicist development of set theory based on a restriction of Frege's Basic Law V, which we call (RV): ∀P∀Q[Ext(P) = Ext(Q) ≡ ((BAD(P) & BAD(Q)) ∨ ∀x(Px ≡ Qx))]. BAD is taken as a primitive property of properties. We explore the features it must have for (RV) to sanction the various strong axioms of Zermelo–Fraenkel set theory. The primary interpretation is where ‘BAD’ is Dummett's ‘indefinitely extensible’. 1 Background: what and why? 2 Framework 3 GOOD candidates, indefinite extensibility 4 The framework of (RV) alone, or almost alone 5 The axioms 6 Brief closing.
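For comparison, Frege's unrestricted Basic Law V, of which (RV) is a restriction, identifies extensions exactly when the properties are coextensive:

```latex
\forall P\,\forall Q\,\bigl[\mathrm{Ext}(P) = \mathrm{Ext}(Q) \leftrightarrow \forall x\,(Px \leftrightarrow Qx)\bigr]
```

Basic Law V is inconsistent in the presence of full second-order comprehension (Russell's paradox), which is why a restriction that exempts the BAD properties is needed in the first place.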
This paper uses neo-Fregean-style abstraction principles to develop the integers from the natural numbers (assuming Hume’s principle), the rational numbers from the integers, and the real numbers from the rationals. The first two are first-order abstractions that treat pairs of numbers: (DIF) INT(a,b)=INT(c,d) ≡ (a+d)=(b+c). (QUOT) Q(m,n)=Q(p,q) ≡ (n=0 & q=0) ∨ (n≠0 & q≠0 & m⋅q=n⋅p). The development of the real numbers is an adaptation of the Dedekind program involving “cuts” of rational numbers. Let P be a property (of rational numbers) and r a rational number. Say that r is an upper bound of P, written P≤r, if for any rational number s, if Ps then either s<r or s=r. In other words, P≤r if r is greater than or equal to any rational number that P applies to. Consider the Cut Abstraction Principle: (CP) ∀P∀Q(C(P)=C(Q) ≡ ∀r(P≤r ≡ Q≤r)). In other words, the cut of P is identical to the cut of Q if and only if P and Q share all of their upper bounds. The axioms of second-order real analysis can be derived from (CP), just as the axioms of second-order Peano arithmetic can be derived from Hume’s principle. The paper raises some of the philosophical issues connected with the neo-Fregean program, using the above abstraction principles as case studies.
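The first-order principles (DIF) and (QUOT) can be read as equivalence relations on pairs of numbers, and checking whether two pairs determine the same abstract is simple arithmetic. A minimal executable sketch (the function names are mine, not the paper's):

```python
def same_int(a, b, c, d):
    """(DIF): INT(a,b) = INT(c,d) iff a + d = b + c."""
    return a + d == b + c

def same_quot(m, n, p, q):
    """(QUOT): Q(m,n) = Q(p,q) iff both second coordinates are zero,
    or both are nonzero and m*q = n*p."""
    return (n == 0 and q == 0) or (n != 0 and q != 0 and m * q == n * p)

# INT(2,5) and INT(7,10) determine the same integer (intuitively, -3)
assert same_int(2, 5, 7, 10)
# Q(1,2) and Q(3,6) determine the same rational (one half)
assert same_quot(1, 2, 3, 6)
# all pairs with zero second coordinate are identified with one another
assert same_quot(1, 0, 5, 0)
```

Note that (QUOT) deliberately lumps every pair with zero denominator into a single abstract, so "division by zero" yields one common object rather than a failure of reference.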
We develop a point-free construction of the classical one-dimensional continuum, with an interval structure based on mereology and either a weak set theory or logic of plural quantification. In some respects this realizes ideas going back to Aristotle, although, unlike Aristotle, we make free use of classical "actual infinity". Also, in contrast to intuitionistic, Bishop, and smooth infinitesimal analysis, we follow classical analysis in allowing partitioning of our "gunky line" into mutually exclusive and exhaustive disjoint parts, thereby demonstrating the independence of "indecomposability" from a non-punctiform conception. It is surprising that such simple axioms as ours already imply the Archimedean property and that they determine an isomorphism with the Dedekind-Cantor structure of R as a complete, separable, ordered field. We also present some simple topological models of our system, establishing consistency relative to classical analysis. Finally, after describing how to nominalize our theory, we close with comparisons with earlier efforts related to our own.
The purpose of this paper is to apply Crispin Wright’s criteria and various axes of objectivity to mathematics. I test the criteria and the objectivity of mathematics against each other. Along the way, various issues concerning general logic and epistemology are encountered.
The subject of this paper is the philosophical problem of accounting for the relationship between mathematics and non-mathematical reality. The first section, devoted to the importance of the problem, suggests that many of the reasons for engaging in philosophy at all make an account of the relationship between mathematics and reality a priority, not only in philosophy of mathematics and philosophy of science, but also in general epistemology/metaphysics. This is followed by a (rather brief) survey of the major, traditional philosophies of mathematics indicating how each is prepared to deal with the present problem. It is shown that (the standard formulations of) some views seem to deny outright that there is a relationship between mathematics and any non-mathematical reality; such philosophies are clearly unacceptable. Other views leave the relationship rather mysterious and, thus, are incomplete at best. The final, more speculative section provides the direction of a positive account. A structuralist philosophy of mathematics is outlined and it is proposed that mathematics applies to reality through the discovery of mathematical structures underlying the non-mathematical universe.
We examine George Boolos's proposed abstraction principle for extensions based on the limitation-of-size conception, New V, from several perspectives. Crispin Wright once suggested that New V could serve as part of a neo-logicist development of real analysis. We show that it fails both of the conservativeness criteria for abstraction principles that Wright proposes. Thus, we support Boolos against Wright. We also show that, when combined with the axioms for Boolos's iterative notion of set, New V yields a system equivalent to full Zermelo-Fraenkel set theory with a principle of global choice. This advances Boolos's longstanding interest in the foundations of set theory.
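New V itself, in its usual formulation (with "Big" holding of properties whose instances are in one-one correspondence with the whole universe), reads:

```latex
\forall P\,\forall Q\,\bigl[\mathrm{Ext}(P) = \mathrm{Ext}(Q) \leftrightarrow \bigl((\mathrm{Big}(P) \land \mathrm{Big}(Q)) \lor \forall x\,(Px \leftrightarrow Qx)\bigr)\bigr]
```

All Big properties thus receive one common extension, while the small properties receive extensions just as Basic Law V would dictate; this is the "limitation of size" idea made explicit.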
This chapter provides broad coverage of the notion of logical consequence, exploring its modal, semantic, and epistemic aspects. It develops the contrast between a proof-theoretic notion of consequence, in terms of deduction, and a model-theoretic approach, in terms of truth-conditions. The main purpose is to relate the formal, technical work in logic to the philosophical concepts that underlie reasoning.
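The two notions being contrasted have familiar textbook definitions (a standard gloss, not a quotation from the chapter):

```latex
\Gamma \vdash \varphi \;\;\text{iff there is a deduction of } \varphi \text{ from } \Gamma \text{ in the given proof system}

\Gamma \vDash \varphi \;\;\text{iff every model of all members of } \Gamma \text{ is also a model of } \varphi
```

For first-order logic, soundness and completeness show that the two notions coincide extensionally, but the modal, semantic, and epistemic glosses attach to them differently.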
There is an interesting logical/semantic issue with some mathematical languages and theories. In the language of (pure) complex analysis, the two square roots of –1 are indiscernible, so how does the singular term 'i' manage to pick out a unique object? This is perhaps the most prominent example of the phenomenon, but there are some others. The issue is related to matters concerning the use of definite descriptions and singular pronouns, such as donkey anaphora and the problem of indistinguishable participants. Taking a cue from some work in linguistics and the philosophy of language, I suggest that i functions like a parameter in natural deduction systems. This may require some rethinking of the role of singular terms, at least in mathematical languages.
In previous work, Hellman and Shapiro present a regions-based account of a one-dimensional continuum. This paper produces a more Aristotelian theory, eschewing the existence of points and the use of infinite sets or pluralities. We first show how to modify the original theory. There are a number of theorems that have to be added as axioms. Building on some work by Linnebo, we then show how to take the ‘potential’ nature of the usual operations seriously, by using a modal language, and we show that the two approaches are equivalent.
The neo-logicist argues that standard mathematics can be derived by purely logical means from abstraction principles—such as Hume's Principle—which are held to be 'epistemically innocent'. We show that the second-order axiom of comprehension applied to non-instantiated properties and the standard first-order existential instantiation and universal elimination principles are essential for the derivation of key results, specifically a theorem of infinity, but have not been shown to be epistemically innocent. We conclude that the epistemic innocence of mathematics has not been established by the neo-logicist.
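The comprehension principle at issue is the standard second-order schema (with X not free in φ), asserting a property for every formula, instantiated or not:

```latex
\exists X\,\forall x\,\bigl(Xx \leftrightarrow \varphi(x)\bigr)
```

Applied to an unsatisfiable φ, the schema delivers a non-instantiated property, and it is existential claims of this kind whose epistemic innocence, the abstract argues, the neo-logicist has not established.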
At the beginning of Die Grundlagen der Arithmetik (§2), Frege observes that “it is in the nature of mathematics to prefer proof, where proof is possible”. This, of course, is true, but thinkers differ on why it is that mathematicians prefer proof. And what of propositions for which no proof is possible? What of axioms? This talk explores various notions of self-evidence, and the role they play in various foundational systems, notably those of Frege and Zermelo. I argue that both programs are undermined at a crucial point, namely when self-evidence is supported by holistic and even pragmatic considerations.
A paper in this journal by Fraser MacBride, ‘Can Ante Rem Structuralism Solve the Access Problem?’, raises important issues concerning the epistemological goals and burdens of contemporary philosophy of mathematics, and perhaps philosophy of science and other disciplines as well. I use a response to MacBride's paper as a framework for developing a broadly holistic framework for these issues, and I attempt to steer a middle course between reductive foundationalism and extreme naturalistic quietism. For this purpose the notion of entitlement is invoked along the way, suitably modified for the present anti-foundationalist setting.
In light of the close connection between the ontological hierarchy of set theory and the ideological hierarchy of type theory, Øystein Linnebo and Agustín Rayo have recently offered an argument in favour of the view that the set-theoretic universe is open-ended. In this paper, we argue that, since the connection between the two hierarchies is indeed tight, any philosophical conclusions cut both ways. One should either hold that both the ontological hierarchy and the ideological hierarchy are open-ended, or that neither is. If there is reason to accept the view that the set-theoretic universe is open-ended, that will be because such a view is the most compelling one to adopt on the purely ontological front.
Logical pluralism is the view that different logics are equally appropriate, or equally correct. Logical relativism is a pluralism according to which validity and logical consequence are relative to something. Stewart Shapiro explores various such views. He argues that the question of meaning shift is itself context-sensitive and interest-relative.
The article is part of a symposium on Hartry Field’s “Saving truth from paradox”. The book is one of the most significant intellectual achievements of the past decades, but it is not clear what, exactly, it accomplishes. I explore some alternatives, relating the developed view to the intuitive, pre-theoretic notion of truth.
There are a number of regions-based accounts of space/time, due to Whitehead, Roeper, Menger, Tarski, the present authors, and others. They all follow the Aristotelian theme that continua are not composed of points: each region has a proper part. The purpose of this note is to show how to recapture ‘points’ in such frameworks via Scottish neo-logicist abstraction principles. The results recapitulate some Aristotelian themes. A second agenda is to provide a new arena to help decide what is at stake when adjudicating issues concerning the identity of neo-logicist abstracts — so-called ‘Caesar questions’.
The purpose of this paper is to present a thought experiment and argument that spells trouble for “radical” deflationism concerning meaning and truth such as that advocated by the staunch nominalist Hartry Field. The thought experiment does not sit well with any view that limits a truth predicate to sentences understood by a given speaker or to sentences in (or translatable into) a given language, unless that language is universal. The scenario in question concerns sentences that are not understood but are known to be logical consequences of known and understood sentences. Ultimately, the issue turns on the notion of logical consequence that is available to various versions of deflationism.
Sections 3.16 and 3.23 of Roger Penrose's Shadows of the mind (Oxford, Oxford University Press, 1994) contain a subtle and intriguing new argument against mechanism, the thesis that the human mind can be accurately modeled by a Turing machine. The argument, based on the incompleteness theorem, is designed to meet standard objections to the original Lucas-Penrose formulations. The new argument, however, seems to invoke an unrestricted truth predicate (and an unrestricted knowability predicate). If so, its premises are inconsistent. The usual ways of restricting the predicates either invalidate Penrose's reasoning or require presuppositions that the mechanist can reject.
According to ante rem structuralism a branch of mathematics, such as arithmetic, is about a structure, or structures, that exist independent of the mathematician, and independent of any systems that exemplify the structure. A structure is a universal of sorts: structure is to exemplified system as property is to object. So ante rem structuralism is a form of ante rem realism concerning universals. Since the appearance of my Philosophy of mathematics: Structure and ontology, a number of criticisms of the idea of ante rem structures have appeared. Some argue that it is impossible to give identity conditions for places in homogeneous ante rem structures, invoking a version of the identity of indiscernibles. Others raise issues concerning the identity and distinctness of places in different structures, such as the natural number 2 and the real number 2. The purpose of this paper is to take the measure of these objections, and to further articulate ante rem structuralism to take them into account.
Stewart Shapiro's aim in Vagueness in Context is to develop both a philosophical and a formal, model-theoretic account of the meaning, function, and logic of vague terms in an idealized version of a natural language like English. It is a commonplace that the extensions of vague terms vary with such contextual factors as the comparison class and paradigm cases. A person can be tall with respect to male accountants and not tall with respect to professional basketball players. The main feature of Shapiro's account is that the extensions of vague terms also vary in the course of a conversation, even after the external contextual features, such as the comparison class, are fixed. A central thesis is that in some cases, a competent speaker of the language can go either way in the borderline area of a vague predicate without sinning against the meaning of the words and the non-linguistic facts. Shapiro calls this open texture, borrowing the term from Friedrich Waismann. The formal model theory has a similar structure to the supervaluationist approach, employing the notion of a sharpening of a base interpretation. In line with the philosophical account, however, the notion of super-truth does not play a central role in the development of validity. The ultimate goal of the technical aspects of the work is to delimit a plausible notion of logical consequence, and to explore what happens with the sorites paradox. Later chapters deal with what passes for higher-order vagueness - vagueness in the notions of 'determinacy' and 'borderline' - and with vague singular terms, or objects. In each case, the philosophical picture is developed by extending and modifying the original account. This is followed with modifications to the model theory and the central meta-theorems. As Shapiro sees it, vagueness is a linguistic phenomenon, due to the kinds of languages that humans speak.
But vagueness is also due to the world we find ourselves in, as we try to communicate features of it to each other. Vagueness is also due to the kinds of beings we are. There is no need to blame the phenomenon on any one of those aspects.
It is a commonplace that the extensions of most, perhaps all, vague predicates vary with such features as comparison class and paradigm and contrasting cases. My view proposes another, more pervasive contextual parameter. Vague predicates exhibit what I call open texture: in some circumstances, competent speakers can go either way in the borderline region. The shifting extension and anti-extensions of vague predicates are tracked by what David Lewis calls the “conversational score”, and are regulated by what Kit Fine calls penumbral connections, including a principle of tolerance. As I see it, vague predicates are response-dependent, or, better, judgement-dependent, at least in their borderline regions. This raises questions concerning how one reasons with such predicates. In this paper, I present a model theory for vague predicates, so construed. It is based on an overall supervaluationist-style framework, and it invokes analogues of Kripke structures for intuitionistic logic. I argue that the system captures, or at least nicely models, how one ought to reason with the shifting extensions (and anti-extensions) of vague predicates, as borderline cases are called and retracted in the course of a conversation. The model theory is illustrated with a forced march sorites series, and also with a thought experiment in which vague predicates interact with so-called future contingents. I show how to define various connectives and quantifiers in the language of the system, and how to express various penumbral connections and the principle of tolerance. The project fits into one of the topics of this special issue. In the course of reasoning, even with the external context held fixed, it is uncertain what the future extension of the vague predicates will be. Yet we still manage to reason with them.
The system is based on that developed, more fully, in my Vagueness in Context, Oxford, Oxford University Press, 2006, but some criticisms and replies to critics are incorporated.
It is sometimes said that there are two, competing versions of W. V. O. Quine’s unrelenting empiricism, perhaps divided according to temporal periods of his career. According to one, logic is exempt from, or lies outside the scope of, the attack on the analytic-synthetic distinction. This logic-friendly Quine holds that logical truths and, presumably, logical inferences are analytic in the traditional sense. Logical truths are knowable a priori, and, importantly, they are incorrigible, and so immune from revision. The other, radical reading of Quine does not exempt logic from the attack on analyticity and a priority. Logical truths and inferences are themselves part of the web of belief, and the same global methodology applies to logic as to any other part of the web, such as theoretical chemistry or ordinary beliefs about ordinary objects. Everything, including logic, is up for grabs in our struggle for holistic confirmation. The purpose of this paper is to examine the law of non-contradiction, and the concomitant principle of ex falso quodlibet, from the perspective of the principles advocated by the radical Quine. I show that he has no compelling reason to accept either of these. To put it bluntly, neither the law of non-contradiction nor the rule of ex falso quodlibet is empirically confirmed, and these principles fare poorly on the various criteria for theory acceptance on the methodology of the radical Quine. So the radical Quine is led rather quickly and rather directly into something in the neighborhood of Graham Priest’s dialetheism.
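The two principles the paper examines, in standard notation (not the paper's own formulation):

```latex
\text{(LNC)}\quad \neg(\varphi \land \neg\varphi)
\qquad
\text{(EFQ)}\quad \varphi,\; \neg\varphi \vdash \psi
```

Rejecting EFQ while allowing that some contradictions are true is precisely the paraconsistent route associated with Priest's dialetheism.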