In light of the close connection between the ontological hierarchy of set theory and the ideological hierarchy of type theory, Øystein Linnebo and Agustín Rayo have recently offered an argument in favour of the view that the set-theoretic universe is open-ended. In this paper, we argue that, since the connection between the two hierarchies is indeed tight, any philosophical conclusions cut both ways. One should either hold that both the ontological hierarchy and the ideological hierarchy are open-ended, or that neither is. If there is reason to accept the view that the set-theoretic universe is open-ended, that will be because such a view is the most compelling one to adopt on the purely ontological front.
In this paper, I use the cases of intuitionistic arithmetic with Church’s thesis, intuitionistic analysis, and smooth infinitesimal analysis to argue for a sort of pluralism or relativism about logic. The thesis is that logic is relative to a structure. There are classical structures, intuitionistic structures, and (possibly) paraconsistent structures. Each such structure is a legitimate branch of mathematics, and there does not seem to be an interesting logic that is common to all of them. One main theme of my ante rem structuralism is that any coherent axiomatization describes a structure, or a class of structures. If one weakens the logic, then more axiomatizations become coherent.
We develop a point-free construction of the classical one-dimensional continuum, with an interval structure based on mereology and either a weak set theory or logic of plural quantification. In some respects this realizes ideas going back to Aristotle, although, unlike Aristotle, we make free use of classical "actual infinity". Also, in contrast to intuitionistic, Bishop, and smooth infinitesimal analysis, we follow classical analysis in allowing partitioning of our "gunky line" into mutually exclusive and exhaustive disjoint parts, thereby demonstrating the independence of "indecomposability" from a non-punctiform conception. It is surprising that such simple axioms as ours already imply the Archimedean property and that they determine an isomorphism with the Dedekind-Cantor structure of R as a complete, separable, ordered field. We also present some simple topological models of our system, establishing consistency relative to classical analysis. Finally, after describing how to nominalize our theory, we close with comparisons with earlier efforts related to our own.
On Richard’s When Truth Gives Out Content Type Journal Article Pages 1-9 DOI 10.1007/s11098-011-9796-0 Authors Kevin Scharp, Department of Philosophy, The Ohio State University, 350 University Hall, 230 North Oval Mall, Columbus, OH 43210, USA Stewart Shapiro, Department of Philosophy, The Ohio State University, 350 University Hall, 230 North Oval Mall, Columbus, OH 43210, USA Journal Philosophical Studies Online ISSN 1573-0883 Print ISSN 0031-8116.
A paper in this journal by Fraser MacBride, ‘Can Ante Rem Structuralism Solve the Access Problem?’, raises important issues concerning the epistemological goals and burdens of contemporary philosophy of mathematics, and perhaps philosophy of science and other disciplines as well. I use a response to MacBride's paper as a framework for developing a broadly holistic account of these issues, and I attempt to steer a middle course between reductive foundationalism and extreme naturalistic quietism. For this purpose the notion of entitlement is invoked along the way, suitably modified for the present anti-foundationalist setting.
The article is part of a symposium on Hartry Field’s “Saving truth from paradox”. The book is one of the most significant intellectual achievements of the past decades, but it is not clear what, exactly, it accomplishes. I explore some alternatives, relating the developed view to the intuitive, pre-theoretic notion of truth.
This paper discusses the neo-logicist approach to the foundations of mathematics by highlighting an issue that arises from looking at the Bad Company objection from an epistemological perspective. For the most part, our issue is independent of the details of any resolution of the Bad Company objection and, as we will show, it concerns other foundational approaches in the philosophy of mathematics. In the first two sections, we give a brief overview of the "Scottish" neo-logicist school, present a generic form of the Bad Company objection and introduce an epistemic issue connected to this general problem that will be the focus of the rest of the paper. In the third section, we present an alternative approach within philosophy of mathematics, a view that emerges from Hilbert's Grundlagen der Geometrie (1899, Leipzig: Teubner; Foundations of geometry (trans.: Townsend, E.). La Salle, Illinois: Open Court, 1959). We will argue that Bad Company-style worries, and our concomitant epistemic issue, also affect this conception and other foundationalist approaches. In the following sections, we then offer various ways to address our epistemic concern, arguing, in the end, that none resolves the issue. The final section offers our own resolution which, however, runs against the foundationalist spirit of the Scottish neo-logicist program.
Some central philosophical issues concern the use of mathematics in putatively non-mathematical endeavors. One such endeavor, of course, is philosophy, and the philosophy of mathematics is a key instance of that. The present article provides an idiosyncratic survey of the use of mathematical results to provide support or counter-support to various philosophical programs concerning the foundations of mathematics.
At the beginning of Die Grundlagen der Arithmetik (§2), Frege observes that “it is in the nature of mathematics to prefer proof, where proof is possible”. This, of course, is true, but thinkers differ on why it is that mathematicians prefer proof. And what of propositions for which no proof is possible? What of axioms? This talk explores various notions of self-evidence, and the role they play in various foundational systems, notably those of Frege and Zermelo. I argue that both programs are undermined at a crucial point, namely when self-evidence is supported by holistic and even pragmatic considerations.
Some authors have claimed that ante rem structuralism has problems with structures that have indiscernible places. In response, I argue that there is no requirement that mathematical objects be individuated in a non-trivial way. Metaphysical principles and intuitions to the contrary do not stand up to ordinary mathematical practice, which presupposes an identity relation that, in a sense, cannot be defined. In complex analysis, the two square roots of –1 are indiscernible: anything true of one of them is true of the other. I suggest that i functions like a parameter in natural deduction systems. I gave an early version of this paper at a workshop on structuralism in mathematics and science, held in the Autumn of 2006, at Bristol University. Thanks to the organizers, particularly Hannes Leitgeb, James Ladyman, and Øystein Linnebo, to my commentator Richard Pettigrew, and to the audience there. The paper also benefited considerably from a preliminary session at the Arché Research Centre at the University of St Andrews. I am indebted to my colleagues Craige Roberts, for help with the linguistics literature, and Ben Caplan and Gabriel Uzquiano, for help with the metaphysics. Thanks also to Hannes Leitgeb and Jeffrey Ketland for reading an earlier version of the manuscript and making helpful suggestions. I also benefited from conversations with Richard Heck, John Mayberry, Kevin Scharp, and Jason Stanley.
It is a commonplace that the extensions of most, perhaps all, vague predicates vary with such features as comparison class and paradigm and contrasting cases. My view proposes another, more pervasive contextual parameter. Vague predicates exhibit what I call open texture: in some circumstances, competent speakers can go either way in the borderline region. The shifting extension and anti-extensions of vague predicates are tracked by what David Lewis calls the “conversational score”, and are regulated by what Kit Fine calls penumbral connections, including a principle of tolerance. As I see it, vague predicates are response-dependent, or, better, judgement-dependent, at least in their borderline regions. This raises questions concerning how one reasons with such predicates. In this paper, I present a model theory for vague predicates, so construed. It is based on an overall supervaluationist-style framework, and it invokes analogues of Kripke structures for intuitionistic logic. I argue that the system captures, or at least nicely models, how one ought to reason with the shifting extensions (and anti-extensions) of vague predicates, as borderline cases are called and retracted in the course of a conversation. The model theory is illustrated with a forced march sorites series, and also with a thought experiment in which vague predicates interact with so-called future contingents. I show how to define various connectives and quantifiers in the language of the system, and how to express various penumbral connections and the principle of tolerance. The project fits into one of the topics of this special issue. In the course of reasoning, even with the external context held fixed, it is uncertain what the future extension of the vague predicates will be. Yet we still manage to reason with them.
The system is based on that developed, more fully, in my Vagueness in Context (Oxford: Oxford University Press, 2006), but some criticisms and replies to critics are incorporated.
It is sometimes said that there are two, competing versions of W. V. O. Quine’s unrelenting empiricism, perhaps divided according to temporal periods of his career. According to one, logic is exempt from, or lies outside the scope of, the attack on the analytic-synthetic distinction. This logic-friendly Quine holds that logical truths and, presumably, logical inferences are analytic in the traditional sense. Logical truths are knowable a priori, and, importantly, they are incorrigible, and so immune from revision. The other, radical reading of Quine does not exempt logic from the attack on analyticity and a priority. Logical truths and inferences are themselves part of the web of belief, and the same global methodology applies to logic as to any other part of the web, such as theoretical chemistry or ordinary beliefs about ordinary objects. Everything, including logic, is up for grabs in our struggle for holistic confirmation. The purpose of this paper is to examine the law of non-contradiction, and the concomitant principle of ex falso quodlibet, from the perspective of the principles advocated by the radical Quine. I show that he has no compelling reason to accept either of these. To put it bluntly, neither the law of non-contradiction nor the rule of ex falso quodlibet is empirically confirmed, and these principles fare poorly on the various criteria for theory acceptance on the methodology of the radical Quine. So the radical Quine is led rather quickly and rather directly into something in the neighborhood of Graham Priest’s dialetheism.
The purpose of this paper is to apply Crispin Wright’s criteria and various axes of objectivity to mathematics. I test the criteria and the objectivity of mathematics against each other. Along the way, various issues concerning general logic and epistemology are encountered.
According to ante rem structuralism a branch of mathematics, such as arithmetic, is about a structure, or structures, that exist independent of the mathematician, and independent of any systems that exemplify the structure. A structure is a universal of sorts: structure is to exemplified system as property is to object. So ante rem structuralism is a form of ante rem realism concerning universals. Since the appearance of my Philosophy of mathematics: Structure and ontology, a number of criticisms of the idea of ante rem structures have appeared. Some argue that it is impossible to give identity conditions for places in homogeneous ante rem structures, invoking a version of the identity of indiscernibles. Others raise issues concerning the identity and distinctness of places in different structures, such as the natural number 2 and the real number 2. The purpose of this paper is to take the measure of these objections, and to further articulate ante rem structuralism to take them into account.
Stewart Shapiro's ambition in Vagueness in Context is to develop a comprehensive account of the meaning, function, and logic of vague terms in an idealized version of a natural language like English. It is a commonplace that the extensions of vague terms vary according to their context: a person can be tall with respect to male accountants and not tall (even short) with respect to professional basketball players. The key feature of Shapiro's account is that the extensions of vague terms also vary in the course of conversations and that, in some cases, a competent speaker can go either way without sinning against the meaning of the words or the non-linguistic facts. As Shapiro sees it, vagueness is a linguistic phenomenon, due to the kinds of languages that humans speak; but vagueness is also due to the world we find ourselves in, as we try to communicate features of it to each other.
There is a parallel between the debate between Gottlob Frege and David Hilbert at the turn of the twentieth century and at least some aspects of the current controversy over whether category theory provides the proper framework for structuralism in the philosophy of mathematics. The main issue, I think, concerns the place and interpretation of meta-mathematics in an algebraic or structuralist approach to mathematics. Can meta-mathematics itself be understood in algebraic or structural terms? Or is it an exception to the slogan that mathematics is the science of structure?
Mathematics and logic have been central topics of concern since the dawn of philosophy. Since logic is the study of correct reasoning, it is a fundamental branch of epistemology and a priority in any philosophical system. Philosophers have focused on mathematics as a case study for general philosophical issues and for its role in overall knowledge-gathering. Today, philosophy of mathematics and logic remain central disciplines in contemporary philosophy, as evidenced by the regular appearance of articles on these topics in the best mainstream philosophical journals; in fact, the last decade has seen an explosion of scholarly work in these areas. This volume covers these disciplines in a comprehensive and accessible manner, giving the reader an overview of the major problems, positions, and battle lines. The 26 contributed chapters are by established experts in the field, and their articles contain both exposition and criticism as well as substantial development of their own positions. The essays, which are substantially self-contained, serve both to introduce the reader to the subject and to engage in it at its frontiers. Certain major positions are represented by two chapters--one supportive and one critical. The Oxford Handbook of Philosophy of Math and Logic is a ground-breaking reference like no other in its field. It is a central resource to those wishing to learn about the philosophy of mathematics and the philosophy of logic, or some aspect thereof, and to those who actively engage in the discipline, from advanced undergraduates to professional philosophers, mathematicians, and historians.
After a brief account of the problem of higher-order vagueness, and its seeming intractability, I explore what comes of the issue on a linguistic, contextualist account of vagueness. On the view in question, predicates like ‘borderline red’ and ‘determinately red’ are, or at least can be, vague, but they are different in kind from ‘red’. In particular, ‘borderline red’ and ‘determinately red’ are not colours. These predicates have linguistic components, and invoke notions like ‘competent user of the language’. On my view, so-called ‘higher-order vagueness’ is actually ordinary, first-order vagueness in different predicates. I explore the possibility that, nevertheless, a pernicious regress ensues.
Since virtually every mathematical theory can be interpreted in set theory, the latter is a foundation for mathematics. Whether set theory, as opposed to any of its rivals, is the right foundation for mathematics depends on what a foundation is for. One purpose is philosophical, to provide the metaphysical basis for mathematics. Another is epistemic, to provide the basis of all mathematical knowledge. Another is to serve mathematics, by lending insight into the various fields. Another is to provide an arena for exploring relations and interactions between mathematical fields, their relative strengths, etc. Given the different goals, there is little point to determining a single foundation for all of mathematics.
This article is an extended critical study of Kit Fine’s The limits of abstraction, which is a sustained attempt to take the measure of the neo-logicist program in the philosophy and foundations of mathematics, founded on abstraction principles like Hume’s principle. The present article covers the philosophical and technical aspects of Fine’s deep and penetrating study.
A number of authors have recently weighed in on the issue of whether it is coherent to have bound variables that range over absolutely everything. Prima facie, it is difficult, and perhaps impossible, to coherently state the “relativist” position without violating it. For example, the relativist might say, or try to say, that for any quantifier used in a proposition of English, there is something outside of its range. What is the range of this quantifier? Or suppose we ask the relativist if there are some things that cannot appear in the range of any bound variable. The likely response would be along these lines: “No. For each object o, it is possible to include o in the range of quantifiers, but one cannot quantify over everything at once.” This sentence contains unrestricted quantifiers, or so it seems, pending some clever move from a relativist. On the other hand, in the context of set theory, the reasoning behind the Burali-Forti paradox strongly suggests that there are well-orderings strictly longer than the collection of all ordinals. And set theorists regularly do transfinite recursions and transfinite inductions along such well-orderings. The relativist simply points out that one can always define new ordinals, and thus expand the range of one’s bound variables. The purpose of this paper is to explore the iterative framework, proposed in Zermelo’s 1930 paper, “Über Grenzzahlen und Mengenbereiche” (“On boundary numbers and domains of sets”), in order to shed light on these issues, and see what is involved in resolving them.
Sections 3.16 and 3.23 of Roger Penrose's Shadows of the mind (Oxford, Oxford University Press, 1994) contain a subtle and intriguing new argument against mechanism, the thesis that the human mind can be accurately modeled by a Turing machine. The argument, based on the incompleteness theorem, is designed to meet standard objections to the original Lucas–Penrose formulations. The new argument, however, seems to invoke an unrestricted truth predicate (and an unrestricted knowability predicate). If so, its premises are inconsistent. The usual ways of restricting the predicates either invalidate Penrose's reasoning or require presuppositions that the mechanist can reject.
The purpose of this paper is to assess the prospects for a neo-logicist development of set theory based on a restriction of Frege's Basic Law V, which we call (RV): ∀P∀Q[Ext(P) = Ext(Q) ≡ [(BAD(P) & BAD(Q)) ∨ ∀x(Px ≡ Qx)]]. BAD is taken as a primitive property of properties. We explore the features it must have for (RV) to sanction the various strong axioms of Zermelo–Fraenkel set theory. The primary interpretation is where ‘BAD’ is Dummett's ‘indefinitely extensible’. 1 Background: what and why? 2 Framework 3 GOOD candidates, indefinite extensibility 4 The framework of (RV) alone, or almost alone 5 The axioms 6 Brief closing.
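As a toy illustration of the shape of (RV), not drawn from the paper itself, one can check the principle exhaustively over a small finite domain. The choice of BAD below, "holds of the whole domain", is a crude hypothetical stand-in for indefinite extensibility, used only so that the identity condition on extensions can be verified mechanically:

```python
from itertools import combinations

DOMAIN = frozenset(range(6))

def bad(P):
    # Hypothetical stand-in for BAD: a property is "bad" iff it holds
    # of every object in the (finite) domain.
    return P == DOMAIN

def ext(P):
    # Extension operator: all BAD properties receive one common
    # abstract; GOOD properties receive their literal extension.
    return "BAD" if bad(P) else P

def rv_holds(P, Q):
    # (RV): Ext(P) = Ext(Q) iff P and Q are both BAD, or coextensive.
    return (ext(P) == ext(Q)) == ((bad(P) and bad(Q)) or P == Q)

# Enumerate every property (subset) on the domain and verify (RV).
subsets = [frozenset(c) for r in range(len(DOMAIN) + 1)
           for c in combinations(DOMAIN, r)]
assert all(rv_holds(P, Q) for P in subsets for Q in subsets)
```

The sketch only shows that (RV) is satisfiable once the BAD properties are collapsed to a single abstract; the philosophical work, of course, lies in the infinite case, where BAD is read as indefinite extensibility.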
The purpose of this paper is to present a thought experiment and argument that spells trouble for “radical” deflationism concerning meaning and truth such as that advocated by the staunch nominalist Hartry Field. The thought experiment does not sit well with any view that limits a truth predicate to sentences understood by a given speaker or to sentences in (or translatable into) a given language, unless that language is universal. The scenario in question concerns sentences that are not understood but are known to be logical consequences of known and understood sentences. Ultimately, the issue turns on the notion of logical consequence that is available to various versions of deflationism.
Graham Priest argues that the intuitively provable arithmetic sentences constitute a recursively enumerable set, which has a Gödel sentence which is itself intuitively provable. The incompleteness theorem does not apply, since the set of provable arithmetic sentences is not consistent. The purpose of this article is to sharpen Priest's argument, avoiding reference to informal notions, consensus, or Church's thesis. We add Priest's dialetheic semantics to ordinary Peano arithmetic PA, to produce a recursively axiomatized formal system PA* that contains its own truth predicate. Whether one is a dialetheist or not, PA* is a legitimate, rigorously defined formal system, and one can explore its proof-theoretic properties. The system is inconsistent (but presumably non-trivial), and it proves its own Gödel sentence as well as its own soundness. Although this much is perhaps welcome to the dialetheist, it has some untoward consequences. There are purely arithmetic (indeed, Π₀) sentences that are both provable and refutable in PA*. So if the dialetheist maintains that PA* is sound, then he must hold that there are true contradictions in the most elementary language of arithmetic. Moreover, the thorough dialetheist must hold that there is a number g which both is and is not the code of a derivation of the indicated Gödel sentence of PA*. For the thorough dialetheist, it follows that ordinary PA and even Robinson arithmetic are themselves inconsistent theories. I argue that this is a bitter pill for the dialetheist to swallow.
Famously, Michael Dummett argues that considerations concerning the role of language in communication lead to the rejection of classical logic in favor of intuitionistic logic. Potentially, this results in massive revisions of established mathematics. Recently, Neil Tennant (“The law of excluded middle is synthetic a priori, if valid”, Philosophical Topics 24 (1996), 205-229) suggested that a Dummettian anti-realist can accept the law of excluded middle as a synthetic, a priori principle grounded on a metaphysical principle of determinacy. This article shows that, for the anti-realist, the law of excluded middle entails that humans have wildly implausible abilities. The proposed synthesis between anti-realism and classical mathematics thus fails.
This paper uses neo-Fregean-style abstraction principles to develop the integers from the natural numbers (assuming Hume’s principle), the rational numbers from the integers, and the real numbers from the rationals. The first two are first-order abstractions that treat pairs of numbers: (DIF) INT(a,b)=INT(c,d) ≡ (a+d)=(b+c). (QUOT) Q(m,n)=Q(p,q) ≡ (n=0 & q=0) ∨ (n≠0 & q≠0 & m⋅q=n⋅p). The development of the real numbers is an adaptation of the Dedekind program involving “cuts” of rational numbers. Let P be a property (of rational numbers) and r a rational number. Say that r is an upper bound of P, written P≤r, if for any rational number s, if Ps then either s<r or s=r. In other words, P≤r if r is greater than or equal to any rational number that P applies to. Consider the Cut Abstraction Principle: (CP) ∀P∀Q(C(P)=C(Q) ≡ ∀r(P≤r ≡ Q≤r)). In other words, the cut of P is identical to the cut of Q if and only if P and Q share all of their upper bounds. The axioms of second-order real analysis can be derived from (CP), just as the axioms of second-order Peano arithmetic can be derived from Hume’s principle. The paper raises some of the philosophical issues connected with the neo-Fregean program, using the above abstraction principles as case studies.
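For illustration only (the function names below are ours, not the paper's), the identity conditions that (DIF) and (QUOT) impose on their abstracts can be stated as executable predicates on pairs of numbers:

```python
def same_int(a, b, c, d):
    # (DIF): INT(a,b) = INT(c,d) iff a + d = b + c,
    # i.e. the pairs present the same difference a - b = c - d.
    return a + d == b + c

def same_quot(m, n, p, q):
    # (QUOT): Q(m,n) = Q(p,q) iff both denominators are 0, or neither
    # is and the cross-products agree, i.e. m/n = p/q.
    return (n == 0 and q == 0) or (n != 0 and q != 0 and m * q == n * p)

# INT(2,5) and INT(7,10) present the same integer, namely -3.
assert same_int(2, 5, 7, 10)
# Q(1,2) and Q(2,4) present the same rational, namely 1/2.
assert same_quot(1, 2, 2, 4)
# All pairs with denominator 0 are identified in one "degenerate" abstract.
assert same_quot(1, 0, 5, 0)
```

Note how the first disjunct of (QUOT) collapses every pair of the form (m,0) into a single abstract, so that division by zero does not generate spurious distinct rationals.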
Since virtually every mathematical theory can be interpreted in Zermelo-Fraenkel set theory, it is a foundation for mathematics. There are other foundations, such as alternate set theories, higher-order logic, ramified type theory, and category theory. Whether set theory is the right foundation for mathematics depends on what a foundation is for. One purpose is to provide the ultimate metaphysical basis for mathematics. A second is to assure the basic epistemological coherence of all mathematical knowledge. A third is to serve mathematics, by lending insight into the various fields and suggesting fruitful techniques of research. A fourth purpose of a foundation is to provide an arena for exploring relations and interactions between mathematical fields. While set theory does better with regard to some of these and worse with regard to others, it has become the de facto arena for deciding questions of existence, something one might expect of a foundation. Given the different goals, there is little point to determining a single foundation for all of mathematics.
This unique book by Stewart Shapiro looks at a range of philosophical issues and positions concerning mathematics in four comprehensive sections. Part I describes questions and issues about mathematics that have motivated philosophers since the beginning of intellectual history. Part II is an historical survey, discussing the role of mathematics in the thought of such philosophers as Plato, Aristotle, Kant, and Mill. Part III covers the three major positions held throughout the twentieth century: the idea that mathematics is logic (logicism), the view that the essence of mathematics is the rule-governed manipulation of characters (formalism), and a revisionist philosophy that focuses on the mental activity of mathematics (intuitionism). Finally, Part IV brings the reader up-to-date with a look at contemporary developments within the discipline. This sweeping introductory guide to the philosophy of mathematics makes these fascinating concepts accessible to those with little background in either mathematics or philosophy.
The neo-logicist argues that standard mathematics can be derived by purely logical means from abstraction principles—such as Hume's Principle—which are held to be 'epistemically innocent'. We show that the second-order axiom of comprehension applied to non-instantiated properties and the standard first-order existential instantiation and universal elimination principles are essential for the derivation of key results, specifically a theorem of infinity, but have not been shown to be epistemically innocent. We conclude that the epistemic innocence of mathematics has not been established by the neo-logicist.
The purpose of this article is to delimit what can and cannot be claimed on behalf of second-order logic. The starting point is some of the discussions surrounding my Foundations without Foundationalism: A Case for Second-order Logic.
We examine George Boolos's proposed abstraction principle for extensions based on the limitation-of-size conception, New V, from several perspectives. Crispin Wright once suggested that New V could serve as part of a neo-logicist development of real analysis. We show that it fails both of the conservativeness criteria for abstraction principles that Wright proposes. Thus, we support Boolos against Wright. We also show that, when combined with the axioms for Boolos's iterative notion of set, New V yields a system equivalent to full Zermelo-Fraenkel set theory with a principle of global choice. This advances Boolos's longstanding interest in the foundations of set theory.
Do numbers, sets, and so forth, exist? What do mathematical statements mean? Are they literally true or false, or do they lack truth values altogether? Addressing questions that have attracted lively debate in recent years, Stewart Shapiro contends that standard realist and antirealist accounts of mathematics are both problematic. As Benacerraf first noted, we are confronted with the following powerful dilemma. The desired continuity between mathematical and, say, scientific language suggests realism, but realism in this context suggests seemingly intractable epistemic problems. As a way out of this dilemma, Shapiro articulates a structuralist approach. On this view, the subject matter of arithmetic, for example, is not a fixed domain of numbers independent of each other, but rather is the natural number structure, the pattern common to any system of objects that has an initial object and successor relation satisfying the induction principle. Using this framework, realism in mathematics can be preserved without troublesome epistemic consequences. Shapiro concludes by showing how a structuralist approach can be applied to wider philosophical questions such as the nature of an "object" and the Quinean nature of ontological commitment. Clear, compelling, and tautly argued, Shapiro's work, noteworthy both in its attempt to develop a full-length structuralist approach to mathematics and to trace its emergence in the history of mathematics, will be of deep interest to both philosophers and mathematicians.
Around the turn of the century, Poincaré and Hilbert each published an account of geometry that took the discipline to be an implicit definition of its concepts. The terms ‘point’, ‘line’, and ‘plane’ can be applied to any system of objects that satisfies the axioms. Each mathematician found spirited opposition from a different logicist—Russell against Poincaré and Frege against Hilbert—who maintained the dying view that geometry essentially concerns space or spatial intuition. The debates illustrate the emerging idea of mathematics as the science of structure. The article closes with some remarks on structuralist and nonstructuralist themes in Frege's own development of arithmetic.
The idea that logic and reasoning are somehow related goes back to antiquity. It clearly underlies much of the work in logic, as witnessed by the development of computability, and formal and mechanical deductive systems, for example. On the other hand, a platitude is that logic is the study of correct reasoning; and reasoning is cognitive if anything is. Thus, the relationship between logic, computation, and correct reasoning makes an interesting and historically central case study for mechanism. The purpose of this article is to begin the articulation of this relationship, pointing out its sources and its limitations.
The purpose of this note is to examine the relationship between the practice of mathematics and the philosophy of mathematics, ontology in particular. One conclusion is that the enterprises are (or should be) closely related, with neither one dominating the other. One cannot 'read off' the correct way to do mathematics from the true ontology, for example, nor can one 'read off' the true ontology from mathematics as practiced.
The central contention of this book is that second-order logic has a central role to play in laying the foundations of mathematics. In order to develop the argument fully, the author presents a detailed description of higher-order logic, including a comprehensive discussion of its semantics. He goes on to demonstrate the prevalence of second-order concepts in mathematics and the extent to which mathematical ideas can be formulated in higher-order logic. He also shows how first-order languages are often insufficient to codify many concepts in contemporary mathematics, and thus that both first- and higher-order logics are needed to fully reflect current work. Throughout, the emphasis is on discussing the associated philosophical and historical issues and the implications they have for foundational studies. For the most part, the author assumes little more than a familiarity with logic comparable to that provided in a beginning graduate course which includes the incompleteness of arithmetic and the Löwenheim-Skolem theorems. All those concerned with the foundations of mathematics will find this a thought-provoking discussion of some of the central issues in the field today.
Among the aims of this book are:
- the discussion of some important philosophical issues using the precision of mathematics;
- the development of formal systems that contain both classical and constructive components, which allows the study of constructivity in otherwise classical contexts and represents the formalization of important intensional aspects of mathematical practice;
- the direct formalization of intensional concepts (such as computability) in a mixed constructive/classical context.
D. GABBAY and F. GUENTHNER (eds.), Handbook of philosophical logic. Volume 1: Elements of classical logic. Dordrecht, Boston, and Lancaster: D. Reidel Publishing Company, 1983. xiv + 497 pp. Dfl225/$98.00.
The subject of this paper is the philosophical problem of accounting for the relationship between mathematics and non-mathematical reality. The first section, devoted to the importance of the problem, suggests that many of the reasons for engaging in philosophy at all make an account of the relationship between mathematics and reality a priority, not only in philosophy of mathematics and philosophy of science, but also in general epistemology/metaphysics. This is followed by a (rather brief) survey of the major, traditional philosophies of mathematics, indicating how each is prepared to deal with the present problem. It is shown that (the standard formulations of) some views seem to deny outright that there is a relationship between mathematics and any non-mathematical reality; such philosophies are clearly unacceptable. Other views leave the relationship rather mysterious and, thus, are incomplete at best. The final, more speculative section provides the direction of a positive account. A structuralist philosophy of mathematics is outlined, and it is proposed that mathematics applies to reality through the discovery of mathematical structures underlying the non-mathematical universe.
The purpose of this article is to examine aspects of the development of the concept and theory of computability through the theory of recursive functions. Following a brief introduction, Section 2 is devoted to the presuppositions of computability. It focuses on certain concepts, beliefs and theorems necessary for a general property of computability to be formulated and developed into a mathematical theory. The following two sections concern situations in which the presuppositions were realized and the theory of computability was developed. It is suggested in Section 3 that a central item was the problem of generalizing Gödel's incompleteness theorem. It is shown that this involved both the characterization of recursiveness and the attempt to clarify and formulate the notion of an effective process as it relates to the syntax of deductive systems. Section 4 concerns the decision problems which grew from the Hilbert program. Section 5 is devoted to the development of an informal technique in the theory of computability often called 'argument by Church's thesis'.
This paper focuses on two notions of effectiveness which are not treated in detail elsewhere. Unlike the standard computability notion, which is a property of functions themselves, both notions of effectiveness are properties of interpreted linguistic presentations of functions. It is shown that effectiveness is epistemically at least as basic as computability in the sense that decisions about computability normally involve judgments concerning effectiveness. There are many occurrences of the present notions in the writings of logicians; moreover, consideration of these notions can contribute to the clarification and, perhaps, solution of various philosophical problems, confusions and disputes.
Typically, a logic consists of a formal or informal language together with a deductive system and/or a model-theoretic semantics. The language is, or corresponds to, a part of a natural language like English or Greek. The deductive system is to capture, codify, or simply record which inferences are correct for the given language, and the semantics is to capture, codify, or record the meanings, or truth-conditions, or possible truth-conditions, for at least part of the language.