We establish by elementary proof-theoretic means the conservativeness of two subsystems of analysis over primitive recursive arithmetic. The one subsystem was introduced by Friedman [6], the other is a strengthened version of a theory of Minc [14]; each has been shown to be of considerable interest for both mathematical practice and metamathematical investigations. The foundational significance of such conservation results is clear: they provide a direct finitist justification of the part of mathematical practice formalizable in these subsystems. The results are generalized to relate a hierarchy of subsystems, all contained in the theory of arithmetic properties, to a corresponding hierarchy of fragments of arithmetic. The proof-theoretic tools employed there are used to re-establish in a uniform, elementary way relationships between various fragments of arithmetic due to Parsons, Paris and Kirby, and Friedman.
Hilbert's finitist program was not created at the beginning of the twenties solely to counteract Brouwer's intuitionism, but rather emerged out of broad philosophical reflections on the foundations of mathematics and out of detailed logical work; that is evident from notes of lecture courses that were given by Hilbert and prepared in collaboration with Bernays during the period from 1917 to 1922. These notes reveal a dialectic progression from a critical logicism through a radical constructivism toward finitism; the progression has to be seen against the background of the stunning presentation of mathematical logic in the lectures given during the winter term 1917/18. In this paper, I sketch the connection of Hilbert's considerations to issues in the foundations of mathematics during the second half of the 19th century, describe the work that laid the basis of modern mathematical logic, and analyze the first steps in the new subject of proof theory. A revision of the standard view of Hilbert's and Bernays's contributions to the foundational discussion in our century has long been overdue. It is almost scandalous that their carefully worked out notes have not yet been used to understand more accurately the evolution of modern logic in general and of Hilbert's Program in particular. One conclusion will be obvious: the dogmatic formalist Hilbert is a figment of historical (de)construction! Indeed, the study and analysis of these lectures reveal a depth of mathematical-logical achievement and of philosophical reflection that is remarkable. In the course of my presentation many questions are raised and many more can be explored; thus, I hope this paper will stimulate interest in new historical and systematic work.
David Hilbert was one of the great mathematicians who expounded the centrality of their subject in human thought. In this collection of essays, Wilfried Sieg frames Hilbert's foundational work, from 1890 to 1939, in a comprehensive way and integrates it with modern proof theoretic investigations.
The incompleteness theorems constitute the mathematical core of Gödel’s philosophical challenge. They are given in their “most satisfactory form”, as Gödel saw it, when the formality of theories to which they apply is characterized via Turing machines. These machines codify human mechanical procedures that can be carried out without appealing to higher cognitive capacities. The question naturally arises whether the theorems justify the claim that the human mind has mathematical abilities that are not shared by any machine. Turing admits that non-mechanical steps of intuition are needed to transcend particular formal theories. Thus, there is a substantive point in comparing Turing’s views with Gödel’s view expressed by the assertion, “The human mind infinitely surpasses any finite machine”. The parallelisms and tensions between their views are taken as an inspiration for beginning to explore, computationally, the capacities of the human mathematical mind.
Dedekind’s structuralism is a crucial source for the structuralism of mathematical practice—with its focus on abstract concepts like groups and fields. It plays an equally central role for the structuralism of philosophical analysis—with its focus on particular mathematical objects like natural and real numbers. Tensions between these structuralisms are palpable in Dedekind’s work, but are resolved in his essay Was sind und was sollen die Zahlen? In a radical shift, Dedekind extends his mathematical approach to “the” natural numbers. He creates the abstract concept of a simply infinite system, proves the existence of a “model”, insists on the stepwise derivation of theorems, and defines structure-preserving mappings between different systems that fall under the abstract concept. Crucial parts of these considerations were added, however, only to the penultimate manuscript, for example, the very concept of a simply infinite system. The methodological consequences of this radical shift are elucidated by an analysis of Dedekind’s metamathematics. Our analysis provides a deeper understanding of the essay and, in addition, illuminates its impact on the evolution of the axiomatic method and of “semantics” before Tarski. This understanding allows us to make connections to contemporary issues in the philosophy of mathematics and science.
Hilbert gave lectures on the foundations of mathematics throughout his career. Notes for many of them have been preserved and are treasures of information; they allow us to reconstruct the path from Hilbert's logicist position, deeply influenced by Dedekind and presented in lectures starting around 1890, to the program of finitist proof theory in the early 1920s. The development toward proof theory begins, in some sense, in 1917 when Hilbert gave his talk Axiomatisches Denken in Zürich. This talk is rooted in the past and points to the future. As to the future, Hilbert suggested: [. . .] we must – that is my conviction – take the concept of the specifically mathematical proof as an object of investigation, just as ..
Alonzo Church's mathematical work on computability and undecidability is well-known indeed, and we seem to have an excellent understanding of the context in which it arose. The approach Church took to the underlying conceptual issues, by contrast, is less well understood. Why, for example, was "Church's Thesis" put forward publicly only in April 1935, when it had been formulated already in February/March 1934? Why did Church choose to formulate it then in terms of Gödel's general recursiveness, not his own λ-definability as he had done in 1934? A number of letters were exchanged between Church and Paul Bernays during the period from December 1934 to August 1937; they throw light on critical developments in Princeton during that period and reveal novel aspects of Church's distinctive contribution to the analysis of the informal notion of effective calculability. In particular, they allow me to give informed, though still tentative, answers to the questions I raised; the character of my answers is reflected by an alternative title for this paper, Why Church needed Gödel's recursiveness for his Thesis. In Section 5, I contrast Church's analysis with that of Alan Turing and explore, in the very last section, an analogy with Dedekind's investigation of continuity.
Natural Formalization proposes a concrete way of expanding proof theory from the meta-mathematical investigation of formal theories to an examination of “the concept of the specifically mathematical proof.” Formal proofs play a role for this examination inasmuch as they reflect the essential structure and systematic construction of mathematical proofs. We emphasize three crucial features of our formal inference mechanism: (1) the underlying logical calculus is built for reasoning with gaps and for providing strategic directions, (2) the mathematical frame is a definitional extension of Zermelo–Fraenkel set theory and has a hierarchically organized structure of concepts and operations, and (3) the construction of formal proofs is deeply connected to the frame through rules for definitions and lemmas. To bring these general ideas to life, we examine, as a case study, proofs of the Cantor–Bernstein Theorem that do not appeal to the principle of choice. A thorough analysis of the multitude of “different” informal proofs seems to reduce them to exactly one. The natural formalization confirms that there is one proof, but that it comes in two variants due to Dedekind and Zermelo, respectively. In this way it enhances the conceptual understanding of the represented informal proofs. The formal, computational work is carried out with the proof search system AProS that serves as a proof assistant and implements the above inference mechanism; it can be fully inspected at (see link below).
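For orientation, here is a standard statement of the theorem the case study formalizes; this formulation is supplied for the reader's convenience and is not quoted from the paper:

```latex
% Cantor–Bernstein Theorem (standard formulation; provable without choice):
\[
\text{If } f : A \rightarrowtail B \text{ and } g : B \rightarrowtail A
\text{ are injections, then there is a bijection } h : A \to B .
\]
```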
Herbrand's Theorem, in the form of $\underline{\exists}$-inversion lemmata for finitary and infinitary sequent calculi, is the crucial tool for the determination of the provably total function(al)s of a variety of theories. The theories are (second-order extensions of) fragments of classical arithmetic; the classes of provably total functions include the elements of the Polynomial Hierarchy, the Grzegorczyk Hierarchy, and the extended Grzegorczyk Hierarchy $\mathfrak{E}^\alpha$, $\alpha < \varepsilon_0$. A subsidiary aim of the paper is to show that the proof-theoretic methods used here are distinguished by technical elegance, conceptual clarity, and wide-ranging applicability.
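To indicate the shape of such a result, here is a schematic Herbrand disjunction for a purely existential theorem; this is an illustrative version, not the paper's exact inversion lemmata:

```latex
% Schematic Herbrand disjunction for quantifier-free A (illustrative only):
\[
\vdash \exists x\, A(x)
\;\Longrightarrow\;
\vdash A(t_1) \lor \dots \lor A(t_k)
\]
% for terms t_1, ..., t_k read off a suitably normalized derivation. Applied
% to a \Pi^0_2 theorem \forall x\, \exists y\, A(x,y), term extraction yields
% a provably total function f with A(n, f(n)) for all n; calibrating the
% extraction locates f in the corresponding subrecursive class.
```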
Church's and Turing's theses dogmatically assert that an informal notion of effective calculability is adequately captured by a particular mathematical concept of computability. I present an analysis of calculability that is embedded in a rich historical and philosophical context, leads to precise concepts, but dispenses with theses. To investigate effective calculability is to analyze symbolic processes that can in principle be carried out by calculators. This is a philosophical lesson we owe to Turing. Drawing on that lesson and recasting work of Gandy, I formulate boundedness and locality conditions for two types of calculators, namely, human computing agents and mechanical computing devices (discrete machines). The distinctive feature of the latter is that they can carry out parallel computations. The analysis leads to axioms for discrete dynamical systems (representing human and machine computations) and allows the reduction of models of these axioms to Turing machines. Cellular automata and a variety of artificial neural nets can be shown to satisfy the axioms for machine computations.
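The boundedness and locality conditions can be made vivid with a toy model; the sketch below is illustrative and not Sieg's axiomatization: a one-dimensional cellular automaton in which each cell has finitely many states (boundedness), a cell's next state depends only on a fixed-radius neighborhood (locality), and all cells are updated in parallel.

```python
# Illustrative sketch (not Sieg's formal axioms): a one-dimensional cellular
# automaton as a discrete dynamical system satisfying the two key conditions:
#   boundedness: each cell takes one of finitely many states;
#   locality:    a cell's next state depends only on a fixed-radius neighborhood.

def step(config, rule, radius=1, blank=0):
    """Apply one parallel update; 'rule' maps a neighborhood tuple to a state."""
    padded = [blank] * radius + list(config) + [blank] * radius
    return [rule(tuple(padded[i:i + 2 * radius + 1])) for i in range(len(config))]

# Example rule: elementary cellular automaton 110 (states {0, 1}, radius 1).
RULE_110 = {(1, 1, 1): 0, (1, 1, 0): 1, (1, 0, 1): 1, (1, 0, 0): 0,
            (0, 1, 1): 1, (0, 1, 0): 1, (0, 0, 1): 1, (0, 0, 0): 0}

config = [0] * 20 + [1]
for _ in range(5):
    config = step(config, RULE_110.get)
    print("".join("#" if c else "." for c in config))
```

Rule 110, used in the example, is known to be Turing complete, which illustrates why such automata fall under the axioms for machine computations and reduce to Turing machines.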
On June 4, 1925, Hilbert delivered an address to the Westphalian Mathematical Society in Münster; that was, as a quick calculation will convince you, almost exactly sixty years ago. The address was published in 1926 under the title Über das Unendliche and is perhaps Hilbert's most comprehensive presentation of his ideas concerning the finitist justification of classical mathematics and the role his proof theory was to play in it. But what has become of the ambitious program for securing all of mathematics, once and for all? What of proof theory, the very subject Hilbert invented to carry out his program? The Hilbertian ambition reached out too far: in its original form, the program was refuted by Gödel's Incompleteness Theorems. And even allowing more than finitist means in metamathematics, the Hilbertian expectations for proof theory have not been realized: a constructive consistency proof for second-order arithmetic is still out of reach. Nevertheless, remarkable progress has been made. Two separate, but complementary directions of research have led to surprising insights: classical analysis can be formally developed in conservative extensions of elementary number theory; relative consistency proofs can be given by constructive means for impredicative parts of second-order arithmetic. The mathematical and metamathematical developments have been accompanied by sustained philosophical reflections on the foundations of mathematics. This indicates briefly the main themes of the contributions to the symposium; in my introductory remarks I want to give a very schematic perspective that is partly historical and partly systematic.
Dedekind's mathematical work is integral to the transformation of mathematics in the nineteenth century and crucial for the emergence of structuralist mathematics in the twentieth century. We investigate the essential components of what Emmy Noether called his ‘axiomatic standpoint’: abstract concepts, models, and mappings.
Two young logicians, whose work had a dramatic impact on the direction of logic, exchanged two letters in early 1931. Jacques Herbrand initiated the correspondence on 7 April and Kurt Gödel responded on 25 July, just two days before Herbrand died in a mountaineering accident at La Bérarde (Isère). Herbrand's letter played a significant role in the development of computability theory. Gödel asserted in his 1934 Princeton Lectures and on later occasions that it suggested to him a crucial part of the definition of a general recursive function. Understanding this role in detail is of great interest as the notion is absolutely central. The full text of the letter had not been available until recently, and its content (as reported by Gödel) was not in accord with Herbrand's contemporaneous published work. Together, the letters reflect broader intellectual currents of the time: they are intimately linked to the discussion of the incompleteness theorems and their potential impact on Hilbert's Program.
The Carnegie Mellon Proof Tutor project was motivated by pedagogical concerns: we wanted to use a "mechanical" (i.e. computerized) tutor for teaching students..
Natural deduction (for short: nd-) calculi have not been used systematically as a basis for automated theorem proving in classical logic. To remove objective obstacles to their use we describe (1) a method that allows one to give semantic proofs of normal form theorems for nd-calculi and (2) a framework that allows one to search directly for normal nd-proofs. Thus, one can try to answer the question: How do we bridge the gap between claims and assumptions in heuristically motivated ways? This informal question motivates the formulation of intercalation calculi. Ic-calculi are the technical underpinnings for (1) and (2), and our paper focuses on their detailed presentation and meta-mathematical investigation in the case of classical predicate logic. The connection between restricted forms of nd-proofs and (strategies for) proof search emerges as a central theme: normal forms are not obtained by removing local "detours", but rather by constructing proofs that directly reflect proof-strategic considerations. That theme warrants further investigation.
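The bridging idea can be conveyed by a toy example; the sketch below is merely illustrative, deliberately naive and incomplete, and far simpler than the paper's ic-calculi. For a tiny implicational fragment, search works forward from the assumptions by elimination (modus ponens) and backward from the goal by introduction, until the two directions meet.

```python
# Toy illustration of the intercalation idea (not the paper's ic-calculi):
# bridge the gap between assumptions and goal by applying elimination rules
# forward and introduction rules backward. Formulas are atoms (strings) or
# implications, encoded as pairs (antecedent, consequent).

def prove(assumptions, goal):
    """Search for 'goal' from 'assumptions' in a tiny fragment with '->' only."""
    closure = set(assumptions)
    # Forward direction: saturate under ->-elimination (modus ponens).
    changed = True
    while changed:
        changed = False
        for f in list(closure):
            if isinstance(f, tuple):
                antecedent, consequent = f
                if antecedent in closure and consequent not in closure:
                    closure.add(consequent)
                    changed = True
    if goal in closure:
        return True
    # Backward direction: ->-introduction; to prove a -> b, assume a, prove b.
    if isinstance(goal, tuple):
        antecedent, consequent = goal
        return prove(closure | {antecedent}, consequent)
    return False

assert prove({'p', ('p', 'q')}, 'q')    # q from p and p -> q
assert prove(set(), ('p', ('q', 'p')))  # p -> (q -> p) from no assumptions
```

A real ic-calculus tracks, in addition, the extraction of subformulas from assumptions and the questions driving backward steps; the toy prover only shows the two search directions meeting.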
The paper discusses tools for teaching logic used in Logic & Proofs, a web-based introduction to modern logic that has been taken by more than 1,300 students since the fall of 2003. The tools include a wide array of interactive learning environments or cognitive mini-tutors; most important among them is the Carnegie Proof Lab. The Proof Lab is a sophisticated interface for constructing natural deduction proofs and is central, as strategically guided discovery of proofs is the distinctive focus of the course. My discussion makes explicit the broader intellectual context, but also the pursuit of pedagogical goals and their experimental examination. The intellectual context includes i) the theoretical background for the proof search algorithm AProS and its use for a dynamic Proof Tutor, and ii) the programmatic expansion of the course to Computational Logic.
– via appropriate substitutions – syntactically identical. The method can be applied directly to quantifier-free formulae and, in this paper, will be extended in a natural and straightforward way to quantified formulae.
This is a summary of developments analysed in my (1997A). A first version of that paper was presented at the workshop Modern Mathematical Thought in Pittsburgh (September 21-24, 1995).
This volume honours the life and work of Solomon Feferman, one of the most prominent mathematical logicians of the latter half of the 20th century. In the collection of essays presented here, researchers examine Feferman’s mathematical work as well as the specific methodological and philosophical issues that tie into mathematics. Feferman’s work was largely based in mathematical logic, but also branched out into methodological and philosophical issues, making it well known beyond the borders of the mathematics community. With regard to methodological issues, Feferman supported concrete projects. On the one hand, these projects calibrate the proof-theoretic strength of subsystems of analysis and set theory and provide ways of overcoming the limitations imposed by Gödel’s incompleteness theorems through appropriate conceptual expansions. On the other, they seek to identify novel axiomatic foundations for mathematical practice, truth theories, and category theory. In his philosophical research, Feferman explored questions such as “What is logic?” and proposed particular positions regarding the foundations of mathematics including, for example, his “conceptual structuralism.” The contributing authors of the volume examine all of the above issues. Their papers are accompanied by an autobiography presented by Feferman that reflects on the evolution and intellectual contexts of his work. The contributing authors critically examine Feferman’s work and, in part, actively expand on his concrete mathematical projects. The volume illuminates Feferman’s distinctive work and, in the process, provides an enlightening perspective on the foundations of mathematics and logic.
In the fall of 1985 Carnegie Mellon University established a Department of Philosophy. The focus of the department is logic broadly conceived, philosophy of science, in particular of the social sciences, and linguistics. To mark the inauguration of the department, a daylong celebration was held on April 5, 1986. This celebration consisted of two keynote addresses by Patrick Suppes and Thomas Schwartz, seminars directed by members of the department, and a panel discussion on the computational model of mind moderated by Dana S. Scott. The various contributions, in modified and expanded form, are the core of this collection of essays, and they are, I believe, of more than parochial interest: they turn attention to substantive and reflective interdisciplinary work. The collection is divided into three parts. The first part gives perspectives on general features of the interdisciplinary enterprise in philosophy, and on a particular topic that invites such interaction, namely computational models of the mind. The second part contains reports on concrete research done within that enterprise; the research topics range from decision theory and the philosophy of economics through foundational problems in mathematics to issues in aesthetics and computational linguistics. The third part is a postscriptum by Isaac Levi, analyzing directions of work from his perspective.
Any thorough discussion of computing machines requires the examination of rigorous concepts of computation and is facilitated by the distinction between mathematical, symbolic and physical computations. The delicate connection between the three kinds of computations and the underlying questions, "What are machines?" and "When are they computing?", motivate an extensive theoretical and historical discussion. The relevant outcome of this..
Machines were introduced as calculating devices to simulate operations carried out by human computors following fixed algorithms: this is true for the early mechanical calculators devised by Pascal and Leibniz, for the analytical engine designed by Babbage, and the theoretical machines introduced by Turing. The distinguishing feature of the latter is their universality: They are claimed to be able to capture any algorithm whatsoever and, conversely, any procedure they can carry out is evidently algorithmic. The study of such "paper machines" by mathematical means is the topic of our contribution. This is not only in accord with its usual understanding in computer science, but conceptually and historically right, when we recall the purpose for which Turing machines were introduced.
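To make the notion of a "paper machine" concrete, here is a minimal sketch of a Turing machine executed step by step; the encoding choices (a transition table, a dictionary tape, a designated halting state) are ours, for illustration only.

```python
# A minimal "paper machine" in code (an illustrative sketch, not a claim about
# Turing's original formulation): a finite transition table acting on an
# unbounded tape, one local step at a time.

def run(delta, state, tape, pos=0, blank='_', halt='halt', max_steps=10_000):
    """delta maps (state, symbol) -> (new_state, new_symbol, move in {-1, +1})."""
    tape = dict(enumerate(tape))
    for _ in range(max_steps):
        if state == halt:
            break
        symbol = tape.get(pos, blank)
        state, tape[pos], move = delta[(state, symbol)]
        pos += move
    cells = [tape.get(i, blank) for i in range(min(tape), max(tape) + 1)]
    return state, ''.join(cells)

# Example: a one-state machine that flips a string of bits and halts on blank.
delta = {('flip', '0'): ('flip', '1', +1),
         ('flip', '1'): ('flip', '0', +1),
         ('flip', '_'): ('halt', '_', +1)}
print(run(delta, 'flip', '0110'))   # -> ('halt', '1001_')
```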