The framework of natural deduction is extended by permitting rules as assumptions, which may be discharged in the course of a derivation. This leads to the concept of rules of higher levels and to a general schema for introduction and elimination rules for arbitrary n-ary sentential operators. With respect to this schema, the (functional) completeness of "or", "if...then" and absurdity is proved.
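To illustrate the idea (a sketch in the ⇒-notation for rules; the paper's exact formulation may differ in detail): the introduction rule for implication discharges a formula, whereas its generalized elimination rule discharges a rule as an assumption, which is what makes it a rule of higher level:

\[
(A \Rightarrow B) \Rightarrow A \to B
\qquad\qquad
(A \to B,\ ((A \Rightarrow B) \Rightarrow C)) \Rightarrow C
\]

The first rule says: having derived B from the assumption A, one may conclude A → B. The second says: from A → B, together with a derivation of C that uses the rule A ⇒ B as a dischargeable assumption, one may conclude C.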
The standard approach to what I call “proof-theoretic semantics”, which is mainly due to Dummett and Prawitz, attempts to give a semantics of proofs by defining what counts as a valid proof. After a discussion of the general aims of proof-theoretic semantics, this paper investigates in detail various notions of proof-theoretic validity and offers certain improvements of the definitions given by Prawitz. Particular emphasis is placed on the relationship between semantic validity concepts and validity concepts used in normalization theory. It is argued that these two sorts of concepts must be kept strictly apart.
This volume is the first ever collection devoted to the field of proof-theoretic semantics. Contributions address topics including the systematics of introduction and elimination rules and proofs of normalization, the categorial characterization of deductions, the relation between Heyting's and Gentzen's approaches to meaning, knowability paradoxes, proof-theoretic foundations of set theory, Dummett's justification of logical laws, Kreisel's theory of constructions, paradoxical reasoning, and the defence of model theory. The field of proof-theoretic semantics has existed for almost 50 years, but the term itself was proposed by Schroeder-Heister in the 1980s. Proof-theoretic semantics explains the meaning of linguistic expressions in general and of logical constants in particular in terms of the notion of proof. This volume emerges from presentations at the Second International Conference on Proof-Theoretic Semantics in Tübingen in 2013, where contributing authors were asked to provide a self-contained description and analysis of a significant research question in this area. The contributions are representative of the field and should be of interest to logicians, philosophers, and mathematicians alike.
From the point of view of proof-theoretic semantics, it is argued that the sequent calculus with introduction rules on the assertion and on the assumption side represents deductive reasoning more appropriately than natural deduction. In taking consequence to be conceptually prior to truth, it can cope with non-well-founded phenomena such as contradictory reasoning. The fact that, in its typed variant, the sequent calculus has an explicit and separable substitution schema in the form of the cut rule is seen as a crucial advantage over natural deduction, where substitution is built into the general framework.
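For reference, the substitution schema meant here is the cut rule of the sequent calculus, shown below in its standard untyped form (a sketch; in the typed variant the premises and conclusion additionally carry proof terms):

\[
\frac{\Gamma \vdash A \qquad \Delta, A \vdash C}{\Gamma, \Delta \vdash C}\ (\textit{cut})
\]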
Several proof-theoretic notions of validity have been proposed in the literature, for which completeness of intuitionistic logic has been conjectured. We define validity for intuitionistic propositional logic in a way which is common to many of these notions, emphasizing that an appropriate notion of validity must be closed under substitution. In this definition we consider atomic systems whose rules are not only production rules, but may include rules that allow one to discharge assumptions. Our central result shows that Harrop’s rule is valid under substitution, which refutes the completeness conjecture for intuitionistic logic.
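Harrop's rule, the admissible but underivable rule of intuitionistic propositional logic at issue here, is:

\[
\frac{\neg A \to (B \lor C)}{(\neg A \to B) \lor (\neg A \to C)}
\]

Since it is valid under substitution but not derivable in the intuitionistic propositional calculus, validity outruns derivability, and completeness fails.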
The hypothetical notion of consequence is normally understood as the transmission of a categorical notion from premisses to conclusion. In model-theoretic semantics this categorical notion is 'truth', in standard proof-theoretic semantics it is 'canonical provability'. Three underlying dogmas, (I) the priority of the categorical over the hypothetical, (II) the transmission view of consequence, and (III) the identification of consequence and correctness of inference, are criticized from an alternative view of proof-theoretic semantics. It is argued that consequence is a basic semantical concept which is directly governed by elementary reasoning principles such as definitional closure and definitional reflection, and is not reduced to a categorical concept. This understanding of consequence makes it possible, in particular, to deal with non-well-founded phenomena as they arise from circular definitions.
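As a sketch of the two principles mentioned (in sequent-style notation; the clausal format is illustrative): given a definition of an atom a by clauses a ⇐ B_1, …, a ⇐ B_n, definitional closure licenses passing from a defining condition to a, while definitional reflection licenses reasoning from a by cases on all of its defining conditions:

\[
\frac{\Gamma \vdash B_i}{\Gamma \vdash a}\ (\text{definitional closure})
\qquad
\frac{\Gamma, B_1 \vdash C \quad \cdots \quad \Gamma, B_n \vdash C}{\Gamma, a \vdash C}\ (\text{definitional reflection})
\]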
The new area of logic and computation is now undergoing rapid development. This has affected the social pattern of research in the area. A new topic may rise very quickly with a significant body of research around it. The community, however, cannot wait the traditional two years for a book to appear. This has given greater importance to thematic collections of papers, centred around a topic and addressing it from several points of view, usually as a result of a workshop, summer school, or just a scientific initiative. Such a collection may not be as coherent as a book by one or two authors, yet it is more focused than a collection of key papers on a certain topic. It is best thought of as a thematic collection, a study in the area of logic and computation. The new series Studies in Logic and Computation is intended to provide a home for such thematic collections. Substructural logics are nonclassical logics, which arose in response to problems in the foundations of mathematics and logic, theoretical computer science, mathematical linguistics, and category theory. They include intuitionistic logic, relevant logic, BCK logic, linear logic, and Lambek's calculus of syntactic categories. Substructural logics differ from classical logics, and from each other, in their presuppositions about Gentzen's structural rules, although their presuppositions about the deductive role of logical constants are invariant. Substructural logics have been a subject of study for logicians during the last sixty years. Specialists have often worked in isolation, however, largely unaware of the contributions of others. This book brings together new papers by some of the most eminent authorities in these various traditions to produce a unified view of substructural logics.
Prawitz proposed certain notions of proof-theoretic validity and conjectured that intuitionistic logic is complete for them [11, 12]. Considering propositional logic, we present a general framework of five abstract conditions which any proof-theoretic semantics should obey. Then we formulate several more specific conditions under which the intuitionistic propositional calculus turns out to be semantically incomplete. Here a crucial role is played by the generalized disjunction principle. Turning to concrete semantics, we show that prominent proposals, including Prawitz’s, satisfy at least one of these conditions, thus rendering IPC semantically incomplete for them. Only for Goldfarb’s [1] proof-theoretic semantics, which deviates from standard approaches, does IPC turn out to be complete. Overall, these results show that basic ideas of proof-theoretic semantics for propositional logic are not captured by IPC.
We present our calculus of higher-level rules, extended with propositional quantification within rules. This makes it possible to present general schemas for introduction and elimination rules for arbitrary propositional operators and to define what it means that introductions and eliminations are in harmony with each other. This definition does not presuppose any logical system, but is formulated in terms of rules themselves. We therefore speak of a foundational account of proof-theoretic harmony. With every set of introduction rules a canonical elimination rule, and with every set of elimination rules a canonical introduction rule is associated in such a way that the canonical rule is in harmony with the set of rules it is associated with. An example given by Hazen and Pelletier is used to demonstrate that there are significant connectives that are characterized by their elimination rules and whose introduction rule is the canonical introduction rule associated with these elimination rules. Due to the availability of higher-level rules and propositional quantification, the means of expression of the framework developed are sufficient to ensure that the construction of canonical elimination or introduction rules is always possible and does not lead out of this framework.
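A familiar instance of the association (a sketch in the ⇒-notation for rules; the paper's general schema additionally uses propositional quantification): the two introduction rules for disjunction determine the usual ∨-elimination as their canonical elimination rule, with one discharged rule-assumption per introduction rule:

\[
A \Rightarrow A \lor B \qquad B \Rightarrow A \lor B
\qquad\leadsto\qquad
(A \lor B,\ (A \Rightarrow C),\ (B \Rightarrow C)) \Rightarrow C
\]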
Prawitz observed that Russell’s paradox in naive set theory yields a derivation of absurdity whose reduction sequence loops. Building on this observation, and based on numerous examples, Tennant claimed that this looping feature, or more generally, the fact that derivations of absurdity do not normalize, is characteristic of the paradoxes. Striking results by Ekman show that looping reduction sequences are already obtained in minimal propositional logic, when certain reduction steps, which are prima facie plausible, are considered in addition to the standard ones. This shows that the notion of reduction is in need of clarification. Referring to the notion of identity of proofs in general proof theory, we argue that reduction steps should not merely remove redundancies, but must respect the identity of proofs. Consequently, we propose to modify Tennant’s paradoxicality test by basing it on this refined notion of reduction.
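The derivation Prawitz considered can be sketched as follows (a standard reconstruction, not quoted from the paper): with naive comprehension, the set ρ = {x | x ∉ x} comes with rules making ρ ∈ ρ and its negation interderivable,

\[
\frac{\rho \notin \rho}{\rho \in \rho}\,(\in I)
\qquad\qquad
\frac{\rho \in \rho}{\rho \notin \rho}\,(\in E)
\]

so that absurdity is derivable; reducing the maximal formula in this derivation reproduces a derivation of the same shape, and the reduction sequence loops.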
The term inversion principle goes back to Lorenzen, who coined it in the early 1950s. It was later used by Prawitz and others to describe the symmetric relationship between introduction and elimination inferences in natural deduction, sometimes also called harmony. In dealing with the invertibility of rules of an arbitrary atomic production system, Lorenzen’s inversion principle has a much wider range than Prawitz’s adaptation to natural deduction. It is closely related to definitional reflection, which is a principle for reasoning on the basis of rule-based atomic definitions, proposed by Hallnäs and Schroeder-Heister. After presenting definitional reflection and the inversion principle, it is shown that the inversion principle can be formally derived from definitional reflection, when the latter is viewed as a principle to establish admissibility. Furthermore, the relationship between definitional reflection and the inversion principle is investigated against the background of a universalization principle, called the ω-principle, which allows one to pass from the set of all defined substitution instances of a sequent to the sequent itself.
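Formally (a sketch): if B_1 ⇒ a, …, B_n ⇒ a are all the production rules with conclusion a, the inversion principle asserts the admissibility of

\[
\frac{B_1 \vdash C \quad \cdots \quad B_n \vdash C}{a \vdash C}
\]

that is, whatever follows from every defining condition of a follows from a itself; read as an admissibility statement, this is exactly the rule of definitional reflection.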
In their Basic Logic, Sambin, Battilotti and Faggian give a foundation of logical inference rules by reference to certain reflection principles. We investigate the relationship between these principles and the principle of Definitional Reflection proposed by Hallnäs and Schroeder-Heister.
In the 1920s, Paul Hertz (1881-1940) developed certain calculi based on structural rules only and established normal form results for proofs. It is shown that he anticipated important techniques and results of general proof theory as well as of resolution theory, if the latter is regarded as a part of structural proof theory. Furthermore, it is shown that Gentzen, in his first paper of 1933, which heavily draws on Hertz, proves a normal form result which corresponds to the completeness of propositional SLD-resolution in logic programming.
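To make the logic-programming notion concrete, here is a minimal sketch of propositional SLD-resolution in Python (the function name, the clause encoding, and the depth bound are illustrative assumptions, not anything from Hertz or Gentzen):

# A program is a list of definite clauses (head, body), where body is a
# tuple of atoms; a goal is a list of atoms to be proved.
def sld(program, goal, depth=0, max_depth=50):
    # Empty goal: the refutation has succeeded.
    if not goal:
        return True
    # Crude loop guard for this sketch.
    if depth > max_depth:
        return False
    selected, rest = goal[0], goal[1:]  # leftmost selection rule
    for head, body in program:
        if head == selected:
            # Resolve: replace the selected atom by the clause body.
            if sld(program, list(body) + rest, depth + 1, max_depth):
                return True
    return False

# Example program:  a :- b, c.   b.   c :- b.
program = [("a", ("b", "c")), ("b", ()), ("c", ("b",))]
print(sld(program, ["a"]))  # True
print(sld(program, ["d"]))  # False

The depth bound is only a loop guard for the sketch; completeness results of the kind mentioned concern the resolution rule itself, independently of any particular search strategy.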
Developing early results of Prawitz, Tennant proposed a criterion for an expression to count as a paradox in the framework of Gentzen’s natural deduction: paradoxical expressions give rise to non-normalizing derivations. Two distinct kinds of cases, going back to Crabbé and Tennant, show that the criterion overgenerates, that is, there are derivations which are intuitively non-paradoxical but which fail to normalize. Tennant’s proposed solution consists in reformulating natural deduction elimination rules in general form. Developing intuitions of Ekman, we show that the adoption of general rules has the consequence of hiding redundancies within derivations. Once reductions to get rid of the hidden redundancies are devised, it is clear that the adoption of general elimination rules offers no remedy to the overgeneration of the Prawitz–Tennant analysis. In this way, we indirectly provide further support for a solution to one of the two overgeneration cases developed in previous work.
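The general form at issue can be illustrated for implication (a standard formulation, given here as a sketch): instead of concluding B directly by modus ponens, the general elimination rule concludes an arbitrary C that has been derived from B as a dischargeable assumption:

\[
\frac{A \to B \qquad A \qquad \begin{matrix}[B]\\ \vdots\\ C\end{matrix}}{C}
\]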
This paper deals with Popper's little-known work on deductive logic, published between 1947 and 1949. According to his theory of deductive inference, the meaning of logical signs is determined by certain rules derived from "inferential definitions" of those signs. Although strong arguments have been presented against Popper's claims (e.g. by Curry, Kleene, Lejewski and McKinsey), his theory can be reconstructed when it is viewed primarily as an attempt to demarcate logical from non-logical constants rather than as a semantic foundation for logic. A criterion of logicality is obtained which is based on conjunction, implication and universal quantification as fundamental logical operations.
The interpretation of implications as rules motivates a different left-introduction schema for implication in the sequent calculus, which is conceptually more basic than the implication-left schema proposed by Gentzen. In correspondence with results obtained for systems with higher-level rules, the calculus based on this schema enjoys the subformula property and cut elimination in a weak form.
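For comparison, a plausible reconstruction of the contrast (the paper's exact schema may differ in detail): Gentzen's implication-left rule versus a modus-ponens-style rule in which A → B on the assumption side is used like the rule 'from A infer B':

\[
\frac{\Gamma \vdash A \qquad \Delta, B \vdash C}{\Gamma, \Delta, A \to B \vdash C}\,({\to}L\ \text{Gentzen})
\qquad
\frac{\Gamma \vdash A}{\Gamma, A \to B \vdash B}\,({\to}L\ \text{rules})
\]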
In Section 10 of Grundgesetze, Volume I, Frege advances a mathematical argument (known as the permutation argument), by means of which he intends to show that an arbitrary value-range may be identified with the True, and any other one with the False, without contradicting any stipulations previously introduced (we shall call this claim the identifiability thesis, following Schroeder-Heister (1987)). As far as we are aware, there is no consensus in the literature as to (i) the proper interpretation of the permutation argument and the identifiability thesis, (ii) the validity of the permutation argument, and (iii) the truth of the identifiability thesis. In this paper, we undertake a detailed technical study of the two main lines of interpretation, and gather some evidence for favoring one interpretation over the other.
Theories in the usual sense, as characterized by a language and a set of theorems in that language ("statement view"), are related to theories in the structuralist sense, in turn characterized by a set of potential models and a subset thereof as models ("non-statement view", J. Sneed, W. Stegmüller). It is shown that reductions of theories in the structuralist sense (that is, functions on structures) give rise to so-called "representations" of theories in the statement sense and vice versa, where representations are understood as functions that map sentences of one theory into another theory. It is argued that commensurability between theories should be based on functions on open formulas and open terms so that reducibility does not necessarily imply commensurability. This is in accordance with a central claim by Stegmüller on the compatibility of reducibility and incommensurability that has recently been challenged by D. Pearce.
We reconstruct Frege's treatment of certain deducibility problems posed by Boole. It turns out that in his formalization and solution of Boole's problems Frege anticipates the idea of propositional resolution.
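The rule anticipated is, in modern propositional form (a standard statement, not Frege's own notation): from two clauses containing a complementary pair of literals, infer their resolvent:

\[
\frac{C \lor p \qquad D \lor \neg p}{C \lor D}
\]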
This book constitutes the refereed proceedings of the 5th International Workshop on Extensions of Logic Programming, ELP '96, held in Leipzig, Germany in March 1996. The 18 full papers included were carefully selected by the program committee and are presented together with three invited papers. Among the topics addressed in this book are categorical logic programming, correctness of logic programs, functional-logic languages, implementation issues, linear logic programming, nonmonotonic reasoning, and proof search.
This volume contains finalized versions of papers presented at an international workshop on extensions of logic programming, held at the Seminar for Natural Language Systems at the University of Tübingen in December 1989. Several recent extensions of definite Horn clause programming, especially those with a proof-theoretic background, have much in common. One common thread is a new emphasis on hypothetical reasoning, which is typically inspired by Gentzen-style sequent or natural deduction systems. This is not only of theoretical significance, but also bears upon computational issues. It was one purpose of the workshop to bring some of these recent developments together. The volume covers topics such as the languages Lambda-Prolog, N-Prolog, and GCLA, the relationship between logic programming and functional programming, and the relationship between extensions of logic programming and automated theorem proving. It contains the results of the first conference concentrating on proof-theoretic approaches to logic programming.
MICHAEL D. RESNIK, Frege and the philosophy of mathematics. Ithaca and London: Cornell University Press, 1980. 244 pp. $16.50. HANS D. SLUGA, Gottlob Frege. London, Boston and Henley: Routledge & Kegan Paul, 1980. xi + 203 pp. £ 12.95.