The framework of natural deduction is extended by permitting rules as assumptions, which may be discharged in the course of a derivation. This leads to the concept of rules of higher levels and to a general schema for introduction and elimination rules for arbitrary n-ary sentential operators. With respect to this schema, (functional) completeness of "or", "if...then" and absurdity is proved.
The hypothetical notion of consequence is normally understood as the transmission of a categorical notion from premisses to conclusion. In model-theoretic semantics this categorical notion is 'truth', in standard proof-theoretic semantics it is 'canonical provability'. Three underlying dogmas, (I) the priority of the categorical over the hypothetical, (II) the transmission view of consequence, and (III) the identification of consequence and correctness of inference, are criticized from an alternative view of proof-theoretic semantics. It is argued that consequence is a basic semantical concept which is directly governed by elementary reasoning principles such as definitional closure and definitional reflection, and is not reduced to a categorical concept. This understanding of consequence makes it possible, in particular, to deal with non-well-founded phenomena as they arise from circular definitions.
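As a schematic illustration of these two principles (our rendering, not taken from the abstract): for an atom a defined by clauses a ⇐ B_1, ..., a ⇐ B_n, definitional closure reasons along a single clause, while definitional reflection reasons on all defining clauses at once:

```latex
% Definitional closure: one rule for each defining clause a <= B_i
\frac{\Gamma \vdash B_i}{\Gamma \vdash a}
\qquad
% Definitional reflection: one premiss for each defining clause
\frac{\Gamma, B_1 \vdash C \quad \cdots \quad \Gamma, B_n \vdash C}
     {\Gamma, a \vdash C}
```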
The term inversion principle goes back to Lorenzen, who coined it in the early 1950s. It was later used by Prawitz and others to describe the symmetric relationship between introduction and elimination inferences in natural deduction, sometimes also called harmony. In dealing with the invertibility of rules of an arbitrary atomic production system, Lorenzen's inversion principle has a much wider range than Prawitz's adaptation to natural deduction. It is closely related to definitional reflection, which is a principle for reasoning on the basis of rule-based atomic definitions, proposed by Hallnäs and Schroeder-Heister. After presenting definitional reflection and the inversion principle, it is shown that the inversion principle can be formally derived from definitional reflection, when the latter is viewed as a principle to establish admissibility. Furthermore, the relationship between definitional reflection and the inversion principle is investigated against the background of a universalization principle, called the ω-principle, which allows one to pass from the set of all defined substitution instances of a sequent to the sequent itself.
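A minimal textbook illustration of the harmony described by Prawitz's adaptation (our example, not drawn from the abstract): an introduction immediately followed by an elimination is a detour that can be reduced away, here for conjunction:

```latex
% Detour reduction: introducing A /\ B from A and B and then
% eliminating to A gives back the original derivation D_1 of A.
\frac{\dfrac{\mathcal{D}_1}{A} \qquad \dfrac{\mathcal{D}_2}{B}}
     {\dfrac{A \wedge B}{A}}
\;\rightsquigarrow\;
\dfrac{\mathcal{D}_1}{A}
```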
Several proof-theoretic notions of validity have been proposed in the literature, for which completeness of intuitionistic logic has been conjectured. We define validity for intuitionistic propositional logic in a way which is common to many of these notions, emphasizing that an appropriate notion of validity must be closed under substitution. In this definition we consider atomic systems whose rules are not only production rules, but may include rules that allow one to discharge assumptions. Our central result shows that Harrop’s rule is valid under substitution, which refutes the completeness conjecture for intuitionistic logic.
We present our calculus of higher-level rules, extended with propositional quantification within rules. This makes it possible to present general schemas for introduction and elimination rules for arbitrary propositional operators and to define what it means that introductions and eliminations are in harmony with each other. This definition does not presuppose any logical system, but is formulated in terms of rules themselves. We therefore speak of a foundational account of proof-theoretic harmony. With every set of introduction rules a canonical elimination rule, and with every set of elimination rules a canonical introduction rule is associated in such a way that the canonical rule is in harmony with the set of rules it is associated with. An example given by Hazen and Pelletier is used to demonstrate that there are significant connectives that are characterized by their elimination rules, and whose introduction rule is the canonical introduction rule associated with these elimination rules. Due to the availability of higher-level rules and propositional quantification, the means of expression of the framework developed are sufficient to ensure that the construction of canonical elimination or introduction rules is always possible and does not lead out of this framework.
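For a standard example (ours, not the paper's general schema) of how a canonical elimination rule is read off from given introduction rules, take disjunction: each introduction premiss becomes a dischargeable assumption of a minor premiss of the elimination rule:

```latex
% Introduction rules for A \/ B ...
\frac{A}{A \vee B} \qquad \frac{B}{A \vee B}
% ... and the associated canonical elimination rule, discharging
% the premiss of each introduction rule:
\frac{A \vee B \qquad
      \begin{array}{c}[A]\\ \vdots\\ C\end{array} \qquad
      \begin{array}{c}[B]\\ \vdots\\ C\end{array}}
     {C}
```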
The standard approach to what I call “proof-theoretic semantics”, which is mainly due to Dummett and Prawitz, attempts to give a semantics of proofs by defining what counts as a valid proof. After a discussion of the general aims of proof-theoretic semantics, this paper investigates in detail various notions of proof-theoretic validity and offers certain improvements of the definitions given by Prawitz. Particular emphasis is placed on the relationship between semantic validity concepts and validity concepts used in normalization theory. It is argued that these two sorts of concepts must be kept strictly apart.
In their Basic Logic, Sambin, Battilotti and Faggian give a foundation of logical inference rules by reference to certain reflection principles. We investigate the relationship between these principles and the principle of Definitional Reflection proposed by Hallnäs and Schroeder-Heister.
From the point of view of proof-theoretic semantics, it is argued that the sequent calculus with introduction rules on the assertion and on the assumption side represents deductive reasoning more appropriately than natural deduction. In taking consequence to be conceptually prior to truth, it can cope with non-well-founded phenomena such as contradictory reasoning. The fact that, in its typed variant, the sequent calculus has an explicit and separable substitution schema in the form of the cut rule, is seen as a crucial advantage over natural deduction, where substitution is built into the general framework.
In the 1920s, Paul Hertz (1881-1940) developed certain calculi based on structural rules only and established normal form results for proofs. It is shown that he anticipated important techniques and results of general proof theory as well as of resolution theory, if the latter is regarded as a part of structural proof theory. Furthermore, it is shown that Gentzen, in his first paper of 1933, which heavily draws on Hertz, proves a normal form result which corresponds to the completeness of propositional SLD-resolution in logic programming.
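To illustrate what propositional SLD-resolution amounts to, here is a hedged sketch in Python (the clause encoding and the name `sld_prove` are our own, not Hertz's or Gentzen's notation): a program is a list of propositional Horn clauses, and a goal list is resolved atom by atom against clause heads.

```python
# Minimal propositional SLD-resolution sketch (illustrative only).
# A program is a list of Horn clauses (head, [body atoms]);
# a goal is a list of atoms to be derived.

def sld_prove(program, goals, depth=50):
    """Return True if all goal atoms are derivable from the program."""
    if not goals:
        return True              # empty goal: success
    if depth == 0:
        return False             # crude guard against looping programs
    first, rest = goals[0], goals[1:]
    for head, body in program:
        if head == first:        # resolve the selected atom with a clause
            if sld_prove(program, body + rest, depth - 1):
                return True
    return False

# Example program:  a <- b, c ;  b <- ;  c <- b
program = [("a", ["b", "c"]), ("b", []), ("c", ["b"])]
```

Here `sld_prove(program, ["a"])` succeeds, while `sld_prove(program, ["d"])` fails, since no clause has head `d`.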
Prawitz observed that Russell’s paradox in naive set theory yields a derivation of absurdity whose reduction sequence loops. Building on this observation, and based on numerous examples, Tennant claimed that this looping feature, or more generally, the fact that derivations of absurdity do not normalize, is characteristic of the paradoxes. Striking results by Ekman show that looping reduction sequences are already obtained in minimal propositional logic, when certain reduction steps, which are prima facie plausible, are considered in addition to the standard ones. This shows that the notion of reduction is in need of clarification. Referring to the notion of identity of proofs in general proof theory, we argue that reduction steps should not merely remove redundancies, but must respect the identity of proofs. Consequently, we propose to modify Tennant’s paradoxicality test by basing it on this refined notion of reduction.
The interpretation of implications as rules motivates a different left-introduction schema for implication in the sequent calculus, which is conceptually more basic than the implication-left schema proposed by Gentzen. Corresponding to results obtained for systems with higher-level rules, it enjoys the subformula property and cut elimination in a weak form.
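For orientation (our rendering, which may differ in detail from the schema of the paper): Gentzen's implication-left rule contrasts with a modus-ponens-style left rule suggested by reading A→B as the rule "from A infer B":

```latex
% Gentzen's implication-left schema (intuitionistic form):
\frac{\Gamma \vdash A \qquad \Delta, B \vdash C}
     {\Gamma, \Delta, A \rightarrow B \vdash C}
\qquad
% A rule-based alternative: A -> B licenses passing from A to B.
\frac{\Gamma \vdash A}
     {\Gamma, A \rightarrow B \vdash B}
```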
Theories in the usual sense, as characterized by a language and a set of theorems in that language ("statement view"), are related to theories in the structuralist sense, in turn characterized by a set of potential models and a subset thereof as models ("non-statement view", J. Sneed, W. Stegmüller). It is shown that reductions of theories in the structuralist sense (that is, functions on structures) give rise to so-called "representations" of theories in the statement sense and vice versa, where representations are understood as functions that map sentences of one theory into another theory. It is argued that commensurability between theories should be based on functions on open formulas and open terms so that reducibility does not necessarily imply commensurability. This is in accordance with a central claim by Stegmüller on the compatibility of reducibility and incommensurability that has recently been challenged by D. Pearce.
This paper deals with Popper's little-known work on deductive logic, published between 1947 and 1949. According to his theory of deductive inference, the meaning of logical signs is determined by certain rules derived from "inferential definitions" of those signs. Although strong arguments have been presented against Popper's claims (e.g. by Curry, Kleene, Lejewski and McKinsey), his theory can be reconstructed when it is viewed primarily as an attempt to demarcate logical from non-logical constants rather than as a semantic foundation for logic. A criterion of logicality is obtained which is based on conjunction, implication and universal quantification as fundamental logical operations.
any other one with the False, without contradicting any stipulations previously introduced (we shall call this claim the identifiability thesis, following Schroeder-Heister). As far as we are aware, there is no consensus in the literature as to (i) the proper interpretation of the permutation argument and the identifiability thesis, (ii) the validity of the permutation argument, and (iii) the truth of the identifiability thesis. In this paper, we undertake a detailed technical study of the two main lines of interpretation, and gather some evidence for favoring one interpretation over the other.
We reconstruct Frege's treatment of certain deducibility problems posed by Boole. It turns out that in his formalization and solution of Boole's problems Frege anticipates the idea of propositional resolution.
MICHAEL D. RESNIK, Frege and the philosophy of mathematics. Ithaca and London: Cornell University Press, 1980. 244 pp. $16.50. HANS D. SLUGA, Gottlob Frege. London, Boston and Henley: Routledge & Kegan Paul, 1980. xi + 203 pp. £ 12.95.