In this short paper I note that a key metatheorem does not hold for the bilateralist inferential framework: harmony does not entail consistency. I conclude that the requirement of harmony will not suffice for a bilateralist to maintain a proof-theoretic account of classical logic, and that a proof-theoretic account of meaning based on the bilateralist framework has no natural way of distinguishing legitimate definitional inference rules from illegitimate ones (such as those for tonk). Finally, as an appendix to the main argument, I propose an alternative non-bilateral formal solution to the problem of providing a proof-theoretic account of classical logic.
In this paper I present a formalist philosophy of mathematics and apply it directly to Arithmetic. I propose that formalists concentrate on presenting compositional truth theories for mathematical languages that ultimately depend on formal methods. I argue that this proposal occupies a lush middle ground between traditional formalism, fictionalism, logicism and realism.
In this paper I present the strategy behind the proof-theoretic justification of logical inference. I then discuss how this strategy leads to the famous requirement that the inference rules for the logical constants should be in harmony. I argue that the proof-theoretic justification of the logical constants can be used to justify classical logic. To substantiate this I present a new normalisation theorem for first-order classical logic involving the Sheffer stroke. The proof of this theorem can be modified to yield a normalisation result for classical logic with conjunction, negation and the universal quantifier.
A famous proposed solution to the one over many problem is found in Plato. For example, it appears in the Parmenides and is introduced by Zeno arguing: "... if being is many, it must be both like and unlike, and that this is impossible, for neither can the like be unlike, nor the unlike like. Is that your position?" Socrates responds: "Do you not further think that there is an idea of likeness in itself, and another idea of unlikeness, which is the opposite of likeness, and that in these two, you and I and all other things to which we apply the term many, participate? Things which participate in likeness become in that degree and manner like; and so far as they participate in unlikeness become in that degree unlike, or both like and unlike in the degree in which they participate in both. And may not all things partake of both opposites, and be both like and unlike, by reason of this participation? Where is the wonder?" The discussion then moves on to the interesting question that...
We build on an existing term-sequent logic for the λ-calculus. We formulate a general sequent system that fully integrates αβη-reductions between untyped λ-terms into first-order logic. We prove a cut-elimination result and then offer an application of cut-elimination by giving a notion of uniform proof for λ-terms. We suggest how this allows us to view the calculus of untyped αβ-reductions as a logic programming language (as well as a functional programming language, as it is traditionally seen).
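The β-reductions the abstract refers to can be illustrated concretely. The following is a minimal sketch, not the paper's term-sequent system: untyped λ-terms with a naive substitution and leftmost-outermost β-reduction. All names here (`Term`, `subst`, `normalise`) are illustrative assumptions, and the substitution simply assumes bound variables are already distinct from the free variables being substituted in (a full implementation would α-convert).

```haskell
-- Untyped λ-terms: variables, abstractions, applications.
data Term = Var String | Lam String Term | App Term Term
  deriving (Eq, Show)

-- Naive substitution: subst x u t computes t[x := u].
-- Assumes no variable capture can occur (no α-conversion performed).
subst :: String -> Term -> Term -> Term
subst x u (Var y)   | x == y    = u
                    | otherwise = Var y
subst x u (Lam y t) | x == y    = Lam y t          -- x is shadowed
                    | otherwise = Lam y (subst x u t)
subst x u (App s t) = App (subst x u s) (subst x u t)

-- Leftmost-outermost β-reduction to normal form (may diverge
-- on terms with no normal form, e.g. (λx. x x)(λx. x x)).
normalise :: Term -> Term
normalise (App (Lam x t) u) = normalise (subst x u t)  -- β-step
normalise (App s t) =
  case normalise s of
    Lam x t' -> normalise (App (Lam x t') t)  -- head became a redex
    s'       -> App s' (normalise t)
normalise (Lam x t) = Lam x (normalise t)
normalise t = t

-- (λx. λy. x) a b  β-reduces to  a
example :: Term
example = App (App (Lam "x" (Lam "y" (Var "x"))) (Var "a")) (Var "b")
```

Running `normalise example` performs the two β-steps and returns `Var "a"`; it is exactly this reduction relation that the paper's sequent system internalises into first-order logic.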
Ways out: the proper relata of causal statements are large complexes of (macroscopic) conditions. For example: the spark, in the presence of oxygen together with flammable material, low humidity, etc. The entailment holds in virtue of the general macroscopic laws governing flammable materials, humidity, etc.
An intensional sentential context is one where such intersubstitutivity fails: "It is informative that A"; "It is known that A"; "It is necessary that A"; "The fact that A caused it to be that B". Some authors characterise intensional contexts by the failure of the intersubstitution of co-referring terms. Such a characterisation is problematic.
In this paper we make some observations about Natural Deduction derivations [Prawitz, 1965, van Dalen, 1986, Bell and Machover, 1977]. We assume the reader is familiar with natural deduction and with proof theory in general. Our development will be simple, even simple-minded, and concrete. However, it will also be evident that general ideas motivate our examples, and we think both our specific examples and the ideas behind them are interesting and may be useful to some readers. In a sentence, the bare technical content of this paper is: extending natural deduction with global well-formedness conditions can neatly and cheaply capture classical and intermediate logics. The interest here is in the ‘neatly’ and ‘cheaply’. By ‘neatly’ we mean ‘preserving proof-normalisation’ and ‘maintaining the subformula property’, and by ‘cheaply’ we mean ‘preserving the formal structure of deductions’ (so that a deduction in the original system is still, formally, a deduction in the extended system, and in particular it requires no extra effort to write just because it is in the extended system). To illustrate what we have in mind, consider intuitionistic first-order logic (FOL) [van Dalen, 1986] as a paradigmatic example of a formal notion of deduction. A natural deduction derivation (or deduction) is an inductively defined tree structure where each node contains an instance of a formula. A deduction is valid when each successive node follows from its predecessors in accordance with some predetermined inference rules. A particular attraction of Natural Deduction is its clean and economical presentation. Here for example are deduction (fragments) proving A ∧ B from A and B, and ∀x.(P(x) ∧ Q(x)) from ∀x.P(x) and ∀x.Q(x).
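The two deduction fragments mentioned at the end of the abstract can be rendered as a sketch in Lean 4 (an assumption of this note, not part of the paper itself), where natural deduction rules appear as term constructions: pairing for ∧-introduction and a function for ∀-introduction.

```lean
-- From A and B, derive A ∧ B (∧-introduction as the anonymous constructor).
example (A B : Prop) (ha : A) (hb : B) : A ∧ B :=
  ⟨ha, hb⟩

-- From ∀x. P x and ∀x. Q x, derive ∀x. (P x ∧ Q x):
-- ∀-elimination applies the hypotheses to x, ∀-introduction abstracts over x.
example {α : Type} (P Q : α → Prop)
    (hp : ∀ x, P x) (hq : ∀ x, Q x) : ∀ x, P x ∧ Q x :=
  fun x => ⟨hp x, hq x⟩
```

Each proof term mirrors the shape of the corresponding deduction tree, which is the point of the paper's examples.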
This concise text treats logic as a tool, “generated so that half the work involved in thinking is done for you by somebody else (the rules and laws of the logic).” Gabbay explains in a clear and careful manner how formal features of, and formal relations between, ordinary declarative sentences are captured by the systems of propositional and predicate logic.