Gentzen's systems of natural deduction and sequent calculus were byproducts of his program of proving the consistency of arithmetic and analysis. It is suggested that the central component in his results on logical calculi was the use of a tree form for derivations. It allows the composition of derivations and the permutation of the order of application of rules, with full control over the structure of derivations as a result. Recently found documents shed new light on the discovery of these calculi. In particular, Gentzen set up five different forms of natural calculi and gave a detailed proof of normalization for intuitionistic natural deduction. An early handwritten manuscript of his thesis shows that a direct translation from natural deduction to the axiomatic logic of Hilbert and Ackermann was, in addition to the influence of Paul Hertz, the second component in the discovery of sequent calculus. A system intermediate between the sequent calculus LI and axiomatic logic, denoted LIG in unpublished sources, is implicit in Gentzen's published thesis of 1934–35. The calculus has half rules, half “groundsequents,” and does not allow full cut elimination. Nevertheless, a translation from LI to LIG in the published thesis gives a subformula property for a complete class of derivations in LIG. After the thesis, Gentzen continued to work on variants of sequent calculi for ten more years, in the hope of finding a consistency proof for arithmetic within an intuitionistic calculus.
The Löwenheim-Skolem theorem was published in Skolem's long paper of 1920, with the first section dedicated to the theorem. The second section of the paper contains a proof-theoretical analysis of derivations in lattice theory. The main result, otherwise believed to have been established in the late 1980s, was a polynomial-time decision algorithm for these derivations. Skolem did not develop any notation for the representation of derivations, which makes the proofs of his results hard to follow. Such a formal notation is given here, by which these proofs become transparent. A third section of Skolem's paper gives an analysis of derivations in plane projective geometry. To clear a gap in Skolem's result, a new conservativity property is shown for projective geometry, to the effect that a proper use of the axiom that gives the uniqueness of connecting lines and intersection points requires a conclusion with proper cases (logically, a disjunction in a positive part) to be proved. The forgotten parts of Skolem's first paper on the Löwenheim-Skolem theorem are perhaps the earliest combinatorial analyses of formal mathematical proofs, and at least the earliest such analyses with profound results.
A proof-theoretical analysis of elementary theories of order relations is effected through the formulation of order axioms as mathematical rules added to contraction-free sequent calculus. Among the results obtained are proof-theoretical formulations of conservativity theorems corresponding to Szpilrajn’s theorem on the extension of a partial order into a linear one. Decidability of the theories of partial and linear order for quantifier-free sequents is shown by giving terminating methods of proof-search.
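The idea behind terminating proof-search for such quantifier-free sequents can be sketched in code. The following is an illustrative model, not the paper's calculus: the mathematical rules of strict partial order only add atoms of the form a < b to the antecedent, so saturation under transitivity terminates, with irreflexivity closing any branch that derives a < a. All names here are for illustration only.

```python
# Illustrative sketch of terminating proof-search for quantifier-free
# sequents in the theory of strict partial order (not the paper's system).
def derives(atoms, goal):
    """atoms: set of pairs (a, b), read as a < b; goal: a pair or None."""
    closed = set(atoms)
    changed = True
    while changed:  # transitivity as a rule: from a < b and b < c, add a < c
        changed = False
        for (a, b) in list(closed):
            for (c, d) in list(closed):
                if b == c and (a, d) not in closed:
                    closed.add((a, d))
                    changed = True
    # Irreflexivity: an atom a < a closes the branch, so anything follows.
    if any(a == b for (a, b) in closed):
        return True
    return goal in closed

assert derives({("a", "b"), ("b", "c")}, ("a", "c"))   # by transitivity
assert derives({("a", "b"), ("b", "a")}, None)          # inconsistent antecedent
assert not derives({("a", "b")}, ("b", "a"))            # no derivation exists
```

Termination is immediate: only finitely many atoms can be formed from the terms in the sequent, and each rule application strictly grows the saturated set.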
Attention is drawn to the fact that what is alternatively known as Dummett logic, Gödel logic, or Gödel-Dummett logic, was actually introduced by Skolem already in 1913. A related work of 1919 introduces implicative lattices, or Heyting algebras in today's terminology.
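The characteristic feature of this logic, the validity of the linearity axiom (A → B) ∨ (B → A), can be checked directly against the standard many-valued semantics on [0, 1], with min for conjunction, max for disjunction, and the Gödel implication. The following check is a minimal illustration of that semantics, not anything from Skolem's papers:

```python
# Gödel(-Dummett) logic semantics on [0, 1]: min/max for conjunction and
# disjunction, and the Gödel implication below. Illustrative sketch only.
def impl(a, b):
    # Gödel implication: value 1 if a <= b, and b otherwise
    return 1.0 if a <= b else b

def disj(a, b):
    return max(a, b)

def linearity(a, b):
    # the characteristic axiom (A -> B) v (B -> A)
    return disj(impl(a, b), impl(b, a))

# The axiom takes value 1 for every pair of truth values on a grid.
grid = [i / 10 for i in range(11)]
assert all(linearity(a, b) == 1.0 for a in grid for b in grid)
```

The check succeeds because for any pair of values, at least one of a ≤ b and b ≤ a holds, so one disjunct evaluates to 1.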
A sequent calculus is given in which the management of weakening and contraction is organized as in natural deduction. The latter has no explicit weakening or contraction, but vacuous and multiple discharges in rules that discharge assumptions. A comparison to natural deduction is given through translation of derivations between the two systems. It is proved that if a cut formula is never principal in a derivation leading to the right premiss of cut, it is a subformula of the conclusion. Therefore it is sufficient to eliminate those cuts that correspond to detour and permutation conversions in natural deduction.
Gentzen's original proof of the Hauptsatz used a rule of multicut in the case in which the right premiss of cut was derived by contraction. Cut elimination is proved here without multicut, by suitably transforming the derivation of the premiss of the contraction.
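For comparison, the two rules can be displayed as follows (a standard rendering, not Gentzen's original notation): cut removes a single occurrence of the cut formula from the right premiss, whereas multicut removes several occurrences at once, which is what makes it compatible with a preceding contraction:
\[
\frac{\Gamma \Rightarrow \Delta, A \qquad A, \Gamma' \Rightarrow \Delta'}
     {\Gamma, \Gamma' \Rightarrow \Delta, \Delta'}\;\mathit{cut}
\qquad\qquad
\frac{\Gamma \Rightarrow \Delta, A \qquad A, \ldots, A, \Gamma' \Rightarrow \Delta'}
     {\Gamma, \Gamma' \Rightarrow \Delta, \Delta'}\;\mathit{multicut}
\]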
The structure of derivations in natural deduction is analyzed through isomorphism with a suitable sequent calculus, with twelve hidden convertibilities revealed in usual natural deduction. A general formulation of conjunction and implication elimination rules is given, analogous to disjunction elimination. Normalization through permutative conversions now applies in all cases. Derivations in normal form have all major premisses of elimination rules as assumptions. Conversion in any order terminates. Through the condition that in a cut-free derivation of the sequent Γ⇒C no inactive weakening or contraction formulas remain in Γ, a correspondence with the formal derivability relation of natural deduction is obtained: All formulas of Γ become open assumptions in natural deduction, through an inductively defined translation. Weakenings are interpreted as vacuous discharges, and contractions as multiple discharges. In the other direction, non-normal derivations translate into derivations with cuts having the cut formula principal either in both premisses or in the right premiss only.
A way is found to add axioms to sequent calculi that maintains the eliminability of cut, through the representation of axioms as rules of inference of a suitable form. By this method, the structural analysis of proofs is extended from pure logic to free-variable theories, covering all classical theories and a wide class of constructive theories. All results are proved for systems in which the rules of weakening and contraction can also be eliminated. Applications include a system of predicate logic with equality in which cuts on the equality axioms are also eliminated.
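The shape of the conversion can be sketched as follows (an illustrative rendering, not the paper's exact notation): a quantifier-free axiom of the form $P_1 \,\&\, \cdots \,\&\, P_m \supset Q_1 \lor \cdots \lor Q_n$, with the $P_i$ and $Q_j$ atomic, is turned into a left rule with one premiss for each disjunct:
\[
\frac{Q_1, P_1, \ldots, P_m, \Gamma \Rightarrow \Delta
      \quad \cdots \quad
      Q_n, P_1, \ldots, P_m, \Gamma \Rightarrow \Delta}
     {P_1, \ldots, P_m, \Gamma \Rightarrow \Delta}
\]
Because such rules act on atomic formulas only, they never interfere with the logical rules, and cuts can be permuted above them and eliminated as in pure logic.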
Three things are presented: how Hilbert changed the original construction postulates of his geometry into existential axioms; in what sense he formalized geometry; and how elementary geometry is formalized to present-day standards.
Mathematical astronomy, as the most developed branch of the ancient exact sciences, has been widely discussed, especially as regards epistemological issues, e.g., astronomy as a prime example of the distinction between instrumentalist and realist understandings of theories. In contrast, the methodology of ancient astronomy itself has received little attention. Following the work of Jaakko Hintikka and Unto Remes, Aristarchus' method of determining the distance of the Sun is sketched, and Ptolemy's solar model is discussed in detail.
Bruno de Finetti's earliest works on the foundations of probability are reviewed. These include the notion of exchangeability and the theory of random processes with independent increments. The latter theory relates to de Finetti's ideas for a probabilistic science more generally. Different aspects of his work are united by his foundational programme for a theory of subjective probabilities.
A formulation of probabilistic causality is given in terms of the theory of abstract dynamical systems. Causal factors are identified as invariants of motion of a system. Repetition of an experiment leads to the notion of stationarity, and causal factors yield a decomposition of the stationary probability law of the experiment into ergodic components. In these, statistical behaviour is uniform. Control of identified causal factors leads to a corresponding statistical law for the events, which is offered as a notion of probabilistic causality. After a suggestion by Feller, randomization is identified as mixing, formulated in the above terms.
This paper discusses different interpretations of probability in relation to determinism. It is argued that both objective and subjective views on probability can be compatible with deterministic as well as indeterministic situations. The possibility of a conceptual independence between probability and determinism is argued to hold on a general level. The subsequent philosophical analysis of recent advances in classical statistical mechanics (ergodic theory) is of independent interest, but also adds weight to the claim that it is possible to justify an objective interpretation of probabilities in a theory having as a basis the paradigmatically deterministic theory of classical mechanics.
De Finetti's representation theorem of exchangeable probabilities as unique mixtures of Bernoullian probabilities is a special case of a result known as the ergodic decomposition theorem. It says that stationary probability measures are unique mixtures of ergodic measures. Stationarity implies convergence of relative frequencies, and ergodicity the uniqueness of limits. Ergodicity therefore captures exactly the idea of objective probability as a limit of relative frequency (up to a set of measure zero), without the unnecessary restriction to probabilistically independent events as in de Finetti's theorem. The ergodic decomposition has in some applications to dynamical systems a physical content, and de Finetti's reductionist interpretation of his result is not adequate in these cases.
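The division of labour between stationarity and ergodicity can be illustrated by simulation. The following is a minimal sketch, not from the paper: a mixture of two Bernoulli "ergodic components" is stationary but not ergodic, so relative frequencies converge, yet the limit depends on the component drawn and is not unique; uniqueness is exactly what ergodicity adds. All names and parameters below are illustrative.

```python
import random

def relative_frequency(p, n, seed):
    # relative frequency of successes in n Bernoulli(p) trials
    rng = random.Random(seed)
    return sum(rng.random() < p for _ in range(n)) / n

def sample_limit(n, seed):
    # A stationary but non-ergodic process: first draw an ergodic
    # component (Bernoulli with p = 0.2 or p = 0.8), then run it.
    rng = random.Random(seed)
    p = 0.2 if rng.random() < 0.5 else 0.8
    return relative_frequency(p, n, seed + 1)

freqs = [sample_limit(100_000, s) for s in range(6)]
# Each relative frequency settles near 0.2 or near 0.8, never in between:
# the limit exists (stationarity) but is not unique (failure of ergodicity).
assert all(abs(f - 0.2) < 0.01 or abs(f - 0.8) < 0.01 for f in freqs)
```

Within a single ergodic component, by contrast, the limit is the same for almost every run, which is the sense in which ergodicity underwrites an objective limiting-frequency probability.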
De Finetti's representation theorem is a special case of the ergodic decomposition of stationary probability measures. The problems of the interpretation of probabilities centred around de Finetti's theorem are extended to this more general situation. The ergodic decomposition theorem has a physical background in the ergodic theory of dynamical systems. Thereby the interpretations of probabilities in the cases of de Finetti's theorem and its generalization and in ergodic theory are systematically connected to each other.