This presentation of Aristotle's natural deduction system supplements earlier presentations and gives more historical evidence. Some fine-tunings resulted from conversations with Timothy Smiley, Charles Kahn, Josiah Gould, John Kearns, John Glanville, and William Parry. The criticism of Aristotle's theory of propositions found at the end of this 1974 presentation was retracted in Corcoran's 2009 HPL article "Aristotle's demonstrative logic".
In previous articles, it has been shown that the deductive system developed by Aristotle in his "second logic" is a natural deduction system and not an axiomatic system as had previously been thought. It was also stated that Aristotle's logic is self-sufficient in two senses: first, that it presupposes no other logical concepts, not even those of propositional logic; second, that it is (strongly) complete in the sense that every valid argument expressible in the language of the system is deducible by means of a formal deduction in the system. Review of the system makes the first point obvious. The purpose of the present article is to prove the second: strong completeness is demonstrated for the Aristotelian system.
Argumentations are at the heart of the deductive and the hypothetico-deductive methods, which are involved in attempts to reduce currently open problems to problems already solved. These two methods span the entire spectrum of problem-oriented reasoning from the simplest and most practical to the most complex and most theoretical, thereby uniting all objective thought, whether ancient or contemporary, humanistic or scientific, normative or descriptive, concrete or abstract. Analysis, synthesis, evaluation, and function of argumentations are described. Perennial philosophic problems, epistemic and ontic, related to argumentations are put in perspective. So much of what has been regarded as logic is seen to be involved in the study of argumentations that logic may be usefully defined as the systematic study of argumentations, which is virtually identical to the quest for objective understanding of objectivity.
For each positive n, two alternative axiomatizations of the theory of strings over n alphabetic characters are presented. One class of axiomatizations derives from Tarski's system of the Wahrheitsbegriff and uses the n characters and concatenation as primitives. The other class involves using n character-prefixing operators as primitives and derives from Hermes' Semiotik. All underlying logics are second order. It is shown that, for each n, the two theories are definitionally equivalent (or synonymous in the sense of de Bouvere). It is further shown that each member of one class is synonymous with each member of the other class; thus all of the theories are definitionally equivalent with each other and with Peano arithmetic. Categoricity of Peano arithmetic then implies categoricity of each of the above theories.
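The definitional equivalence with Peano arithmetic rests on a familiar correspondence: strings over an n-character alphabet can be matched one-to-one with the natural numbers. A minimal sketch of that bijection via bijective base-n numeration (the alphabet, function names, and encoding are my own illustration, not the paper's notation):

```python
# Sketch: the bijection underlying the equivalence of string theory and arithmetic.
# Strings over an n-character alphabet correspond one-to-one with natural numbers
# under bijective base-n numeration: '' <-> 0, 'a' <-> 1, 'b' <-> 2, ..., 'aa' <-> n+1.

ALPHABET = "abc"  # n = 3 here; any finite alphabet works
N = len(ALPHABET)

def string_to_number(s: str) -> int:
    """Bijective base-n value of a string: '' -> 0, 'a' -> 1, ..., 'aa' -> 4, ..."""
    value = 0
    for ch in s:
        value = value * N + (ALPHABET.index(ch) + 1)
    return value

def number_to_string(k: int) -> str:
    """Inverse of string_to_number."""
    chars = []
    while k > 0:
        k, rem = divmod(k - 1, N)
        chars.append(ALPHABET[rem])
    return "".join(reversed(chars))
```

Concatenation of strings then corresponds to a definable arithmetic operation on the codes, which is the shape of fact that definitional-equivalence proofs exploit.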
After a short preface, the first of the three sections of this paper is devoted to historical and philosophic aspects of categoricity. The second section is a self-contained exposition, including detailed definitions, of a proof that every mathematical system whose domain is the closure of its set of distinguished individuals under its distinguished functions is categorically characterized by its induction principle together with its true atoms (atomic sentences and negations of atomic sentences). The third section deals with applications, especially those involving the distinction between characterizing a system and axiomatizing the truths of a system.
Published with the aid of a grant from the National Endowment for the Humanities. Contains the only complete English-language text of The Concept of Truth in Formalized Languages. Tarski made extensive corrections and revisions of the original translations for this edition, along with new historical remarks. It includes a new preface and a new analytical index for use by philosophers and linguists as well as by historians of mathematics and philosophy.
Prior Analytics by the Greek philosopher Aristotle (384–322 BCE) and Laws of Thought by the English mathematician George Boole (1815–1864) are the two most important surviving original logical works from before the advent of modern logic. This article has a single goal: to compare Aristotle’s system with the system that Boole constructed over twenty-two centuries later intending to extend and perfect what Aristotle had started. This comparison merits an article in itself. Accordingly, this article does not discuss many other historically and philosophically important aspects of Boole’s book, e.g. his confused attempt to apply differential calculus to logic, his misguided effort to make his system of ‘class logic’ serve as a kind of ‘truth-functional logic’, his now almost forgotten foray into probability theory, or his blindness to the fact that a truth-functional combination of equations that follows from a given truth-functional combination of equations need not follow truth-functionally. One of the main conclusions is that Boole’s contribution widened logic and changed its nature to such an extent that he fully deserves to share with Aristotle the status of being a founding figure in logic. By setting forth in clear and systematic fashion the basic methods for establishing validity and for establishing invalidity, Aristotle became the founder of logic as formal epistemology. By making the first unmistakable steps toward opening logic to the study of ‘laws of thought’—tautologies and laws such as excluded middle and non-contradiction—Boole became the founder of logic as formal ontology.
This book treats ancient logic: the logic that originated in Greece with Aristotle and the Stoics, mainly in the hundred-year period beginning about 350 BCE. Ancient logic was never completely ignored by modern logic from its Boolean origin in the middle 1800s: it was prominent in Boole’s writings and it was mentioned by Frege and by Hilbert. Nevertheless, the first century of mathematical logic did not take it seriously enough to study the ancient logic texts. A renaissance in ancient logic studies occurred in the early 1950s with the publication of the landmark Aristotle’s Syllogistic by Jan Łukasiewicz (Oxford UP 1951, 2nd ed. 1957). Despite its title, it treats the logic of the Stoics as well as that of Aristotle. Łukasiewicz was a distinguished mathematical logician. He had created many-valued logic and the parenthesis-free prefix notation known as Polish notation. He co-authored with Alfred Tarski an important paper on the metatheory of propositional logic, and he was one of Tarski’s three main teachers at the University of Warsaw. Łukasiewicz’s stature was just short of that of the giants: Aristotle, Boole, Frege, Tarski, and Gödel. No mathematical logician of his caliber had ever before quoted the actual teachings of ancient logicians.
Not only did Łukasiewicz inject fresh hypotheses, new concepts, and imaginative modern perspectives into the field; his enormous prestige and that of the Warsaw School of Logic reflected on the whole field of ancient logic studies. Suddenly, this previously somewhat dormant and obscure field became active and gained in respectability and importance in the eyes of logicians, mathematicians, linguists, analytic philosophers, and historians. Next to Aristotle himself and perhaps the Stoic logician Chrysippus, Łukasiewicz is the most prominent figure in ancient logic studies. A huge literature traces its origins to Łukasiewicz.
This book, Ancient Logic and Its Modern Interpretations, is based on the 1973 Buffalo Symposium on Modernist Interpretations of Ancient Logic, the first conference devoted entirely to critical assessment of the state of ancient logic studies.
Information-theoretic approaches to formal logic analyse the "common intuitive" concept of propositional implication (or argumental validity) in terms of information content of propositions and sets of propositions: one given proposition implies a second if the former contains all of the information contained by the latter; an argument is valid if the conclusion contains no information beyond that of the premise-set. This paper locates information-theoretic approaches historically, philosophically and pragmatically. Advantages and disadvantages are identified by examining such approaches in themselves and by contrasting them with standard transformation-theoretic approaches. Transformation-theoretic approaches analyse validity (and thus implication) in terms of transformations that map one argument onto another: a given argument is valid if no transformation carries it onto an argument with all true premises and false conclusion. Model-theoretic, set-theoretic, and substitution-theoretic approaches, which dominate current literature, can be construed as transformation-theoretic, as can the so-called possible-worlds approaches. Ontic and epistemic presuppositions of both types of approaches are considered. Attention is given to the question of whether our historically cumulative experience applying logic is better explained from a purely information-theoretic perspective or from a purely transformation-theoretic perspective or whether apparent conflicts between the two types of approaches need to be reconciled in order to forge a new type of approach that recognizes their basic complementarity.
Since the time of Aristotle's students, interpreters have considered Prior Analytics to be a treatise about deductive reasoning, more generally, about methods of determining the validity and invalidity of premise-conclusion arguments. People studied Prior Analytics in order to learn more about deductive reasoning and to improve their own reasoning skills. These interpreters understood Aristotle to be focusing on two epistemic processes: first, the process of establishing knowledge that a conclusion follows necessarily from a set of premises (that is, on the epistemic process of extracting information implicit in explicitly given information) and, second, the process of establishing knowledge that a conclusion does not follow. Despite the overwhelming tendency to interpret the syllogistic as formal epistemology, it was not until the early 1970s that it occurred to anyone to think that Aristotle may have developed a theory of deductive reasoning with a well worked-out system of deductions comparable in rigor and precision with systems such as propositional logic or equational logic familiar from mathematical logic. When modern logicians in the 1920s and 1930s first turned their attention to the problem of understanding Aristotle's contribution to logic in modern terms, they were guided both by the Frege-Russell conception of logic as formal ontology and at the same time by a desire to protect Aristotle from possible charges of psychologism. They thought they saw Aristotle applying the informal axiomatic method to formal ontology, not as making the first steps into formal epistemology. They did not notice Aristotle's description of deductive reasoning. Ironically, the formal axiomatic method (in which one explicitly presents not merely the substantive axioms but also the deductive processes used to derive theorems from the axioms) is incipient in Aristotle's presentation.
Partly in opposition to the axiomatic, ontically-oriented approach to Aristotle's logic and partly as a result of attempting to increase the degree of fit between interpretation and text, logicians in the 1970s working independently came to remarkably similar conclusions to the effect that Aristotle indeed had produced the first system of formal deductions. They concluded that Aristotle had analyzed the process of deduction and that his achievement included a semantically complete system of natural deductions including both direct and indirect deductions. Where the interpretations of the 1920s and 1930s attribute to Aristotle a system of propositions organized deductively, the interpretations of the 1970s attribute to Aristotle a system of deductions, or extended deductive discourses, organized epistemically. The logicians of the 1920s and 1930s take Aristotle to be deducing laws of logic from axiomatic origins; the logicians of the 1970s take Aristotle to be describing the process of deduction and in particular to be describing deductions themselves, both those deductions that are proofs based on axiomatic premises and those deductions that, though deductively cogent, do not establish the truth of the conclusion but only that the conclusion is implied by the premise-set. Thus, two very different and opposed interpretations had emerged, interestingly both products of modern logicians equipped with the theoretical apparatus of mathematical logic. The issue at stake between these two interpretations is the historical question of Aristotle's place in the history of logic and of his orientation in philosophy of logic. This paper affirms Aristotle's place as the founder of logic taken as formal epistemology, including the study of deductive reasoning. A by-product of this study of Aristotle's accomplishments in logic is a clarification of a distinction implicit in discourses among logicians—that between logic as formal ontology and logic as formal epistemology.
The syllogistic figures and moods can be taken to be argument schemata as can the rules of the Stoic propositional logic. Sentence schemata have been used in axiomatizations of logic only since the landmark 1927 von Neumann paper [31]. Modern philosophers know the role of schemata in explications of the semantic conception of truth through Tarski’s 1933 Convention T [42]. Mathematical logicians recognize the role of schemata in first-order number theory where Peano’s second-order Induction Axiom is approximated by Herbrand’s Induction-Axiom Schema [23]. Similarly, in first-order set theory, Zermelo’s second-order Separation Axiom is approximated by Fraenkel’s first-order Separation Schema [17]. In some of several closely related senses, a schema is a complex system having multiple components one of which is a template-text or scheme-template, a syntactic string composed of one or more “blanks” and also possibly significant words and/or symbols. In accordance with a side condition the template-text of a schema is used as a “template” to specify a multitude, often infinite, of linguistic expressions such as phrases, sentences, or argument-texts, called instances of the schema. The side condition is a second component. The collection of instances may but need not be regarded as a third component. The instances are almost always considered to come from a previously identified language (whether formal or natural), which is often considered to be another component. This article reviews the often-conflicting uses of the expressions ‘schema’ and ‘scheme’ in the literature of logic. It discusses the different definitions presupposed by those uses. And it examines the ontological and epistemic presuppositions circumvented or mooted by the use of schemata, as well as the ontological and epistemic presuppositions engendered by their use. In short, this paper is an introduction to the history and philosophy of schemata.
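The template-plus-side-condition picture described above can be made concrete in a few lines. The following sketch is my own illustration (the template syntax, the particular side condition, and the example fillings are assumptions, not the article's notation); it generates instances of a first-order induction-axiom schema from admissible fillings of its one blank:

```python
# A schema modeled as: a template-text with one blank, plus a side condition
# restricting the admissible fillings of that blank.
from string import Template

# Template-text of a first-order Induction-Axiom Schema with one blank, P.
induction_template = Template("[${P}(0) & Ax(${P}(x) -> ${P}(sx))] -> Ax ${P}(x)")

def side_condition(filling: str) -> bool:
    """Illustrative side condition: the filling must be a well-formed
    predicate name of the language (here: a Python-style identifier)."""
    return filling.isidentifier()

def instances(template: Template, fillings):
    """Yield the instances of the schema, one per admissible filling."""
    for filling in fillings:
        if side_condition(filling):
            yield template.substitute(P=filling)

print(list(instances(induction_template, ["Even", "Prime", "0bad"])))
```

Each admissible filling yields one instance; here two instances are produced, since "0bad" fails the side condition. The collection of instances can then be treated as a further component of the schema, as the article discusses.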
Demonstrative logic, the study of demonstration as opposed to persuasion, is the subject of Aristotle's two-volume Analytics. Many examples are geometrical. Demonstration produces knowledge (of the truth of propositions). Persuasion merely produces opinion. Aristotle presented a general truth-and-consequence conception of demonstration meant to apply to all demonstrations. According to him, a demonstration, which normally proves a conclusion not previously known to be true, is an extended argumentation beginning with premises known to be truths and containing a chain of reasoning showing by deductively evident steps that its conclusion is a consequence of its premises. In particular, a demonstration is a deduction whose premises are known to be true. Aristotle's general theory of demonstration required a prior general theory of deduction presented in the Prior Analytics. His general immediate-deduction-chaining conception of deduction was meant to apply to all deductions. According to him, any deduction that is not immediately evident is an extended argumentation that involves a chaining of intermediate immediately evident steps that shows its final conclusion to follow logically from its premises. To illustrate his general theory of deduction, he presented an ingeniously simple and mathematically precise special case traditionally known as the categorical syllogistic.
Thirteen meanings of 'implication' are described and compared. Among them are relations that have been called: logical implication, material implication, deductive implication, formal implication, enthymemic implication, and factual implication. In a given context, implication is the homogeneous two-place relation expressed by the relation verb 'implies'. For heuristic and expository reasons this article skirts many crucial issues including use-mention, the nature of the entities that imply and are implied, and the processes by which knowledge of these relations is achieved. This paper is better thought of as an early stage of a dialogue than as a definitive treatise.
One innovation in this paper is its identification, analysis, and description of a troubling ambiguity in the word ‘argument’. In one sense ‘argument’ denotes a premise-conclusion argument: a two-part system composed of a set of sentences—the premises—and a single sentence—the conclusion. In another sense it denotes a premise-conclusion-mediation argument—later called an argumentation: a three-part system composed of a set of sentences—the premises—a single sentence—the conclusion—and a complex of sentences—the mediation. The latter is often intended to show that the conclusion follows from the premises. The complementarity and interrelation of premise-conclusion arguments and premise-conclusion-mediation arguments resonate throughout the rest of the paper, which articulates the conceptual structure found in logic from Aristotle to Tarski. This 1972 paper can be seen as anticipating Corcoran’s signature work: the more widely read 1989 paper, Argumentations and Logic, Argumentation 3, 17–43. MR91b:03006. The 1972 paper was translated into Portuguese. The 1989 paper was translated into Spanish, Portuguese, and Persian.
In the present article we attempt to show that Aristotle's syllogistic is an underlying logic which includes a natural deductive system and that it is not an axiomatic theory as had previously been thought. We construct a mathematical model which reflects certain structural aspects of Aristotle's logic. We examine the relation of the model to the system of logic envisaged in scattered parts of Prior and Posterior Analytics. Our interpretation restores Aristotle's reputation as a logician of consummate imagination and skill. Several attributions of shortcomings and logical errors to Aristotle are shown to be without merit. Aristotle's logic is found to be self-sufficient in several senses: his theory of deduction is logically sound in every detail. (His indirect deductions have been criticized, but incorrectly on our account.) Aristotle's logic presupposes no other logical concepts, not even those of propositional logic. The Aristotelian system is seen to be complete in the sense that every valid argument expressible in his system admits of a deduction within his deductive system: every semantically valid argument is deducible.
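The semantic side of such a completeness claim can be made mechanical for the syllogistic: validity is checkable by exhaustive search for countermodels over a small universe, in the spirit of a set-theoretic model in which terms denote nonempty sets. The encoding below, the A/E/I/O letter codes, and the three-element search bound are my own assumptions for illustration, not details from the article:

```python
from itertools import combinations, product

def nonempty_subsets(universe):
    """All nonempty subsets of a finite universe (candidate term denotations)."""
    items = sorted(universe)
    return [frozenset(c) for r in range(1, len(items) + 1)
            for c in combinations(items, r)]

def holds(statement, interp):
    """Truth of an A/E/I/O categorical sentence, read set-theoretically."""
    form, s, p = statement          # ("A", "M", "P") reads "every M is P"
    S, P = interp[s], interp[p]
    return {"A": S <= P,            # every S is P
            "E": not (S & P),       # no S is P
            "I": bool(S & P),       # some S is P
            "O": not (S <= P)}[form]

def valid(premises, conclusion, universe=range(3)):
    """True iff no interpretation of the terms as nonempty subsets of the
    (small, illustrative) universe makes the premises true and the conclusion false."""
    terms = sorted({t for st in premises + [conclusion] for t in st[1:]})
    for choice in product(nonempty_subsets(universe), repeat=len(terms)):
        interp = dict(zip(terms, choice))
        if all(holds(p, interp) for p in premises) and not holds(conclusion, interp):
            return False            # countermodel found
    return True

# Barbara is valid; reversing its conclusion yields an invalid argument.
assert valid([("A", "M", "P"), ("A", "S", "M")], ("A", "S", "P"))
assert not valid([("A", "M", "P"), ("A", "S", "M")], ("A", "P", "S"))
```

A deductive system in the reconstructed sense would derive exactly the arguments this semantic check certifies, which is what the completeness claim asserts.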
This study concerns logical systems considered as theories. By searching for the problems which the traditionally given systems may reasonably be intended to solve, we clarify the rationales for the adequacy criteria commonly applied to logical systems. From this point of view there appear to be three basic types of logical systems: those concerned with logical truth; those concerned with logical truth and with logical consequence; and those concerned with deduction per se as well as with logical truth and logical consequence. Adequacy criteria for systems of the first two types include: effectiveness, soundness, completeness, Post completeness, "strong soundness" and strong completeness. Consideration of a logical system as a theory of deduction leads us to attempt to formulate two adequacy criteria for systems of proofs. The first deals with the concept of rigor or "gaplessness" in proofs. The second is a completeness condition for a system of proofs. An historical note at the end of the paper suggests a remarkable parallel between the above hierarchy of systems and the actual historical development of this area of logic.
This paper discusses the history of the confusion and controversies over whether the definition of consequence presented in the 11-page 1936 Tarski consequence-definition paper is based on a monistic fixed-universe framework—like Begriffsschrift and Principia Mathematica. Monistic fixed-universe frameworks, common in pre-WWII logic, keep the range of the individual variables fixed as the class of all individuals. The contrary alternative is that the definition is predicated on a pluralistic multiple-universe framework—like the 1931 Gödel incompleteness paper. A pluralistic multiple-universe framework recognizes multiple universes of discourse serving as different ranges of the individual variables in different interpretations—as in post-WWII model theory. In the early 1960s, many logicians—mistakenly, as we show—held the ‘contrary alternative’ that Tarski 1936 had already adopted a Gödel-type, pluralistic, multiple-universe framework. We explain that Tarski had not yet shifted out of the monistic, Frege–Russell, fixed-universe paradigm. We further argue that between his Principia-influenced pre-WWII Warsaw period and his model-theoretic post-WWII Berkeley period, Tarski's philosophy underwent many other radical changes.
It is one thing for a given proposition to follow or not to follow from a given set of propositions and it is quite another thing for it to be shown either that the given proposition follows or that it does not follow. Using a formal deduction to show that a conclusion follows and using a countermodel to show that a conclusion does not follow are both traditional practices recognized by Aristotle and used down through the history of logic. These practices presuppose, respectively, a criterion of validity and a criterion of invalidity, each of which has been extended and refined by modern logicians: deductions are studied in formal syntax (proof theory) and countermodels are studied in formal semantics (model theory). The purpose of this paper is to compare these two criteria to the corresponding criteria employed in Boole’s first logical work, The Mathematical Analysis of Logic (1847). In particular, this paper presents a detailed study of the relevant metalogical passages and an analysis of Boole’s symbolic derivations. It is well known, of course, that Boole’s logical analysis of compound terms (involving ‘not’, ‘and’, ‘or’, ‘except’, etc.) contributed to the enlargement of the class of propositions and arguments formally treatable in logic. The present study shows, in addition, that Boole made significant contributions to the study of deductive reasoning. He identified the role of logical axioms (as opposed to inference rules) in formal deductions, and he conceived of the idea of an axiomatic deductive system (which yields logical truths by itself and which yields consequences when applied to arbitrary premises). Nevertheless, surprisingly, Boole’s attempt to implement his idea of an axiomatic deductive system involved striking omissions: Boole does not use his own formal deductions to establish validity.
Boole does give symbolic derivations, several of which are vitiated by “Boole’s Solutions Fallacy”: the fallacy of supposing that a solution to an equation is necessarily a logical consequence of the equation. This fallacy seems to have led Boole to confuse equational calculi (i.e., methods for generating solutions) with deduction procedures (i.e., methods for generating consequences). The methodological confusion is closely related to the fact, shown in detail below, that Boole had adopted an unsound criterion of validity. It is also shown that Boole totally ignored the countermodel criterion of invalidity. Careful examination of the text does not reveal with certainty a test for invalidity which was adopted by Boole. However, we have isolated a test that he seems to use in this way and we show that this test is ineffectual in the sense that it does not serve to identify invalid arguments. We go beyond the simple goal stated above. Besides comparing Boole’s earliest criteria of validity and invalidity with those traditionally (and still generally) employed, this paper also investigates the framework and details of The Mathematical Analysis of Logic.
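The Solutions Fallacy is easy to reproduce in modern terms. A minimal sketch (the arithmetic example is mine, not Boole's):

```python
# A solution of an equation need not be a logical consequence of it:
# x = 2 solves x*x = 4, but "x = 2" does not follow from "x*x = 4",
# since the assignment x = -2 satisfies the equation and falsifies "x = 2".
solutions = [x for x in range(-10, 11) if x * x == 4]

assert 2 in solutions    # 2 is a solution of the equation...
assert -2 in solutions   # ...but so is -2: a countermodel to "x*x = 4 implies x = 2"
```

Conflating the two is exactly the confusion of an equational calculus (which generates solutions) with a deduction procedure (which generates consequences).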
Prior Analytics by the Greek philosopher Aristotle and Laws of Thought by the English mathematician George Boole are the two most important surviving original logical works from before the advent of modern logic. This article has a single goal: to compare Aristotle's system with the system that Boole constructed over twenty-two centuries later intending to extend and perfect what Aristotle had started. This comparison merits an article in itself. Accordingly, this article does not discuss many other historically and philosophically important aspects of Boole's book, e.g. his confused attempt to apply differential calculus to logic, his misguided effort to make his system of ‘class logic’ serve as a kind of ‘truth-functional logic’, his now almost forgotten foray into probability theory, or his blindness to the fact that a truth-functional combination of equations that follows from a given truth-functional combination of equations need not follow truth-functionally. One of the main conclusions is that Boole's contribution widened logic and changed its nature to such an extent that he fully deserves to share with Aristotle the status of being a founding figure in logic. By setting forth in clear and systematic fashion the basic methods for establishing validity and for establishing invalidity, Aristotle became the founder of logic as formal epistemology. By making the first unmistakable steps toward opening logic to the study of ‘laws of thought’—tautologies and laws such as excluded middle and non-contradiction—Boole became the founder of logic as formal ontology. “… using mathematical methods … has led to more knowledge about logic in one century than had been obtained from the death of Aristotle up to … when Boole's masterpiece was published.” (Paul Rosenbloom 1950)
Corcoran, John. 1974. Aristotelian Syllogisms: Valid arguments or true generalized conditionals?, Mind 83, 278–81. MR0532928 (58 #27178) This tightly-written and self-contained four-page paper must be studied and not just skimmed. It meticulously analyses quotations from Aristotle and Łukasiewicz to establish that Aristotle was using indirect deductions—as required by the natural-deduction interpretation—and not indirect proofs—as required by the axiomatic interpretation. Łukasiewicz was explicit and clear about the subtle fact that Aristotle's practice could not be construed as correctly performed indirect proof. Łukasiewicz's evidence is presented in full; it is irrefutable. But, instead of considering the possibility that Aristotle's discourses were not intended to express indirect proofs of universalized conditionals presupposing axiomatic premises, Łukasiewicz came to the amazing conclusion that Aristotle did not understand indirect proof. This paper builds on the admirable Łukasiewicz scholarship to establish a conclusion diametrically opposed to the one Łukasiewicz asserted. This paper points out that if Aristotle had not been establishing an underlying logic but had instead been axiomatizing a theory of terms, Aristotelians could not claim for Aristotle the title Founder of Logic. People who take Euclid, Peano, and Zermelo to have first axiomatized geometry, arithmetic, and set theory, respectively, do not think that this qualifies them for the titles Founder of Geometry, Founder of Arithmetic, and Founder of Set Theory, respectively. Such people do think that this would qualify the three for the titles Founder of Axiomatic Geometry, Founder of Axiomatic Arithmetic, and Founder of Axiomatic Set Theory, respectively: titles that carry no honors in logic. Being the Founder of Axiomatic Term Theory would likewise carry no honors in the field of logic. CORRECTION: The paper inadvertently implied that Euclid was the first to axiomatize geometry.
In order to understand Aristotle’s Analytics as a treatise on demonstration it helps to realize that axiomatized geometry was studied in Plato’s Academy when Aristotle was a student. Euclid was the author of the only ancient axiomatized geometry now available.
The five English words—sentence, proposition, judgment, statement, and fact—are central to coherent discussion in logic. However, each is ambiguous in that logicians use each with multiple normal meanings. Several of their meanings are vague in the sense of admitting borderline cases. In the course of displaying and describing the phenomena discussed using these words, this paper juxtaposes, distinguishes, and analyzes several senses of these and related words, focusing on a constellation of recommended senses. One of the purposes of this paper is to demonstrate that ordinary English properly used has the resources for intricate and philosophically sound investigation of rather deep issues in logic and philosophy of language. No mathematical, logical, or linguistic symbols are used. Meanings need to be identified and clarified before being expressed in symbols. We hope to establish that clarity is served by deferring the extensive use of formalized or logically perfect languages until a solid “informal” foundation has been established. Questions of “ontological status”—e.g., whether propositions or sentences, or for that matter characters, numbers, truth-values, or instants, are “real entities”, are “idealizations”, or are “theoretical constructs”—play no role in this paper. As is suggested by the title, this paper is written to be read aloud. I hope that reading this aloud in groups will unite people in the enjoyment of the humanistic spirit of analytic philosophy.
“Second-order Logic” in Anderson, C.A. and Zeleny, M., Eds. Logic, Meaning, and Computation: Essays in Memory of Alonzo Church. Dordrecht: Kluwer, 2001. Pp. 61–76. Abstract: This expository article focuses on the fundamental differences between second-order logic and first-order logic. It is written entirely in ordinary English without logical symbols. It employs second-order propositions and second-order reasoning in a natural way to illustrate the fact that second-order logic is actually a familiar part of our traditional intuitive logical framework and that it is not an artificial formalism created by specialists for technical purposes. To illustrate some of the main relationships between second-order logic and first-order logic, this paper introduces basic logic, a kind of zero-order logic, which is more rudimentary than first-order and which is transcended by first-order in the same way that first-order is transcended by second-order. The heuristic effectiveness and the historical importance of second-order logic are reviewed in the context of the contemporary debate over the legitimacy of second-order logic. Rejection of second-order logic is viewed as radical: an incipient paradigm shift involving radical repudiation of a part of our scientific tradition, a tradition that is defended by classical logicians. But it is also viewed as reactionary: as being analogous to the reactionary repudiation of symbolic logic by supporters of “Aristotelian” traditional logic. But even if “genuine” logic comes to be regarded as excluding second-order reasoning, which seems less likely today than fifty years ago, its effectiveness as a heuristic instrument will remain and its importance for understanding the history of logic and mathematics will not be diminished. Second-order logic may someday be gone, but it will never be forgotten.
Technical formalisms have been avoided entirely in an effort to reach a wide audience, but every effort has been made to limit the inevitable sacrifice of rigor. People who do not know second-order logic cannot understand the modern debate over its legitimacy, and they are cut off from the heuristic advantages of second-order logic. And, what may be worse, they are cut off from an understanding of the history of logic and thus are constrained to have distorted views of the nature of the subject. As Aristotle first said, we do not understand a discipline until we have seen its development. It is a truism that a person's conceptions of what a discipline is and of what it can become are predicated on their conception of what it has been.
Contrary to common misconceptions, today's logic is not devoid of existential import: the universalized conditional ∀x [S→P] implies its corresponding existentialized conjunction ∃x [S & P], not in all cases, but in some. We characterize the proexamples by proving the Existential-Import Equivalence: the antecedent S of the universalized conditional alone determines whether the universalized conditional has existential import, i.e. whether it implies its corresponding existentialized conjunction. A predicate is an open formula having only x free. An existential-import predicate Q is one whose existentialization, ∃x Q, is logically true; otherwise, Q is existential-import-free or simply import-free. How abundant or widespread is existential import? How abundant or widespread are existential-import predicates in themselves or in comparison to import-free predicates? We show that existential-import predicates are quite abundant, and no less so than import-free predicates.
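The equivalence can be illustrated with a minimal finite-domain sketch in Python. The three-element domain, the helper names, and the choice of “x = x” as the import-carrying antecedent are illustrative assumptions, not drawn from the paper; a finite enumeration is only a picture of the theorem, not a proof of it.

```python
from itertools import combinations

DOMAIN = [0, 1, 2]  # one small fixed domain -- an illustration, not a proof

def extensions(dom):
    """Every possible extension (subset) a unary predicate letter can take."""
    return [set(c) for r in range(len(dom) + 1) for c in combinations(dom, r)]

def universal_conditional(S, P):     # "Every S is P" over the domain
    return all(x in P for x in S)

def existential_conjunction(S, P):   # "Some S is P" over the domain
    return any(x in P for x in S)

# Case 1: the antecedent is the import-carrying predicate 'x = x', whose
# extension is the whole domain under every interpretation of P.
S_fixed = set(DOMAIN)
case1 = all(existential_conjunction(S_fixed, P)
            for P in extensions(DOMAIN)
            if universal_conditional(S_fixed, P))
print(case1)  # True: the universalized conditional carries existential import

# Case 2: the antecedent is a bare predicate letter, so its extension varies
# too; the empty extension is a countermodel (vacuous truth, no witness).
case2 = all(existential_conjunction(S, P)
            for S in extensions(DOMAIN)
            for P in extensions(DOMAIN)
            if universal_conditional(S, P))
print(case2)  # False: an import-free antecedent blocks the implication
```

As the Existential-Import Equivalence says, the antecedent alone makes the difference: only the import-carrying antecedent forces a witness.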
This interesting and imaginative monograph is based on the author’s PhD dissertation supervised by Saul Kripke. It is dedicated to Timothy Smiley, whose interpretation of PRIOR ANALYTICS informs its approach. As suggested by its title, this short work demonstrates conclusively that Aristotle’s syllogistic is a suitable vehicle for fruitful discussion of contemporary issues in logical theory. Aristotle’s syllogistic is represented by Corcoran’s 1972 reconstruction. The review studies Lear’s treatment of Aristotle’s logic, his appreciation of the Corcoran-Smiley paradigm, and his understanding of modern logical theory. In the process Corcoran and Scanlan present new, previously unpublished results. Corcoran regards this review as an important contribution to contemporary study of PRIOR ANALYTICS: both the book and the review deserve to be better known.
Chapin reviewed this 1972 ZEITSCHRIFT paper, which proves the completeness theorem for the logic of variable-binding-term operators created by Corcoran and his student John Herring in the 1971 LOGIQUE ET ANALYSE paper in which the theorem was conjectured. This leveraging proof extends completeness of ordinary first-order logic to the extension with vbtos. This 1972 paper builds on the 1971 “Notes on a Semantic Analysis of Variable Binding Term Operators” (co-author John Herring), Logique et Analyse 55, 646–57. MR0307874 (46 #6989). A variable binding term operator (vbto) is a non-logical constant, say v, which combines with a variable y and a formula F containing y free to form a term (vy:F) whose free variables are exactly those of F, excluding y. Kalish-Montague 1964 proposed using vbtos to formalize definite descriptions “the x: x+x=2”, set abstracts {x: F}, minimization in recursive function theory “the least x: x+x>2”, etc. However, they gave no semantics for vbtos. Hatcher 1968 gave a semantics, but one that has flaws described in the 1971 paper and admitted by Hatcher. In 1971 we gave a correct semantic analysis of vbtos. We also gave axioms for using them in deductions, and we conjectured strong completeness for the deductions with respect to the semantics. The conjecture, proved in this paper with Hatcher’s help, was proved independently at about the same time by Newton da Costa, using a Henkin-type proof.
This essay takes logic and ethics in broad senses: logic as the science of evidence; ethics as the science of justice. One of its main conclusions is that neither science can be fruitfully pursued without the virtues fostered by the other: logic is pointless without fairness and compassion; ethics is pointless without rigor and objectivity. The logician urging us to be dispassionate is in resonance and harmony with the ethicist urging us to be compassionate.
This accessible essay treats knowledge and belief in a usable and applicable way. Many of its basic ideas have been developed recently in Corcoran-Hamid 2014: Investigating knowledge and opinion. The Road to Universal Logic. Vol. I. Arthur Buchsbaum and Arnold Koslow, Editors. Springer. Pp. 95-126. http://www.springer.com/birkhauser/mathematics/book/978-3-319-10192-7 .
A schema (plural: schemata, or schemas), also known as a scheme (plural: schemes), is a linguistic template or pattern together with a rule for using it to specify a potentially infinite multitude of phrases, sentences, or arguments, which are called instances of the schema. Schemas are used in logic to specify rules of inference, in mathematics to describe theories with infinitely many axioms, and in semantics to give adequacy conditions for definitions of truth. Contents: 1. What is a Schema? 2. Uses of Schemas. 3. The Ontological Status of Schemas. 4. Schemas in the History of Logic. Bibliography.
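The template-plus-rule idea can be pictured with a minimal Python sketch. The template string, the sample sentences, and the helper name are invented for the illustration; any declarative sentence substituted for the placeholder yields an instance of the schema.

```python
# A schema as template + side condition: substitute any declarative
# sentence for the placeholder 'p' to get an instance of the schema.
TEMPLATE = "If {p}, then {p}."

def instances(sentences):
    """Yield one instance of the schema per substituted sentence."""
    for s in sentences:
        yield TEMPLATE.format(p=s)

for inst in instances(["snow is white", "2 + 2 = 4"]):
    print(inst)
# If snow is white, then snow is white.
# If 2 + 2 = 4, then 2 + 2 = 4.
```

Because the stock of substitutable sentences is unbounded, the single template specifies a potentially infinite multitude of instances, which is the point of using schemas to state rules of inference and infinite axiom sets.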
This paper raises obvious questions undermining any residual confidence in Mates' work and revealing our embarrassing ignorance of the true nature of Stoic deduction. It was inspired by the challenging exploratory work of Josiah Gould.
We are much better equipped to let the facts reveal themselves to us instead of blinding ourselves to them or stubbornly trying to force them into preconceived molds. We no longer embarrass ourselves in front of our students, for example, by insisting that “Some Xs are Y” means the same as “Some X is Y”, and lamely adding “for purposes of logic” whenever there is pushback. Logic teaching in this century can exploit the new spirit of objectivity, humility, clarity, observationalism, contextualism, and pluralism. Besides the new spirit there have been quiet developments in logic and its history and philosophy that could radically improve logic teaching. One rather conspicuous example is that the process of refining logical terminology has been productive. Future logic students will no longer be burdened by obscure terminology, and they will be able to read, think, talk, and write about logic in a more careful and more rewarding manner. Closely related is increased use and study of variable-enhanced natural language, as in “Every proposition x that implies some proposition y that is false also implies some proposition z that is true”. Another welcome development is the culmination of the slow demise of logicism. No longer is the teacher blocked from using examples from arithmetic and algebra for fear that the students had been indoctrinated into thinking that every mathematical truth was a tautology and that every mathematical falsehood was a contradiction. A fifth welcome development is the separation of laws of logic from so-called logical truths, i.e., tautologies. Now we can teach the logical independence of the laws of excluded middle and non-contradiction without fear that students had been indoctrinated into thinking that every logical law was a tautology and that every falsehood of logic was a contradiction. This separation permits the logic teacher to apply logic in the clarification of laws of logic.
This lecture expands the above points, which apply equally well in first, second, and third courses, i.e. in “critical thinking”, “deductive logic”, and “symbolic logic”.
An information recovery problem is the problem of constructing a proposition containing the information dropped in going from a given premise to a given conclusion that follows from it. The proposition(s) to be constructed can be required to satisfy other conditions as well, e.g. being independent of the conclusion, or being “informationally unconnected” with the conclusion, or some other condition dictated by the context. This paper discusses various types of such problems, presents techniques and principles useful in solving them, and develops algorithmic methods for certain classes of such problems. The results are then applied to classical number theory, in particular, to questions concerning possible refinements of the 1931 Gödel Axiom Set, e.g. whether any of its axioms can be analyzed into “informational atoms”. Two propositions are “informationally unconnected” [with each other] if no informative (nontautological) consequence of one also follows from the other. A proposition is an “informational atom” if it is informative but no information can be dropped from it without rendering it uninformative (tautological). Presentation, employment, and investigation of these two new concepts are prominent features of this paper.
This expository paper on Aristotle’s prototype underlying logic is intended for a broad audience that includes non-specialists. It requires as background a discussion of Aristotle’s demonstrative logic. Demonstrative logic or apodictics is the study of demonstration as opposed to persuasion. It is the subject of Aristotle’s two-volume Analytics, as its first sentence says. Many of Aristotle’s examples are geometrical. A typical geometrical demonstration requires a theorem that is to be demonstrated, known premises from which the theorem is to be deduced, and a deductive logic by which the steps of the deduction proceed. Every demonstration produces knowledge of its conclusion for every person who comprehends the demonstration. Aristotle presented a general truth-and-consequence theory of demonstration meant to apply to all demonstrations: a demonstration is an extended argumentation that begins with premises known to be truths and that involves a chain of reasoning showing by deductively evident steps that its conclusion is a consequence of its premises. In short, a demonstration is a deduction whose premises are known to be true. Aristotle’s general theory of demonstration required a prior general theory of deduction presented in the Prior Analytics. His general immediate-deduction-chaining theory of deduction was meant to apply to all deductions: any deduction that is not immediately evident is an extended argumentation that involves a chaining of immediately evident steps that shows its final conclusion to follow logically from its premises. His deductions, both direct and indirect, were rule-based and not tautology-based. The idea of tautology-based deduction, which dominated modern logic in the early years of the 1900s, is nowhere to be found in Analytics. Rule-based deduction was rediscovered by modern logicians.
To illustrate his general theory of deduction, Aristotle presented a prototype: an ingeniously simple and mathematically precise special case traditionally known as the categorical syllogistic. With reference only to propositions of the four so-called categorical forms, he painstakingly worked out exactly what those immediately evident deductive steps are and how they are chained to complete deductions. In his specialized prototype theory, Aristotle explained how to deduce from a given categorical premise set, no matter how large, any categorical conclusion implied by the given set. He did not extend this treatment to non-categorical deductions, thus setting a program for future logicians. The prototype, categorical syllogistic, was seen by Boole as a “first approximation” to a comprehensive logic. Today, however, it appears more as the first of the dozens of logics already created and as the first exemplification of a family that continues to expand.
This work treats the correlative concepts knowledge and opinion, in various senses. In all senses of ‘knowledge’ and ‘opinion’, a belief known to be true is knowledge; a belief not known to be true is opinion. In this sense of ‘belief’, a belief is a proposition thought to be true—perhaps, but not necessarily, known to be true. All knowledge is truth. Some but not all opinion is truth. Every proposition known to be true is believed to be true. Some but not every proposition believed to be true is known to be true. Our focus is thus on propositional belief (“belief-that”): the combination of propositional knowledge (“knowledge-that”) and propositional opinion (“opinion-that”). Each of a person’s beliefs, whether knowledge or opinion, is the end result of a particular thought process that continued during a particular time interval and ended at a particular time with a conclusive act—a judgment that something is the case. This work is mainly about beliefs in substantive informative propositions—not empty tautologies. We also treat objectual knowledge (knowledge of objects in the broadest sense, or “knowledge-of”), operational knowledge (abilities and skills, “knowledge-how-to”, or “know-how”), and expert knowledge (expertise). Most points made in this work have been made by previous writers, but to the best of our knowledge, they have never before been collected into a coherent work accessible to a wide audience. Key words: belief, knowledge/opinion, propositional, operational, objectual, cognition.
This presentation includes a complete bibliography of John Corcoran’s publications devoted at least in part to Aristotle’s logic. Sections I–IV list 20 articles, 43 abstracts, 3 books, and 10 reviews. It starts with two watershed articles published in 1972: the Philosophy & Phenomenological Research article that antedates Corcoran’s Aristotle studies and the Journal of Symbolic Logic article first reporting his original results; it ends with works published in 2015. A few of the items are annotated with endnotes connecting them with other work. In addition, Section V “Discussions” is a nearly complete secondary bibliography of works describing, interpreting, extending, improving, supporting, and criticizing Corcoran’s work: 8 items published in the 1970s, 22 in the 1980s, 39 in the 1990s, 56 in the 2000s, and 65 in the current decade. The secondary bibliography is annotated with endnotes: some simply quoting from the cited item, but several answering criticisms and identifying errors. As is evident from the Acknowledgements sections, all of Corcoran’s publications benefited from correspondence with other scholars, most notably Timothy Smiley, Michael Scanlan, and Kevin Tracy. All of Corcoran’s Greek translations were done in consultation with two or more classicists. Corcoran never published a sentence without discussing it with his colleagues and students. REQUEST: Please send errors, omissions, and suggestions. I am especially interested in citations made in non-English publications.
This paper develops a modal sentential logic having “not”, “if...then”, and necessity as logical constants. The semantics (system of meanings) of the logic is the most obvious generalization of the usual truth-functional semantics for sentential logic, and its deductive system (system of demonstrations) is an obvious generalization of a suitable (Jaskowski-type) natural deductive system for sentential logic. Let A be a set of sentences and p a sentence. “p is a logical consequence of A” is defined relative to the semantics, and “p is demonstrable from A” is defined relative to the deductive system. Main metatheorem: p is demonstrable from A if and only if p is a logical consequence of A. Henkin-type methods are used. The theorems of the logic are exactly those of S5. The deductive system is rigorously developed as a system of linear strings.
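The generalization of truth-functional semantics mentioned here can be pictured set-theoretically: in an S5 model every world is accessible from every world, so a proposition (a set of worlds) is necessary iff it is true at all worlds. The following minimal Python sketch is an illustration only; the three-world frame and the operator names are assumptions for the example, not the paper's formalism.

```python
from itertools import combinations

WORLDS = frozenset({0, 1, 2})  # a tiny fixed frame; S5 = universal accessibility

# Propositions are sets of worlds (the worlds at which they hold).
def NOT(p): return WORLDS - p
def IF(p, q): return NOT(p) | q                            # material "if...then"
def NEC(p): return WORLDS if p == WORLDS else frozenset()  # box: true at all worlds
def POS(p): return NOT(NEC(NOT(p)))                        # diamond as not-box-not

# Every proposition over this frame:
PROPS = [frozenset(c) for r in range(len(WORLDS) + 1)
         for c in combinations(WORLDS, r)]

# The characteristic S5 law, possibly-p implies necessarily-possibly-p,
# is valid: true at every world under every valuation of p.
s5_law = all(IF(POS(p), NEC(POS(p))) == WORLDS for p in PROPS)
print(s5_law)  # True

# Whereas p implies necessarily-p is not valid over this frame:
t_like = all(IF(p, NEC(p)) == WORLDS for p in PROPS)
print(t_like)  # False
```

With universal accessibility the box collapses to a quantifier over all worlds, which is why the theorems of the logic coincide with those of S5.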