The logician's central concern is with the validity of argument. A logical theory ought, therefore, to provide a general criterion of validity. This book sets out to find such a criterion, and to describe the philosophical basis and the formal theory of a logic in which the premises of a valid argument are relevant to its conclusion. The notion of relevance required for this theory is obtained by an analysis of the grounds for asserting a formula in a proof.
Anti-exceptionalism about logic is the doctrine that logic does not require its own epistemology, for its methods are continuous with those of science. Although most recently urged by Williamson, the idea goes back at least to Lakatos, who wanted to adapt Popper's falsificationism and extend it not only to mathematics but to logic as well. But one needs to be careful here to distinguish the empirical from the a posteriori. Lakatos coined the term 'quasi-empirical' for the counterinstances to putative mathematical and logical theses. Mathematics and logic may both be a posteriori, but it does not follow that they are empirical. Indeed, as Williamson has demonstrated, what counts as empirical knowledge, and the role of experience in acquiring knowledge, are both unclear. Moreover, knowledge, even of necessary truths, is fallible. Nonetheless, logical consequence holds in virtue of the meaning of the logical terms, just as consequence in general holds in virtue of the meanings of the concepts involved; and so logic is both analytic and necessary. In this respect, it is exceptional. But its methodology and its epistemology are the same as those of mathematics and science in being fallibilist, and counterexamples to seemingly analytic truths are as likely as those in any scientific endeavour. What is needed is a new account of the evidential basis of knowledge, one which is, perhaps surprisingly, found in Aristotle.
Inferentialism claims that expressions are meaningful by virtue of rules governing their use. In particular, logical expressions are autonomous if given meaning by their introduction-rules, rules specifying the grounds for assertion of propositions containing them. If the elimination-rules do no more, and no less, than is justified by the introduction-rules, the rules satisfy what Prawitz, following Lorenzen, called an inversion principle. This connection between rules leads to a general form of elimination-rule, and when the rules have this form, they may be said to exhibit “general-elimination” harmony. Ge-harmony ensures that the meaning of a logical expression is clearly visible in its I-rule, and that the I- and E-rules are coherent, in encapsulating the same meaning. However, it does not ensure that the resulting logical system is normalizable, nor that it satisfies the conservative extension property, nor that it is consistent. Thus harmony should not be identified with any of these notions.
In this book, Stephen Read sets out to rescue logic from its undeserved reputation as an inflexible, dogmatic discipline by demonstrating that its technicalities and processes are founded on assumptions which are themselves amenable to philosophical investigation. He examines the fundamental principles of consequence, logical truth and correct inference within the context of logic, and shows that the principles by which we delineate consequences are themselves not guaranteed free from error. Central to the notion of truth is the beguiling issue of paradox. Its philosophical value, Read shows, lies in exposing the invalid assumption on which the paradox is built. Thinking About Logic also discusses logical puzzles which introduce questions relating to language, the world, and their relationship.
Michael Dummett and Dag Prawitz have argued that a constructivist theory of meaning depends on explicating the meaning of logical constants in terms of the theory of valid inference, imposing a constraint of harmony on acceptable connectives. They argue further that classical logic, in particular, classical negation, breaks these constraints, so that classical negation, if a cogent notion at all, has a meaning going beyond what can be exhibited in its inferential use. I argue that Dummett gives a mistaken elaboration of the notion of harmony, an idea stemming from a remark of Gerhard Gentzen's. The introduction-rules are autonomous if they are taken fully to specify the meaning of the logical constants, and the rules are harmonious if the elimination-rule draws its conclusion from just the grounds stated in the introduction-rule. The key to harmony in classical logic then lies in strengthening the theory of the conditional so that the positive logic contains the full classical theory of the conditional. This is achieved by allowing parametric formulae in the natural deduction proofs, a form of multiple-conclusion logic.
General-elimination harmony articulates Gentzen’s idea that the elimination-rules are justified if they infer from an assertion no more than can already be inferred from the grounds for making it. Dummett described the rules as not only harmonious but stable if the E-rules allow one to infer no more and no less than the I-rules justify. Pfenning and Davies call the rules locally complete if the E-rules are strong enough to allow one to infer the original judgement. A method is given of generating harmonious general-elimination rules from a collection of I-rules. We show that the general-elimination rules satisfy Pfenning and Davies’ test for local completeness, but question whether that is enough to show that they are stable. Alternative conditions for stability are considered, including equivalence between the introduction- and elimination-meanings of a connective, and recovery of the grounds for assertion, finally generalizing the notion of local completeness to capture Dummett’s notion of stability satisfactorily. We show that the general-elimination rules meet the last of these conditions, and so are indeed not only harmonious but also stable.
The correspondence theory of truth has experienced something of a revival recently in the form of the Truthmaker Axiom: whatever is true, something makes it true. We consider various postulates which have been proposed to characterize truthmaking, in particular, the Disjunction Thesis (DT), that whatever makes a disjunction true must make one or other disjunct true. In conjunction with certain other assumptions, DT leads to triviality. We show that there are elaborations of truthmaking on which DT holds (which must therefore take steps to avoid the triviality); but that there are more plausible accounts of truthmaking on which DT fails.
Roger Swyneshed, in his treatise on insolubles, dating from the early 1330s, drew three notorious corollaries from his solution. The third states that there is a contradictory pair of propositions both of which are false. This appears to contradict what Whitaker, in his iconoclastic reading of Aristotle’s De Interpretatione, dubbed “The Rule of Contradictory Pairs”, which requires that in every such pair, one must be true and the other false. Whitaker argued that, immediately after defining the notion of a contradictory pair, in which one statement affirms what the other denies of the same thing, Aristotle himself gave counterexamples to the rule. This gives some credence to Swyneshed’s claim that his solution to the logical paradoxes is not contrary to Aristotle’s teaching, as many of Swyneshed’s contemporaries claimed. Insolubles are false, he said, because they falsify themselves; and their contradictories are false because they falsely deny that the insoluble itself is false. Swyneshed’s solution depends crucially on the revision he makes to the account of truth and falsehood, brought out in his first thesis: that a false proposition can signify as it is, or as Paul of Venice, who took up and developed Swyneshed’s solution some sixty years later, puts it, a false proposition can have a true significate. Swyneshed gave a further counterexample to the Rule when he claimed that some insolubles, like future contingents, are neither true nor false. Dialetheism, the contemporary claim that some propositions are both true and false, is wedded to the Rule, and in consequence divorces denial from the assertion of the contradictory negation. Consequently, Swyneshed’s logical heresy is very different from that found in dialetheism.
Jan Lukasiewicz's treatise on Aristotle's Syllogistic, published in the 1950s, has been very influential in framing contemporary understanding of Aristotle's logical systems. However, Lukasiewicz's interpretation is based on a number of tendentious claims, not least, the claim that the syllogistic was intended to apply only to non-empty terms. I show that this interpretation is not true to Aristotle's text and that a more coherent and faithful interpretation admits empty terms while maintaining all the relations of the traditional square of opposition.
Logical pluralism is the claim that different accounts of validity can be equally correct. Beall and Restall have recently defended this position. Validity is a matter of truth-preservation over cases, they say: the conclusion should be true in every case in which the premises are true. Each logic specifies a class of cases, but differs over which cases should be considered. I show that this account of logic is incoherent. Validity indeed is truth-preservation, provided this is properly understood. Once understood, there is one true logic, relevance logic. The source of Beall and Restall’s error is a recent habit of using a classical metalanguage to analyse non-classical logics generally, including relevance logic.
The idea of proof-theoretic validity originated in the work of Gentzen, when he suggested that the meaning of each logical expression was encapsulated in its introduction-rules. The idea was developed by Prawitz and Dummett, but came under attack by Prior under the soubriquet 'analytic validity'. Logical truths and logical consequences are deemed analytically valid by virtue of following, in a way which the present chapter clarifies, from the meaning of the logical constants. But different logics are based on different rules, confer different meanings and so validate different theorems and consequences, some of which are arguably not true or valid at all. It seems to follow that some analytic statements are in fact false. The moral is that we must be careful what rules we adopt and what meanings we use our rules to determine.
Logical inferentialism claims that the meaning of the logical constants should be given, not model-theoretically, but by the rules of inference of a suitable calculus. It has been claimed that certain proof-theoretical systems, most particularly, labelled deductive systems for modal logic, are unsuitable, on the grounds that they are semantically polluted and suffer from an untoward intrusion of semantics into syntax. The charge is shown to be mistaken. It is argued on inferentialist grounds that labelled deductive systems are as syntactically pure as any formal system in which the rules define the meanings of the logical constants.
The recovery of Aristotle’s logic during the twelfth century was a great stimulus to medieval thinkers. Among their own theories developed to explain Aristotle’s theories of valid and invalid reasoning was a theory of consequence, of what arguments were valid, and why. By the fourteenth century, two main lines of thought had developed, one at Oxford, the other at Paris. Both schools distinguished formal from material consequence, but in very different ways. In Buridan and his followers in Paris, formal consequence was that preserved under uniform substitution. In Oxford, in contrast, formal consequence included analytic consequences such as ‘If it’s a man, then it’s an animal’. Aristotle’s notion of syllogistic consequence was subsumed under the treatment of formal consequence. Buridan developed a general theory embracing the assertoric syllogism, the modal syllogism and syllogisms with oblique terms. The result was a thoroughly systematic and extensive treatment of logical theory and logical consequence which repays investigation.
In recent years, speech-act theory has mooted the possibility that one utterance can signify a number of different things. This pluralist conception of signification lies at the heart of Thomas Bradwardine’s solution to the insolubles, logical puzzles such as the semantic paradoxes, presented in Oxford in the early 1320s. His leading assumption was that signification is closed under consequence, that is, that a proposition signifies everything which follows from what it signifies. Then any proposition signifying its own falsity, he showed, also signifies its own truth and so, since it signifies things which cannot both obtain, it is simply false. Bradwardine himself, and his contemporaries, did not elaborate this pluralist theory, or say much in its defence. It can be shown to accord closely, however, with the prevailing conception of logical consequence in England in the fourteenth century. Recent pluralist theories of signification, such as Grice’s, also endorse Bradwardine’s closure postulate as a plausible constraint on signification, and so his analysis of the semantic paradoxes is seen to be both well-grounded and plausible.
This chapter focuses on alternative logics. It discusses a hierarchy of logical reform. It presents case studies that illustrate particular aspects of the logical revisionism discussed in the chapter. The first case study is of intuitionistic logic. The second case study turns to quantum logic, a system proposed on empirical grounds as a resolution of the antinomies of quantum mechanics. The third case study is concerned with systems of relevance logic, which have been the subject of an especially detailed reform program. Finally, the fourth case study is paraconsistent logic, perhaps the most controversial of serious proposals.
In this article, we discuss the notion of merely confused supposition as it arose in the medieval theory of suppositio personalis. The context of our analysis is our formalization of William of Ockham's theory of supposition sketched in Mind 86 (1977), 109-13. The present paper is, however, self-contained, although we assume a basic acquaintance with supposition theory. The detailed aims of the paper are: to look at the tasks that supposition theory took on itself and to use our formalization to relate them to more modern ideas; to explain the notion of merely confused supposition and to defend it against certain criticisms; and to discuss two issues closely related to the idea of merely confused supposition which we could not broach in a shorter article: the mode of supposition of terms in intensional contexts, and the possible existence of a fourth mode, often called suppositio copulatim.
One of the manuscripts of Buridan’s Summulae contains three figures, each in the form of an octagon. At each node of each octagon there are nine propositions. Buridan uses the figures to illustrate his doctrine of the syllogism, revising Aristotle's theory of the modal syllogism and adding theories of syllogisms with propositions containing oblique terms (such as ‘man’s donkey’) and with ‘propositions of non-normal construction’ (where the predicate precedes the copula). O-propositions of non-normal construction (i.e., ‘Some S (some) P is not’) allow Buridan to extend and systematize the theory of the assertoric (i.e., non-modal) syllogism. Buridan points to a revealing analogy between the three octagons. To understand their importance we need to rehearse the medieval theories of signification, supposition, truth and consequence.
In 'Vagueness and Alternative Logic' (Realism and Reason, Cambridge 1983, pp. 271-86, especially 285-6), Hilary Putnam puts forward a suggestion for a formal treatment of the logic of vagueness. … Putnam admits that, at the time of writing, he had not thought this idea through. What will already be apparent to the alert reader is that, in order to disclose serious difficulties for the proposal, Putnam would not have had to think far.
The ‘no’–‘no’ paradox (so-called by Sorensen) consists of a pair of propositions each of which says of the other that it is false. It is not immediately paradoxical, since it has a solution in which one proposition is true, the other false. However, that is itself paradoxical, since there is no clear ground for determining which is which. The two propositions should have the same truth-value. The paper shows how a proposal by the medieval thinker Thomas Bradwardine solves not only the Liar paradox, but also symmetric paradoxes like the ‘no’–‘no’, the descending ‘no’–‘no’, and the Truth-teller paradoxes.
Ian Rumfitt has recently drawn our attention to a couple of paradoxes of signification, claiming that although Thomas Bradwardine’s “multiple-meanings” account of truth and signification can solve the first of them, it cannot solve the second. The paradoxes of signification were in fact much discussed by Bradwardine’s successors in the fourteenth century. Bradwardine’s solution appears to turn on a distinction between the principal and the consequential signification of an utterance. However, although such a distinction played an important role in his successors’ theories, it is shown that Bradwardine’s account of signification does not admit any such distinction, no part being prior to the others. Accordingly his solution, unlike those of his successors, does not fall prey to Rumfitt’s paradoxes.
Frege's project has been characterized as an attempt to formulate a complete system of logic adequate to characterize mathematical theories such as arithmetic and set theory. As such, it was seen to fail by Gödel's incompleteness theorem of 1931. It is argued, however, that this is to impose a later interpretation on the word 'complete': it is clear from Dedekind's writings that at least as good an interpretation of completeness is categoricity. Whereas few interesting first-order mathematical theories are categorical or complete, there are logical extensions of these theories into second-order logic and by the addition of generalized quantifiers which are categorical. Frege's project really found success through Gödel's completeness theorem of 1930 and the subsequent development of first- and higher-order model theory.
Inferentialism claims that the rules for the use of an expression express its meaning without any need to invoke meanings or denotations for them. Logical inferentialism endorses inferentialism specifically for the logical constants. Harmonic inferentialism, as the term is introduced here, usually but not necessarily a subbranch of logical inferentialism, follows Gentzen in proposing that it is the introduction-rules which give expressions their meaning and the elimination-rules should accord harmoniously with the meaning so given. It is proposed here that the logical expressions are those which can be given schematic rules that lie in a specific sort of harmony, general-elimination harmony, resulting from applying a certain operation, the ge-procedure, to produce ge-rules in accord with the meaning defined by the I-rules. Griffiths claims that identity cannot be given such rules, concluding that logical inferentialists are committed to ruling identity a non-logical expression. It is shown that the schematic rules for identity given in Read, slightly amended, are indeed ge-harmonious, so confirming that identity is a logical notion.
Hartry Field's revised logic for the theory of truth in his new book, Saving Truth from Paradox, seeking to preserve Tarski's T-scheme, does not admit a full theory of negation. In response, Crispin Wright proposed that the negation of a proposition is the proposition saying that some proposition inconsistent with the first is true. For this to work, we have to show that this proposition is entailed by any proposition incompatible with the first, that is, that it is the weakest proposition incompatible with the proposition whose negation it should be. To show that his proposal gave a full intuitionist theory of negation, Wright appealed to two principles, about incompatibility and entailment, and using them Field formulated a paradox of validity (or more precisely, of inconsistency). The medieval mathematician, theologian and logician, Thomas Bradwardine, writing in the fourteenth century, proposed a solution to the paradoxes of truth which does not require any revision of logic. The key principle behind Bradwardine's solution is a pluralist doctrine of meaning, or signification, that propositions can mean more than they explicitly say. In particular, he proposed that signification is closed under entailment. In light of this, Bradwardine revised the truth-rules, in particular, refining the T-scheme, so that a proposition is true only if everything that it signifies obtains. Thereby, he was able to show that any proposition which signifies that it itself is false, also signifies that it is true, and consequently is false and not true. I show that Bradwardine's solution is also able to deal with Field's paradox and others of a similar nature. Hence Field's logical revisions are unnecessary to save truth from paradox.
Although the theory of the assertoric syllogism was Aristotle's great invention, one which dominated logical theory for the succeeding two millennia, accounts of the syllogism evolved and changed over that time. Indeed, in the twentieth century, doctrines were attributed to Aristotle which lost sight of what Aristotle intended. One of these mistaken doctrines was the very form of the syllogism: that a syllogism consists of three propositions containing three terms arranged in four figures. Yet another was that a syllogism is a conditional proposition deduced from a set of axioms. There is even unclarity about what the basis of syllogistic validity consists in. Returning to Aristotle's text, and reading it in the light of commentary from late antiquity and the middle ages, we find a coherent and precise theory which shows all these claims to be based on a misunderstanding and misreading.
The Oxford Calculator Roger Swyneshed put forward three provocative claims in his treatise on insolubles, written in the early 1330s, of which the second states that there is a formally valid inference with true premises and false conclusion. His example deployed the Liar paradox as the conclusion of the inference: ‘The conclusion of this inference is false, so this conclusion is false’. His account of insolubles supported his claim that the conclusion is false, and so the premise, referring to the conclusion, would seem to be true. But what is his account of validity that can allow true premises to lead to a false conclusion? This article considers Roger’s own account, as well as that of Paul of Venice, writing some sixty years later, whose account of the truth and falsehood of insolubles followed Roger’s closely. Paul endorsed Roger’s three claims. But their accounts of validity were different. The question is whether these accounts are coherent and support Paul’s claim in his Logica Magna that he endorsed all the normal rules of inference.
Thomas Bradwardine makes much of the fact that his solution to the insolubles is in accordance with Aristotle's diagnosis of the fallacy in the Liar paradox as that of secundum quid et simpliciter. Paul Spade, however, claims that this invocation of Aristotle by Bradwardine is purely "honorary" in order to confer specious respectability on his analysis and give it a spurious weight of authority. Our answer to Spade follows Bradwardine's response to the problem of revenge: any proposition saying of itself that it is false says more than does Bradwardine's proposition saying of it that it is false, and so follows from that other proposition only in respect of part of what it says, and not simpliciter.
What binds the constituents of a state of affairs together and provides unity to the fact they constitute? I argue that the fact that they are related is basic and fundamental. This is the thesis of Factualism: the world is a world of facts. I draw three corollaries: first, that the identity theory of truth is mistaken, in conflating what represents (the proposition) with what is represented (the fact). Secondly, a popular interpretation of Wittgenstein's Tractatus, due to Stenius, whereby false propositions are taken to picture non-existent states of affairs, cannot be right. For Wittgenstein, propositions had two poles, and a proposition and its negation picture the same fact. Finally, the metaphysics of modal realism must be wrong, for there are no non-actual states of affairs to constitute any world other than the actual world. (Published Online October 13 2005).
What makes necessary truths true? I argue that all truth supervenes on how things are, and that necessary truths are no exception. What makes them true are proofs. But if so, the notion of proof needs to be generalized to include verification-transcendent proofs, proofs whose correctness exceeds our ability to verify it. It is incumbent on me, therefore, to show that arguments, such as Dummett's, that verification-transcendent truth is not compatible with the theory of meaning, are mistaken. The answer is that what we can conceive and construct far outstrips our actual abilities. I conclude by proposing a proof-theoretic account of modality, rejecting a claim of Armstrong's that modality can reside in non-modal truthmakers.
Thomas Reid was one of the greatest philosophers of the eighteenth century and a contemporary of Kant's. This volume is part of a new wave of international interest in Reid from a new generation of scholars. The volume opens with an introduction to Reid's life and work, including biographical material previously little known. A classic essay by Reid himself - 'Of Power' - is then reproduced, in which he sets out his distinctive account of causality and agency. This is followed by ten original essays exploring different aspects of Reid's philosophy, as well as his relation to other thinkers, such as Kant, Priestley, and Moore.
The focus of the paper is a sophism based on the proposition ‘This is Socrates’ found in a short treatise on obligational casus attributed to William Heytesbury. First, the background to the puzzle in Walter Burley’s traditional account of obligations (the responsio antiqua), and the objections and revisions made by Richard Kilvington and Roger Swyneshed, are presented. All six types of obligations described by Burley are outlined, including sit verum, the type used in the sophism. Kilvington and Swyneshed disliked the dynamic nature of the responsio antiqua, and Kilvington proposed a revision to the rules for irrelevant propositions. This allowed him to use a form of reasoning, the “disputational meta-argument”, which is incompatible with Burley’s rules. Heytesbury explicitly rejected Kilvington’s revision and the associated meta-argument. Swyneshed also revised Burley’s account of obligations, formulating the so-called responsio nova, characterised by the apparently surprising thesis that a conjunction can be denied both of whose conjuncts are granted. On closer inspection, however, his account is found to be less radical than first appears.