Ageing is ubiquitous to the human condition. The MRI correlates of healthy ageing have been extensively investigated using a range of modalities, including volumetric MRI, quantitative MRI and DTI. Despite this, reported brainstem-related changes remain sparse, in part because of the technical and methodological limitations of quantitatively assessing and statistically analysing this region. Using a new method of brainstem segmentation, this study assessed a large cohort of 100 healthy adults for the effects of ageing within the human brainstem in vivo. Using quantitative MRI (qMRI), tensor-based morphometry (TBM) and voxel-based quantification (VBQ), we characterised the volumetric and quantitative changes across healthy adults aged 19-75 years. In addition to the increased R2* in the substantia nigra, corresponding to increasing iron deposition with age, several novel findings are reported in the current study. These include selective volumetric loss of the brachium conjunctivum, with a corresponding decrease in magnetisation transfer (MT) and increase in proton density (PD), accounting for the previously described “midbrain shrinkage”. Additionally, we found increases in R1 and PD in several pontine and medullary structures. We consider these changes in the context of well-characterised, functional age-related changes, and propose potential biophysical mechanisms. This study provides a detailed quantitative analysis of the internal architecture of the brainstem and a baseline for further studies of neurodegenerative diseases that are characterised by early, pre-clinical involvement of the brainstem, such as Parkinson’s and Alzheimer’s diseases.
Free logic is an important field of philosophical logic that first appeared in the 1950s. J. Karel Lambert was one of its founders and coined the term itself. The essays in this collection (written over a period of 40 years) explore the philosophical foundations of free logic and its application to areas as diverse as the philosophy of religion and computer science. Amongst the applications on offer are those to the analysis of existence statements, to definite descriptions and to partial functions. The volume contains a proof that free logics of any kind are non-extensional, and then uses that proof to show that Quine's theory of predication and referential transparency must fail. The purpose of this collection is to bring an important body of work to the attention of a new generation of professional philosophers, computer scientists, and mathematicians.
A free logic is one in which a singular term can fail to refer to an existent object, for example, 'Vulcan' or '5/0'. This essay demonstrates the fruitfulness of a version of this non-classical logic of terms (negative free logic) by showing (1) how it can be used not only to repair a looming inconsistency in Quine's theory of predication, the most influential semantical theory in contemporary philosophical logic, but also (2) how Beeson, Farmer and Feferman, among others, use it to provide a natural foundation for partial functions in programming languages. Vis-à-vis (2), the question is raised whether the Beeson-Farmer-Feferman approach is adequate to the treatment of partial functions in all programming languages. Gumb and the author say no, and suggest a way of handling the refractory cases by means of positive free logic. Finally, Antonelli's solution of a problem associated with the Gumb-Lambert proposal is mentioned.
A 'free logic' for singular terms with restrictions on existential generalization and universal instantiation is set out and argued for. Weaker logics, such as Lambert's FD and FD1, are held incapable of proving instances of Tarski's truth schema for languages containing non-denoting terms. Stronger logics, such as Scott's and Lambert's FD2, are held to yield false theorems when given natural interpretations. The logic defended conforms essentially to Russell's semantical intuitions. Some consequences are drawn for the theory of identity.
Free logic, an alternative to traditional logic, has been seen as a useful avenue of approach to a number of philosophical issues of contemporary interest. In this collection, Karel Lambert, one of the pioneers in, and the most prominent exponent of, free logic, brings together a variety of published essays bearing on the application of free logic to philosophical topics ranging from set theory and logic to metaphysics and the philosophy of religion. The work of such distinguished philosophers as Bas van Fraassen, Dana Scott, Tyler Burge, and Jaakko Hintikka is represented. Lambert provides an introductory essay placing free logic in the logical tradition beginning with Aristotle, developing it as the natural culmination of a trend begun in the Port Royal logic of the 1600s and continuing through current predicate logic: the trend to rid logic of existence assumptions. His introduction also provides a useful systematic overview of free logic, including both a standard syntax and some semantical options.
This article previously appeared in the Revue des Traditions Musicales du Monde Arabe et Méditerranéen, no. 6, Beirut, 2012, pp. 19-42. We thank Jean Lambert for authorising us to reproduce it here. Beyond its technical and scholarly aspects, the general reflection it pursues will interest all readers concerned with the relations between poetic metre and musical rhythm. The historical separation of studies of music (musicology) from studies of poetry (linguistics, poetics) has ...
Honderich claims that for a person to be perceptually conscious is for a world to exist. In the opening section, 'Consciousness and Existence', I determine what this means, and whether it could be true. In 'Honderich's Phenomenology', I show that Honderich's theory is essentially anticipated in the ideas and Ideas of Husserl. In the third section, 'Radical Interiority', I argue that although phenomenology putatively eschews ontology of mind, and Honderich construes his position as near-physicalism, Honderich's insights are only truths because we are spiritual substances.
According to Stephen Finlay, ‘A ought to X’ means that X-ing is more conducive to contextually salient ends than relevant alternatives. This in turn is analysed in terms of probability. I show why this theory of ‘ought’ is hard to square with a theory of a reason’s weight which could explain why ‘A ought to X’ logically entails that the balance of reasons favours that A X-es. I develop two theories of weight to illustrate my point. I first look at the prospects of a theory of weight based on expected utility theory. I then suggest a simpler theory. Although neither allows that ‘A ought to X’ logically entails that the balance of reasons favours that A X-es, this price may be accepted, for there remains a strong pragmatic relation between these claims.
[Stephen Makin] Aristotle draws two sets of distinctions in Metaphysics 9.2, first between non-rational and rational capacities, and second between one-way and two-way capacities. He then argues for three claims: [A] if a capacity is rational, then it is a two-way capacity; [B] if a capacity is non-rational, then it is a one-way capacity; [C] a two-way capacity is not indifferently related to the opposed outcomes to which it can give rise. I provide explanations of Aristotle's terminology, and of how [A]-[C] should be understood. I then offer a set of arguments intended to show that the Aristotelian claims are plausible. [Nicholas Denyer] In De Caelo 1.11-12 Aristotle argued that whatever is and always will be true is necessarily true. His argument works, once we grant him the highly plausible principle that if something is true, then it can be false if and only if it can come to be false. For example, assume it true that the sun is and always will be hot. No proposition of this form can ever come to be false. Hence this proposition cannot be false. Hence it is necessarily true, and so too is anything that follows from it. In particular, it is necessarily true that the sun is hot. Moreover, if the sun not only is and always will be hot, but also always has been, then it follows by similar reasoning that the sun not only cannot now fail to be hot, but also never could have failed. Anything everlastingly true is therefore, in the strictest sense of the term, necessarily true.
Is there a contradiction in Stephen Colbert’s attitudes towards race? How can he consistently claim to be colorblind and yet hold a national search for a new "black friend"? I argue that Stephen is trying to claim rights and shirk responsibilities on matters of race relations in America, and that his famous notion of "truthiness" is an extension of this attitude to other areas of social and political discourse.
Kalam cosmological arguments have recently been the subject of criticisms, at least inter alia, by physicists---Paul Davies, Stephen Hawking---and philosophers of science---Adolf Grünbaum. In a series of recent articles, William Craig has attempted to show that these criticisms are “superficial, ill-conceived, and based on misunderstanding.” I argue that, while some of the discussion of Davies and Hawking is not philosophically sophisticated, the points raised by Davies, Hawking and Grünbaum do suffice to undermine the dialectical efficacy of kalam cosmological arguments.
[This is a nearly final review of Stephen Davies's The Artful Species, for the British Journal of Aesthetics.] In this review, I outline the evolved roles of (a) emotions/drives/appetites, and (b) "telic" pleasure, which results from getting something beneficial. I argue that, contrary to Davies, aesthetic appreciation does not fall into either of the above categories (as normally understood). That is, aesthetic appreciation can neither be a drive to possess its object, nor pleasure in possessing that object. Rather, it is pleasure in contemplating that object. Evolutionary accounts fail, therefore, if they rely on the benefits provided by aesthetic objects. They must rather demonstrate the benefits of contemplating these objects.
J. H. Lambert proved important results of what we now think of as non-Euclidean geometries, and gave examples of surfaces satisfying their theorems. I use his philosophical views to explain why he did not think the certainty of Euclidean geometry was threatened by the development of what we regard as alternatives to it. Lambert holds that theories other than Euclid’s fall prey to skeptical doubt. So despite their satisfiability, for him these theories are not equal to Euclid’s in justification. Contrary to recent interpretations, then, Lambert does not conceive of mathematical justification as semantic. According to Lambert, Euclid overcomes doubt by means of postulates. Euclid’s theory thus owes its justification not to the existence of the surfaces that satisfy it, but to the postulates according to which these “models” are constructed. To understand Lambert’s view of postulates and the doubt they answer, I examine his criticism of Christian Wolff’s views. I argue that Lambert’s view reflects insight into traditional mathematical practice and has value as a foil for contemporary, model-theoretic views of justification.
I would like to thank the editors of Philosophy East and West for courteously asking me if I would like to respond to Matthew Dasti and Stephen Phillips' very thoughtful remarks about the review I wrote of Phillips' translation and commentary on the pratyakṣa chapter of Gaṅgeśa's Tattvacintāmaṇi, prepared in collaboration with N. S. Ramanuja Tatacharya (Phillips and Tatacharya 2004). Let me begin by reaffirming what I said at the beginning of my review, that the book is "a monumental and momentous achievement, one whose importance cannot be understated." I have enormous admiration for the magnitude of their achievement and respect for the contribution they have made through this translation to the field of ...
I argue that Stephen Houlgate misstates an element in the Kantian background to my reading of “Lordship and Bondage” (§2). He misreads my remarks about the need to see Hegel’s moves there in the context of the progression towards absolute knowing (§3), and, partly consequently, he fails to engage with the motivation for my reading (§4). And he does not understand the way my reading exploits the concept of allegory (§5).
This article is a critical review of Stephen Schiffer's monograph The Things We Mean. The text discusses some novel contributions made by Schiffer to the philosophy of meaning, in particular Schiffer's proposal for the reification of certain abstract entities and the application of his argument to the philosophical problem of vagueness in natural language. Special attention is paid both to Schiffer's ingenious use of the notion of conservative extension, here employed as a criterion for distinguishing legitimate from illegitimate reifications, and to Schiffer's notion of vague partial belief and its relation to standard partial belief. Schiffer's particular understanding of vagueness and its relation to the sorites paradox is also considered, with some remarks made concerning the relationship between these related philosophical problems and human perception. Key words: meaning, vagueness, sorites, perception, conservative extension, fictional entities.
After Meaning (1972) and The Remnants of Meaning (1987), The Things We Mean is Stephen Schiffer's third major work on the foundations of the theory of linguistic meaning. In simplest possible outline, the development started with a positive attempt to base a meaning theory on a modified Gricean account of utterance meaning, but took a negative turn, with the problems of belief sentences as a major reason for thinking that a systematic (compositional) semantic theory for natural language was not possible at all. In the recent book, things have again taken a more positive turn, but now constructive and destructive elements are mixed in complex ways in a complex account, rich in ideas and in detail, and a great challenge to the reader. It is not always obviously free of inner conflict. Nor can one always easily see how things hang together. I shall here try to accurately present the main ideas. Where my comments are not relegated to separate paragraphs, I mark the transition with a dash (-).
Note: The Simpsons, television's popular prime-time cartoon known for its satirical commentary on various social issues, recently took a shot at the creation-evolution debate by featuring Stephen Jay Gould prominently in one of its episodes. Here are Bill Dembski's review and observations of that episode.
It is now often taken for granted that facts are entia non grata, for there exists a powerful argument (dubbed the slingshot), backed by such great names as Frege, Gödel and Davidson (and so could hardly be wrong), that discredits their existence. There indeed is such an argument, and it indeed is not wrong in the straightforward sense of wrong. However, how far it knocks down any conception of facts is another story, a story which is anything but simple and perspicuous. In his book, Stephen Neale takes pains to excavate the origins of the argument and the presuppositions it needs if it is to be usable for the purpose of exorcising facts. In the introduction of the book, Neale expresses his conviction that his analysis of the slingshot will not only compromise its usability for the purpose of discrediting facts, but also save representationalist conceptions of language and mind from the attacks of antirepresentationalist philosophers like Davidson and Rorty. “Representational philosophy,” he claims, “survives the Davidson-Rorty onslaught because non-truth-functional logics and ontologies of facts, states of affairs, situations and propositions survive not only the actual arguments deployed against them, but also the most precise and powerful slingshot arguments that can be constructed.” However, what he does take his analyses to show is that “the most precise and powerful slingshot arguments demonstrate conclusively that the logical and ontological theories originally targeted must satisfy non-trivial conditions if they are to avoid logical or ontological collapse” (p. 12). The book starts with a discussion of the philosophy of Donald Davidson, who appears to have brought the slingshot argument to its current prominence within philosophical discussions. Here we encounter the first variant of the slingshot: consider two sentences φ and ψ and a proper name d, and consider the definite descriptions ‘the object x such that (x = d and φ)’ and ‘the object x such that (x = d and ψ)’ ...
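The collapse these definite descriptions generate can be sketched in a few steps. This is a standard Gödel-style reconstruction offered only for illustration; the step labels and the two preservation principles are assumptions of this sketch, not Neale's own formulation:

```latex
% A standard Godel-style slingshot sketch (illustrative reconstruction).
% Assume phi and psi are both true, and assume that the fact a sentence
% designates is preserved under (i) logical equivalence and
% (ii) substitution of co-denoting definite descriptions.
\begin{align*}
\text{1. } & \varphi
  && \text{assumption} \\
\text{2. } & \iota x\,(x = d \wedge \varphi) = d
  && \text{logically equivalent to 1} \\
\text{3. } & \iota x\,(x = d \wedge \psi) = d
  && \text{from 2: since } \psi \text{ is true, both descriptions denote } d \\
\text{4. } & \psi
  && \text{logically equivalent to 3}
\end{align*}
```

If fact-designation is preserved at each step, the fact that φ is identical to the fact that ψ; and since φ and ψ were arbitrary true sentences, all true sentences designate one and the same "Great Fact". A friend of facts must therefore reject preservation principle (i) or (ii), which is one way of putting the "non-trivial conditions" at issue.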
A standard method for refuting a set of claims is to show that it implies a contradiction. Stephen Clark questions this method on the grounds that the Law of Non-Contradiction, together with the other fundamental laws of logic, does not accord with everyday reality. He accounts for vagueness by suggesting that, for any vague predicate 'F', an ordinary object is typically to some extent both F and not-F, and that objects do not change abruptly from being F to being not-F. I challenge Clark's 'deconstruction' of logic, and show that, in characterizing vagueness and dealing with the associated Sorites paradox, we can accommodate his observation that change from being F to being not-F is ineradicably continuous without tampering with any fundamental logical laws.
I have argued previously that the art of absolute music, unlike, for example, the art of literature, is not capable of profundity, which I characterized as treating a profound subject matter, at the highest artistic level, in a manner appropriate to its profundity. Stephen Davies has recently argued that there is another way of being profound, which he calls non-propositional profundity, and for which chess provides his principal example. He argues, further, that absolute music also exhibits this non-propositional profundity. I argue in the present paper that Davies's attempt to rescue profundity for absolute music will not work, because it does not allow what I take to be the crucial distinction between great works of absolute music that are profound and great works of absolute music that are not. In other words, it has the unwelcome implication that all great works of absolute music are profound works.
This paper explores the significance of the concept of power/knowledge in educational theory. The argument proceeds in two main parts. In the first, I consider aspects of Stephen J. Ball's highly influential work in educational theory. I examine his reception of Foucault's concept of power/knowledge and suggest that there are problems in his adoption of Foucault's thought. These problems arise from the way that he settles interpretations into received ideas. Foucault's thought, I try to show, is not to be seen in a confined way. In the second part, I seek a different reading of Foucault's notion of power/knowledge in order to break with this tendency to confine, referring to the work of Gilles Deleuze. I draw particularly on Deleuze's thought of the outside as a means of manifesting the significance of power/knowledge in relation to processes of subjectification. At the end of the paper, I suggest how educational theory might be reconceived in the light of the potencies of power/knowledge that the paper has demonstrated.
Stephen Mumford's Dispositions is an interesting and thought-provoking addition to a recent surge of publications on the topic. Dispositions have not been such a hot topic since the heyday of behaviourism. But as Mumford argues in his first chapter, the importance of dispositions to contemporary philosophy can hardly be overstated. Dispositions are fundamental to causal role functionalism in the philosophy of mind, response-dependent truth-conditional accounts of moral and other concepts, capacity accounts of concepts more generally, theories of belief, the compatibilist conception of free will, the philosophy of matter, probability (propensities) and more. So it is natural that conceptual and ontological issues about dispositions have again come to the fore. The only surprise is that it's taken so long.
In this paper, the author analyzes critically some of the ideas found in Karel Lambert's recent book, Meinong and the Principle of Independence (Cambridge: Cambridge University Press, 1983). Lambert attempts to forge a link between the ideas of Meinong and the free logicians. The link comes in the form of a principle which, Lambert says, these philosophers adopt, namely, Mally's Principle of Independence, which Mally himself later abandoned. Instead of following Mally and attempting to formulate the principle in the material mode, as the claim that an object can have properties without having any sort of being, Lambert formulates the principle in the formal mode, as (something equivalent to) the rejection of the traditional constraint on the principle of predication. The principle of predication is that a formula of the form 'Fa' is true iff the general term 'F' is true of the object denoted by the object term 'a'. The traditional constraint on this predication principle is that for the sentence 'Fa' to be true, not only must the object term have a denotation, but it must also denote an object that has being. According to Lambert, the free logicians violate this constraint by suggesting that 'Fa' can be true even if the object term has no denotation, whereas Meinong violates it by proposing that 'Fa' can be true even when the object term denotes an object that has no being. Lambert then tries to 'vindicate' the Principle of Independence, thereby justifying both the work of the free logicians and Meinong.
Stephen Jay Gould’s monumental The Structure of Evolutionary Theory “attempts to expand and alter the premises of Darwinism, in order to build an enlarged and distinctive evolutionary theory ... while remaining within the tradition, and under the logic, of Darwinian argument.” The three branches or “fundamental principles of Darwinian logic” are, according to Gould: agency (natural selection acting on individual organisms), efficacy (producing new species adapted to their environments), and scope (accumulation of changes that through geological time yield the living world’s panoply of diversity and morphological complexity). Gould’s efforts to contribute something important to each of these three fundamental components of Darwinian theory are far from successful.
Review of Stephen Hawking and Leonard Mlodinow, The Grand Design, by Amitrajeet A. Batabyal (Department of Economics, Rochester Institute of Technology, Rochester, NY 14623-5604, USA). Journal of Agricultural and Environmental Ethics. DOI: 10.1007/s10806-010-9298-7. Online ISSN 1573-322X; Print ISSN 1187-7863.
I review this fine collection of articles on ancient ethics ranging from the Presocratics to Sextus Empiricus. Eight of the nine chapters are published here for the first time. Contributors include Charles H. Kahn on "Pre-Platonic Ethics," C. C. W. Taylor on "Platonic Ethics," Stephen Everson on "Aristotle on Nature and Value," John McDowell on "Some Issues in Aristotle's Moral Psychology," David Sedley on "The Inferential Foundations of Epicurean Ethics," T. H. Irwin on "Socratic Paradox and Stoic Theory," Julia Annas on "Doing Without Objective Values: Ancient and Modern Strategies," and Susan Sauvé Meyer on "Moral Responsibility: Aristotle and After." There is also an introductory essay by the editor, Stephen Everson. I summarize and then critique each chapter in this rather lengthy review.
I offer an interpretation of the connection between judging and intuiting in Kant (§2). Next I try to clarify how the movement in the self-consciousness chapter, as I read it, fits in the Phenomenology’s progression towards absolute knowing (§3). In some detailed responses to Stephen Houlgate, I reiterate how my reading is motivated by the wish not to discard, or ignore, Hegel’s first formulation of what is to be achieved by the movement in the self-consciousness chapter, and I object to Houlgate’s equation of thinking consciousness with Stoicism (§4). Finally, I try to clarify the point of my invocation of allegory (§5).
Mumford and Anjum’s Getting Causes from Powers is an ambitious and original contribution to the literature on causation, a welcome departure from Humean approaches which reductively analyze causation in terms of regularities or counterfactual conditionals. The authors develop an account of causation as the exercising of powers, a view they call “causal dispositionalism.” This critique of Getting Causes from Powers is organized around its central heuristic: the vector model of causation. On this model, vectors represent the exercising of powers that are operating upon a quality space. A quality space is a background against which events can occur, where two or more general properties are considered as possible for instantiation. A central line represents a starting point of a causal process, and vectors represent the powers in play. A vector is apt for representing a power because it has an intensity and a direction, indicated by its length and the property term at which it points (24). A resultant vector R is also depicted, indicating the extent to which all of the powers in play collectively dispose toward one of the properties in the quality space. A threshold may also be depicted, representing a point on the quality space that may be of particular pragmatic interest, the passing of which would count as disposing toward an effect in question. Mumford and Anjum make the bold claim that all things can be represented by vectors (45–46). This claim is supported by the following theses: everything has properties; properties are clusters of powers; powers have intensity and direction; vectors represent intensity and direction. Even granting these theses, there is still much that the vectors do not represent. I discuss three things that are not represented by the vector model, in increasing order of significance for the account generally.
Disagreements about the success of any given argument often arise because the suppositions of the critic differ from the suppositions of the author of the argument. In maintaining the plausibility of a metaethical argument for theism against the objections articulated by Stephen J. Sullivan, I will probe our differing suppositions with regard to the relation of theological to naturalistic metaethical theories, the starting point for the metaethical argument for theism, and the relation of the qualities of God's will to our obligation to obey God.
Stephen Schiffer, in his recent book, Remnants of Meaning, argues against the possibility of any compositional theory of meaning for natural language. Because the argument depends on the premise that there is no possible naturalistic reduction of the intentional to the physical, Schiffer's attack on theories of meaning is of central importance for theorists of mind. I respond to Schiffer's argument by showing that there is at least one reductive account of the mental that he has neglected to consider: the computationalist account known as the Representational Theory of Mind. Not only is this view immune from the criticisms Schiffer mounts against other reductivist theories, but it solves problems that arise on Schiffer's own non-reductive account of the relation between the intentional and the physical.
This paper takes as its focus one of the Edwardian period's most dramatic and little-understood paintings of a medical examination: George Washington Lambert's Chesham Street (1910). The painting shows an upper-class male patient lifting his shirt to reveal a muscular torso for examination by the doctor in the scene and the viewers outside it. The subject of a medical examination, I argue, legitimised the scrutiny of exposed male flesh and offered an opportunity for sensual pleasure between men. By way of a comparison with other portraits of the artist from around the same period, I interpret Chesham Street as a patient self-portrait, which reveals the artist's dual personalities of bohemian artist and Australian boxer: two personae that did not combine seamlessly, as revealed by the composite nature of the patient in Chesham Street. From a discussion of the artist as patient, I move to an analysis of other self-portraits by Lambert in which the artist is shown flexing his muscles, especially in the context of his passion for boxing. I consider how these portraits serve as complex inscriptions of illness and health and how this relates to the experience of living and working as an Australian expatriate artist in London in the early twentieth century.
The article begins from Stephen Hawking's well-known claim that philosophy is dead, and considers several other quotations in which philosophy is either belittled or subordinated outright to the natural sciences. This subordination requires a downward reductionism that is paralleled by the upward reductionism of the linguistic turn and social constructionist theories. Rather than undermining or overmining mid-sized individual entities, philosophy must deal with objects on their own terms. This suggests a possible tactical alliance between philosophy and the arts.
These papers are based on a Symposium at the COGSCI Conference in 2010. 1. Naturalizing the Mammalian Mind (Jaak Panksepp) 2. Modularity in Cognitive Psychology and Affective Neuroscience (Rami Gabriel) 3. Affective Neuroscience and the Philosophy of Self (Stephen Asma and Tom Greif) 4. Affective Neuroscience and Law (Glennon Curran and Rami Gabriel).
In his 2010 work, The Grand Design, Stephen Hawking argues that ‘… philosophy is dead’ (2010: 5). While not a philosopher, Hawking provides a strong argument for his thesis, principally that philosophers have not taken science sufficiently seriously and so Philosophy is no longer relevant to knowledge claims. In this paper, Hawking's claim is appraised and critiqued, becoming a meta-philosophical discussion. It is argued that Philosophy is dead, in some sense, due to particular philosophers having embarked on an intellectual path no longer in keeping with the ancient definition of Philosophy. Philosophy as the seeking of wisdom necessarily includes the consideration of findings of other intellectual pursuits, including physical and natural science. While Philosophy has justifiably evolved through its long history, is it unrecognisable in the terms by which it historically defined itself? Seeking consistency, Hawking is critiqued for appearing to practise ‘dead’ Philosophy. Indeed, Hawking's appeal to multiverse theory and his core discussion of the metaphysical problem of being are philosophical. The question of the death of Philosophy has contemporary relevance for the discipline, which is particularly under threat for its survival in the academy, oftentimes assumed to be irrelevant.
This paper explores some of the benefits informal logic may have for the analysis of mathematical inference. It shows how Stephen Toulmin’s pioneering treatment of defeasible argumentation may be extended to cover the more complex structure of mathematical proof. Several common proof techniques are represented, including induction, proof by cases, and proof by contradiction. Affinities between the resulting system and Imre Lakatos’s discussion of mathematical proof are then explored.