Many historians of the calculus deny significant continuity between the infinitesimal calculus of the seventeenth century and twentieth-century developments such as Robinson’s theory. Robinson’s hyperreals, while providing a consistent theory of infinitesimals, require the resources of modern logic; thus many commentators are comfortable denying a historical continuity. A notable exception is Robinson himself, whose identification with the Leibnizian tradition inspired Lakatos, Laugwitz, and others to consider the history of the infinitesimal in a more favorable light. In spite of his Leibnizian sympathies, Robinson regards Berkeley’s criticisms of the infinitesimal calculus as aptly demonstrating the inconsistency of reasoning with historical infinitesimal magnitudes. We argue that Robinson, among others, overestimates the force of Berkeley’s criticisms by underestimating the mathematical and philosophical resources available to Leibniz. Leibniz’s infinitesimals are fictions: not logical fictions, as Ishiguro proposed, but rather pure fictions, like imaginaries, which are not eliminable by some syncategorematic paraphrase. We argue that Leibniz’s defense of infinitesimals is more firmly grounded than Berkeley’s criticism thereof. We show, moreover, that Leibniz’s system for differential calculus was free of logical fallacies. Our argument strengthens the conception of modern infinitesimals as a development of Leibniz’s strategy of relating inassignable to assignable quantities by means of his transcendental law of homogeneity.
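A minimal worked illustration of the transcendental law of homogeneity (a reconstruction in modern notation, not Leibniz’s own symbolism): in forming the differential of a product, the term of higher order in the differentials is discarded as incomparable with the first-order terms.

```latex
\[
  d(uv) \;=\; (u+du)(v+dv) - uv \;=\; u\,dv + v\,du + du\,dv
  \;\;\longrightarrow\;\; d(uv) \;=\; u\,dv + v\,du,
\]
% the term du\,dv being discarded by the transcendental law of homogeneity;
% a modern proxy for the discarding step is the standard part function
% st(.) on the hyperreals.
```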
We apply Benacerraf’s distinction between mathematical ontology and mathematical practice to examine contrasting interpretations of infinitesimal mathematics of the seventeenth and eighteenth centuries in the work of Bos, Ferraro, Laugwitz, and others. We detect Weierstrass’s ghost behind some of the received historiography on Euler’s infinitesimal mathematics, as when Ferraro proposes to understand Euler in terms of a Weierstrassian notion of limit and Fraser declares classical analysis to be a “primary point of reference for understanding the eighteenth-century theories.” Meanwhile, scholars like Bos and Laugwitz seek to explore Eulerian methodology, practice, and procedures in a way more faithful to Euler’s own. Euler’s use of infinite integers and the associated infinite products are analyzed in the context of his infinite product decomposition for the sine function. Euler’s principle of cancellation is compared to the Leibnizian transcendental law of homogeneity. The Leibnizian law of continuity similarly finds echoes in Euler. We argue that Ferraro’s assumption that Euler worked with a classical notion of quantity is symptomatic of a post-Weierstrassian placement of Euler in the Archimedean track for the development of analysis, as well as of a blurring of the distinction between the dual tracks noted by Bos. Interpreting Euler in an Archimedean conceptual framework obscures important aspects of Euler’s work. Such a framework is profitably replaced by a syntactically more versatile modern infinitesimal framework that provides better proxies for his inferential moves.
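For reference, the product decomposition in question, stated in modern notation (Euler reached it by treating the sine as a polynomial of infinite degree factored over its roots 0, ±π, ±2π, …):

```latex
\[
  \sin x \;=\; x \prod_{k=1}^{\infty} \left( 1 - \frac{x^{2}}{k^{2}\pi^{2}} \right).
\]
% Comparing the coefficient of x^3 with the series expansion
% \sin x = x - x^3/3! + \cdots yields \sum_{k \ge 1} 1/k^2 = \pi^2/6
% (the Basel problem).
```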
The widespread idea that infinitesimals were “eliminated” by the “great triumvirate” of Cantor, Dedekind, and Weierstrass is refuted by an uninterrupted chain of work on infinitesimal-enriched number systems. The elimination claim is an oversimplification created by triumvirate followers, who tend to view the history of analysis as a pre-ordained march toward the radiant future of Weierstrassian epsilontics. In the present text, we document distortions of the history of analysis stemming from the triumvirate ideology of ontological minimalism, which identified the continuum with a single number system. Such anachronistic distortions characterize the received interpretation of Stevin, Leibniz, d’Alembert, Cauchy, and others.
Did Leibniz exploit infinitesimals and infinities à la rigueur, or only as shorthand for quantified propositions that refer to ordinary Archimedean magnitudes? Hidé Ishiguro defends the latter position, which she reformulates in terms of Russellian logical fictions. Ishiguro does not explain how to reconcile this interpretation with Leibniz’s repeated assertions that infinitesimals violate the Archimedean property (i.e., Euclid’s Elements, V.4). We present textual evidence from Leibniz, as well as historical evidence from the early decades of the calculus, to undermine Ishiguro’s interpretation. Leibniz frequently writes that his infinitesimals are useful fictions, and we agree, but we show that it is best not to understand them as logical fictions; instead, they are better understood as pure fictions.
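The Archimedean property at issue (Euclid’s Elements, V.4) can be stated in modern notation as follows; a positive infinitesimal is precisely a quantity that violates it:

```latex
\[
  \text{Archimedean: } \forall x, y > 0 \;\exists n \in \mathbb{N} : \; nx > y;
  \qquad
  \text{infinitesimal } \varepsilon > 0 : \; n\varepsilon < 1 \text{ for all } n \in \mathbb{N}.
\]
```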
Recent accounts of the role of diagrams in mathematical reasoning take a Platonic line, according to which the proof depends on the similarity between the perceived shape of the diagram and the shape of the abstract object. This approach is unable to explain proofs that share the same diagram while drawing conclusions about different figures. Saccheri’s use of the bi-rectangular isosceles quadrilateral in Euclides Vindicatus provides three such proofs. By forsaking abstract objects it is possible to give a natural explanation of Saccheri’s proofs, as well as of standard geometric proofs and even number-theoretic proofs.
Hume describes the sciences as "noble entertainments" that are "proper food and nourishment" for reasonable beings (EHU 1.5-6; SBN 8). But mathematics, in particular, is more than noble entertainment; for millennia, agriculture, building, commerce, and other sciences have depended upon applying mathematics. In simpler cases, applied mathematics consists in inferring one matter of fact from another, say, the area of a floor from its length and width. In more sophisticated cases, applied mathematics consists in giving scientific theory a mathematical form and then explaining and predicting matters of fact by means of mathematics and the theory. Since Hume holds that "All inferences from experience are..."
Abraham Robinson’s framework for modern infinitesimals was developed half a century ago. It enables a re-evaluation of the procedures of the pioneers of mathematical analysis. Their procedures have often been viewed through the lens of the success of the Weierstrassian foundations. We propose a view that does not pass through that lens, by means of proxies for such procedures in the modern theory of infinitesimals. The real accomplishments of calculus and analysis were based primarily on the elaboration of novel techniques for solving problems rather than on a quest for ultimate foundations. It may be hopeless to interpret historical foundations in terms of a punctiform continuum, but arguably it is possible to interpret historical techniques and procedures in terms of modern ones. Our proposed formalisations do not mean that Fermat, Gregory, Leibniz, Euler, and Cauchy were pre-Robinsonians, but rather indicate that Robinson’s framework is more helpful in understanding their procedures than a Weierstrassian framework.
In relation to a thesis put forward by Marx Wartofsky, we seek to show that a historiography of mathematics requires an analysis of the ontology of the part of mathematics under scrutiny. Following Ian Hacking, we point out that in the history of mathematics the amount of contingency is larger than is usually thought. As a case study, we analyze historians’ approaches to interpreting James Gregory’s expression “ultimate terms” in his paper attempting to prove the irrationality of π. Here Gregory referred to the last or ultimate terms of a series. More broadly, we analyze the following questions: which modern framework is more appropriate for interpreting the procedures at work in texts from the early history of infinitesimal analysis? And, relatedly: what logical theory is close to something early modern mathematicians could have used when studying infinite series and quadrature problems? We argue that what has been routinely viewed from the viewpoint of classical analysis as an example of an “unrigorous” practice in fact finds close procedural proxies in modern infinitesimal theories. We analyze the mix of social and religious reasons that led to the suppression, in the late 1660s, of both the religious order of Gregory’s teacher degli Angeli and Gregory’s books at Venice.
Foundations of Science recently published a rebuttal to a portion of our essay it published two years ago. The author, G. Schubring, argues that our 2013 text treated his 2005 book, Conflicts between Generalization, Rigor, and Intuition, unfairly. He further argues that our attempt to show that Cauchy is part of a long infinitesimalist tradition confuses text with context and thereby misunderstands the significance of Cauchy’s use of infinitesimals. Here we defend our original analysis of various misconceptions and misinterpretations concerning the history of infinitesimals and, in particular, the role of infinitesimals in Cauchy’s mathematics. We show that Schubring misinterprets Proclus, Leibniz, and Klein on non-Archimedean issues, ignores the Jesuit context of Moigno’s flawed critique of infinitesimals, and misrepresents, to the point of caricature, the pioneering Cauchy scholarship of D. Laugwitz.
We examine Paul Halmos’ comments on category theory, Dedekind cuts, devil worship, logic, and Robinson’s infinitesimals. Halmos’ scepticism about category theory derives from his philosophical position of naive set-theoretic realism. In the words of an MAA biography, Halmos thought that mathematics is “certainty” and “architecture,” yet 20th-century logic teaches us that mathematics is full of uncertainty, or more precisely incompleteness. If the term architecture is meant to imply that mathematics is one great solid castle, then modern logic tends to teach us the opposite lesson, namely that the castle is floating in midair. Halmos’ realism tends to color his judgment of purely scientific aspects of logic and the way it is practiced and applied. He often expressed distaste for nonstandard models, and made a sustained effort to eliminate first-order logic, the logicians’ concept of interpretation, and the syntactic vs. semantic distinction. He felt that these were vague, and sought to replace them all by his polyadic algebra. Halmos claimed that Robinson’s framework is “unnecessary,” but Henson and Keisler argue that Robinson’s framework allows one to dig deeper into set-theoretic resources than is common in Archimedean mathematics. This can potentially prove theorems not accessible by standard methods, undermining Halmos’ criticisms.
Psychologists debate whether mental attributes can be quantified or whether they admit only qualitative comparisons of more and less. Their disagreement is not merely terminological, for it bears upon the permissibility of various statistical techniques. This article contributes to the discussion in two stages. First, it explains how temperature, which was originally a qualitative concept, came to occupy its position as an unquestionably quantitative concept (§§1–4). Specifically, it lays out the circumstances in which thermometers, which register quantitative (or cardinal) differences, became distinguishable from thermoscopes, which register merely qualitative (or ordinal) differences. The article argues that this distinction became possible thanks to the work of Joseph Black, ca. 1760. Second, it contends that the model implicit in temperature’s quantitative status offers a better way of thinking about the quantitative status of mental attributes than models from measurement theory (§§5–6).
To explore the extent of embeddability of Leibnizian infinitesimal calculus in first-order logic (FOL) and modern frameworks, we propose to set aside ontological issues and focus on procedural questions. This would enable an account of Leibnizian procedures in a framework limited to FOL with a small number of additional ingredients, such as the relation of infinite proximity. If, as we argue here, first-order logic is indeed suitable for developing modern proxies for the inferential moves found in Leibnizian infinitesimal calculus, then modern infinitesimal frameworks are more appropriate for interpreting Leibnizian infinitesimal calculus than modern Weierstrassian ones.
Cauchy’s sum theorem is a prototype of what is today a basic result on the convergence of a series of functions in undergraduate analysis. We seek to interpret Cauchy’s proof, and discuss the related epistemological questions involved in comparing distinct interpretive paradigms. Cauchy’s proof is often interpreted in the modern framework of a Weierstrassian paradigm. We analyze Cauchy’s proof closely and show that it finds closer proxies in a different modern framework.
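For orientation, here is the standard modern (Weierstrassian) rendering of the sum theorem, with uniform convergence as the hypothesis that reading supplies; this is one interpretive paradigm, not a claim about Cauchy’s own hypotheses:

```latex
\[
  \text{If each } f_n \text{ is continuous on } [a,b]
  \text{ and } \sum_{n=1}^{\infty} f_n \text{ converges uniformly on } [a,b],
  \text{ then } S(x) = \sum_{n=1}^{\infty} f_n(x) \text{ is continuous on } [a,b].
\]
```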
Alexander’s "Infinitesimal: How a Dangerous Mathematical Theory Shaped the Modern World" (London: Oneworld Publications, 2015) is right to argue that the Jesuits had a chilling effect on Italian mathematics, but I question his account of the Jesuit motivations for suppressing indivisibles. Alexander alleges that the Jesuits’ intransigent commitment to Aristotle and Euclid explains their opposition to the method of indivisibles. A different hypothesis, which Alexander doesn’t pursue, is a conflict between the method of indivisibles and the Catholic doctrine of the Eucharist. This is a pity, for the Eucharist hypothesis has advantages over the appeal to a Jesuit commitment to Aristotle and Euclid. The method of indivisibles developed in the course of the seventeenth century, and those who developed it ‘beyond the Alps’ relied upon Aristotelian and Euclidean ideals. Alexander’s failure to recognize the importance of Aristotle and Euclid for the development of the method of indivisibles arises from an unwarranted conflation of indivisibles and infinitesimals. Once indivisibles and infinitesimals are distinguished, we observe that the development of the method of indivisibles exhibits an unmistakable sympathy for Aristotle and Euclid. Thus, it makes sense to consider an alternative explanation for the Jesuit abhorrence of indivisibles. And indeed, indivisibles, but not infinitesimals, conflict with the doctrine of the Eucharist, the central dogma of the Church.
Classical logic yields counterintuitive results for numerous propositional argument forms. The usual alternatives (modal logic, relevance logic, etc.) generate counterintuitive results of their own. The counterintuitive results create problems—especially pedagogical problems—for informal logicians who wish to use formal logic to analyze ordinary argumentation. This paper presents a system, PL– (propositional logic minus the funny business), based on the idea that paradigmatic valid argument forms arise from justificatory or explanatory discourse. PL– avoids the pedagogical difficulties without sacrificing insight into argument.
Mathematics used to be portrayed as a deductive science. Stemming from Polya, however, is a philosophical movement which broadens the concept of mathematical reasoning to include inductive or quasi-empirical methods. Interest in inductive methods is a welcome turn from foundationalism toward a philosophy grounded in mathematical practice. Regrettably, though, the conception of mathematical reasoning embraced by quasi-empiricists is still too narrow to include the sort of thought-experiment which Mueller describes as traditional mathematical proof and which Lakatos examines in Proofs and Refutations. This paper extends the concept of mathematical reasoning along two further dimensions to accommodate thought-experiment.
In a ground-breaking essay, Nagel contended that the controversy over impossible numbers influenced the development of modern logic. I maintain that Nagel was correct in outline only. He overlooked the fact that the controversy engendered a new account of reasoning, one in which the concept of a well-made language played a decisive role. Focusing on the new account of reasoning changes the story considerably and reveals important but unnoticed similarities between the development of algebraic logic and that of quantificational logic.
Professor Grünbaum's much-discussed refutation of Zeno's metrical paradox turns out to be ad hoc upon close examination of the relevant portion of measure theory. Although the modern theory of measure is able to defuse Zeno's reasoning, it is not capable of refuting Zeno in the sense of showing his error. I explain why the paradox is not refutable and argue that it is consequently more than a mere sophism.
In formal logic there is a premium on clever paraphrase, for it subsumes troublesome inferences under a familiar theory. (A paradigm is Davidson's (1967) analysis of inferences like 'He buttered his toast with a knife; so, he buttered his toast'.) But the need for paraphrase in formal logic runs deeper than the odd recalcitrant inference, and thus, I shall argue, commits logicians to some interesting consequences. First, the thesis that arguments are valid in virtue of their form must be severely qualified (§4). And second, it is misleading to view a formal logical theory as a standard for justifying and criticizing inference (§7). The latter point depends on the nature and role of paraphrase, which permit a range of conflicting logical theories. Conflicting logical theories arise from the conflicting goals of logical theorists, and the promiscuous nature of paraphrase makes reconciliation impossible.
A plausible and popular rule governing the scope of truth-functional logic is shown to be inadequate. The argument appeals to the existence of truth-functional paraphrases which are logically independent of their natural language counterparts. A more adequate rule is proposed.
The first half of the 17th century was a time of intellectual ferment when wars of natural philosophy were echoes of religious wars, as we illustrate by a case study of adequality, an apparently innocuous mathematical technique pioneered by the honorable judge Pierre de Fermat, and of its relation to indivisibles as well as to other hocus-pocus. André Weil noted that simple applications of adequality involving polynomials can be treated purely algebraically, but more general problems like the cycloid curve cannot be so treated and involve additional tools, leading the mathematician Fermat potentially into troubled waters. Breger attacks Tannery for tampering with Fermat’s manuscript, but it is Breger who tampers with Fermat’s procedure by moving all terms to the left-hand side so as to accord better with Breger’s own interpretation emphasizing the double-root idea. We provide modern proxies for Fermat’s procedures in terms of relations of infinite proximity as well as the standard part function.
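As a concrete illustration of the standard-part proxy mentioned above (a sketch of my own, not Fermat's procedure nor the formalization in the paper), dual numbers a + b·e with e² = 0 capture algebraically the adequality step of dividing by e and then discarding the terms still containing e:

```python
# Sketch: a Fermat-style adequality step modeled with dual numbers
# a + b*e, where e*e = 0. Illustrative proxy; class and function
# names here are my own, not historical or from the paper.

class Dual:
    """A number a + b*e, with e a formal infinitesimal satisfying e*e = 0."""
    def __init__(self, a, b=0.0):
        self.a, self.b = a, b

    def _coerce(self, o):
        return o if isinstance(o, Dual) else Dual(o)

    def __add__(self, o):
        o = self._coerce(o)
        return Dual(self.a + o.a, self.b + o.b)
    __radd__ = __add__

    def __sub__(self, o):
        o = self._coerce(o)
        return Dual(self.a - o.a, self.b - o.b)

    def __rsub__(self, o):
        return self._coerce(o) - self

    def __mul__(self, o):
        o = self._coerce(o)
        # (a1 + b1 e)(a2 + b2 e) = a1 a2 + (a1 b2 + b1 a2) e, since e*e = 0.
        return Dual(self.a * o.a, self.a * o.b + self.b * o.a)
    __rmul__ = __mul__


def adequal_slope(f, x):
    """(f(x + e) - f(x)) / e with the leftover e-terms discarded:
    the 'standard part' of the difference quotient."""
    return f(Dual(x, 1.0)).b


# Fermat's classic example: maximize f(x) = x * (b - x) for given b.
b = 10.0
f = lambda x: x * (b - x)
print(adequal_slope(f, 5.0))  # 0.0: the slope vanishes at x = b/2
print(adequal_slope(f, 3.0))  # 4.0: equals b - 2x at x = 3
```

Setting the adequal slope to zero recovers Fermat's conclusion that the product is maximal when the segment is bisected, without any limit concept.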