Proposition I.1 is, by far, the most popular example used to justify the thesis that many of Euclid’s geometric arguments are diagram-based. Many scholars have recently articulated this thesis in different ways and argued for it. My purpose is to reformulate it in a quite general way, by describing what I take to be the twofold role that diagrams play in Euclid’s plane geometry (EPG). Euclid’s arguments are object-dependent: they are about geometric objects. Hence, they cannot be diagram-based unless diagrams are supposed to have an appropriate relation with these objects. I take this relation to be a quite peculiar sort of representation. Its peculiarity depends on the two following claims that I shall argue for: (i) the identity conditions of EPG objects are provided by the identity conditions of the diagrams that represent them; (ii) EPG objects inherit some properties and relations from these diagrams.
Diagrams are ubiquitous in mathematics. From the most elementary class to the most advanced seminar, in both introductory textbooks and professional journals, diagrams are present, to introduce concepts, increase understanding, and prove results. They thus fulfill a variety of important roles in mathematical practice. Long overlooked by philosophers focused on foundational and ontological issues, these roles have come to receive attention in the past two decades, a trend in line with the growing philosophical interest in actual mathematical practice.
Both Frege's Grundgesetze and Lagrange's treatises on analytical functions pursue a foundational purpose. Still, the former's program is not only crucially different from the latter's; it also depends on a different idea of what a foundation of mathematics should be like. Despite this contrast, the notion of function plays similar roles in their respective programs. The purpose of my paper is to emphasise this similarity. In doing so, I hope to contribute to a better understanding of Frege's logicism, especially in relation to its crucial differences with a set-theoretic foundational perspective. This should also shed some light on a question raised by J. Hintikka and G. Sandu in a widely discussed paper, namely whether Frege should or should not be credited with the notion of arbitrary function underlying our standard interpretation of second-order logic. In section 1, I account for Lagrange's notion of function. In section 2, I advance some remarks on connected historical matters. This will provide an appropriate framework for discussing the role played by the notion of function in Frege's Grundgesetze. Section 3 is devoted to this. Some concluding remarks will close the paper.
The indispensability argument (IA) comes in many different versions, all of which reduce to a general valid schema. Providing a sound IA amounts to providing a full interpretation of the schema according to which all its premises are true. Hence, arguing whether IA is sound comes down to wondering whether the schema admits such an interpretation. We discuss in full detail all the parameters on which the specification of the general schema may depend. In doing this, we consider how different versions of IA can be obtained, also through different specifications of the notion of indispensability. We then distinguish between schematic and genuine IA, and argue that no genuine sound IA is available or easily forthcoming. We finally submit that this holds also in the particularly relevant case in which indispensability is conceived as explanatory indispensability.
Most of the arguments usually appealed to in order to support the view that some abstraction principles are analytic depend on ascribing to them some sort of existential parsimony or ontological neutrality, whereas the opposite arguments, aiming to deny this view, contest this ascription. As a result, other virtues that these principles might have are often overlooked. Among them, there is an epistemic virtue which I take these principles to have, when regarded in the appropriate settings, and which I suggest calling ‘epistemic economy’. My purpose is to isolate and clarify this notion by appealing to some examples concerning the definition of natural and real numbers.
The aim I am pursuing here is to describe some general aspects of mathematical proofs. In my view, a mathematical proof is a warrant to assert a non-tautological statement which claims that certain objects (possibly a certain object) enjoy a certain property. Because it is proved, such a statement is a mathematical theorem. In my view, in order to understand the nature of a mathematical proof it is necessary to understand the nature of mathematical objects. If we understand them as external entities whose 'existence' is independent of us, and if we think that their enjoying certain properties is a fact, then we should argue that a theorem is a statement that claims that this fact occurs. If we also maintain that a mathematical proof is internal to a mathematical theory, then it becomes very difficult indeed to explain how a proof can be a warrant for such a statement. This is the essential content of a dilemma set forth by P. Benacerraf (cf. Benacerraf 1973). Such a dilemma, however, is dissolved if we understand mathematical objects as internal constructions of mathematical theories and think that they enjoy certain properties just because a mathematical theorem claims that they enjoy them.
As a reply to the commentary (Humphreys in Found Sci, 2012), we explore the methodological implications of seeing artificial neural networks as generic classification tools, we show in which sense the use of descriptions and models in data analysis is not equivalent to the original empirical use of epicycles in describing planetary motion, and we argue that agnostic science is essentially related to the type of problems we ask about a phenomenon and to the processes used to find answers.
As a reply to the commentary (Lenhard in Found Sci, 2012), we stress here that structural understanding of data analysis techniques is the natural counterpart to the lack of understanding of phenomena in agnostic science. We suggest, moreover, that the dynamics of computational processes, and their parallels with the dynamics of natural processes, may increasingly become the driving force of the development of data analysis.
François Viète is considered the father both of modern algebra and of modern cryptanalysis. The paper outlines Viète’s major contributions in these two mathematical fields and argues that, despite an obvious parallel between them, there is an essential difference. Viète’s ‘new algebra’ relies on his reform of the classical method of analysis and synthesis, in particular on a new conception of analysis and the introduction of a new formalism. The procedures he suggests to decrypt coded messages are particular forms of analysis based on the use of formal methods. However, Viète’s algebraic analysis is not an analysis in the same sense as his cryptanalysis is. In Aristotelian terms, the first is a form of 'analysis', while the second is a form of 'diairesis'. While the first is a top-down argument from the point of view of the human subject, since it is an argument going from what is not actual to what is actual for such a subject, the second is a bottom-up argument from this same point of view, since it starts from what is first for us and proceeds towards what is first by nature.
Keywords: Analysis; Cryptanalysis; Algebra; Aristotle; Viète.
Frege's definition of the real numbers, as envisaged in the second volume of Grundgesetze der Arithmetik, is fatally flawed by the inconsistency of Frege's ill-fated Basic Law V. We restate Frege's definition in a consistent logical framework and investigate whether it can provide a logical foundation of real analysis. Our conclusion will deem it doubtful that such a foundation along the lines of Frege's own indications is possible at all.
This paper aims at clarifying the nature of Frege's system of logic, as presented in the first volume of the Grundgesetze. We undertake a rational reconstruction of this system, by distinguishing its propositional and predicate fragments. This allows us to emphasise the differences and similarities between this system and a modern system of classical second-order logic.
In the Tractatus, it is stated that questions about logical formatting cannot be meaningfully formulated, since it is precisely the application of logical rules which enables the formulation of any question whatsoever; analogously, Wittgenstein’s celebrated infinite regress argument on rule-following seems to undermine any explanation of deduction as relying on a logical argument. On the other hand, some recent mathematical developments of the Curry-Howard bridge between proof theory and type theory address the issue of describing the “subjective” side of logic, that is, the concrete manipulation of rules and proofs in space and time. It is advocated that such developments can shed some light on the question of logical formatting and its apparently unintelligible paradoxes, thus reconsidering Wittgenstein’s verdict.
In this paper we discuss the changing role of mathematics in science, as a way to bring out some methodological trends at work in big data science. More specifically, we show how the role of mathematics has changed dramatically from its classical conception. Classically, any application of mathematical techniques requires a previous understanding of the phenomena and of the mutual relations among the relevant data; modern data analysis appeals, instead, to mathematics in order to identify possible invariants uniquely attached to the specific questions we may ask about the phenomena of interest. In other terms, the new paradigm for the application of mathematics does not require any understanding of the phenomenon, but rather relies on mathematics to organize data in such a way as to reveal possible invariants that may or may not provide further understanding of the phenomenon per se, but that nevertheless provide an answer to the relevant question.
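The contrast just described, between modeling a phenomenon and letting mathematics extract an invariant directly from the data, can be given a minimal illustrative sketch (entirely my own; the function name and the data-generating process are hypothetical, not taken from the paper). The analysis below knows nothing about how the numbers were produced, yet a purely mathematical operation on their covariance reveals a stable structural invariant that answers a specific question about the data:

```python
import math
import random

def principal_direction(points):
    """Extract the dominant direction of a 2-D point cloud from its
    covariance matrix alone: no model of the data-generating phenomenon
    is used, only the mathematics applied to the data themselves."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points) / n
    syy = sum((p[1] - my) ** 2 for p in points) / n
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points) / n
    # largest eigenvalue of [[sxx, sxy], [sxy, syy]] and its eigenvector
    lam = (sxx + syy) / 2 + math.sqrt(((sxx - syy) / 2) ** 2 + sxy ** 2)
    vx, vy = lam - syy, sxy
    norm = math.hypot(vx, vy)
    return vx / norm, vy / norm

# Data produced by some process the analyst does not model; only the
# numbers are visible. (Here, secretly, y is about twice x plus noise.)
random.seed(1)
data = [(t, 2 * t + random.gauss(0, 0.1))
        for t in [random.uniform(0, 1) for _ in range(200)]]
ux, uy = principal_direction(data)
print(uy / ux)  # close to 2: an invariant recovered without any model
```

The point of the sketch is only that the slope is recovered as a by-product of organizing the data mathematically, not from any prior understanding of the process that generated them.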
Breathing fresh air into the philosophy of mathematics. Marco Panza (IHPST, 13, rue du Four, 75006 Paris, France). Metascience. DOI: 10.1007/s11016-010-9470-8. Online ISSN 1467-9981; Print ISSN 0815-0796.
Editorial Note: The following Discussion Note is an edited transcription of the discussion on G. Aldo Antonelli’s paper “Semantic Nominalism: How I Learned to Stop Worrying and Love Universals”, held among participants at the IHPST-UC Davis Workshop Ontological Commitment in Mathematics, which took place, in memoriam of Aldo Antonelli, at IHPST in Paris on December 14–15, 2015. The note’s and the volume’s editors would like to thank all participants in the discussion for their contributions, and Alberto Naibo, Michael Wright, and the personnel at IHPST for their technical support.
Since the application of Postulate I.2 in Euclid’s Elements is not uniform, one could wonder in what way it should be applied in Euclid’s plane geometry. Besides legitimizing questions like this from the perspective of a philosophy of mathematical practice, we sketch a general perspective of conceptual analysis of mathematical texts, which involves an extended notion of mathematical theory as a system of authorizations, and an audience-dependent notion of proof.
Recent discussions on Fregean and neo-Fregean foundations for arithmetic and real analysis pay much attention to what is called either ‘Application Constraint’ (AC) or ‘Frege Constraint’ (FC), the requirement that a mathematical theory be so outlined that it immediately allows explaining its applicability. We distinguish between two constraints, which we denote by these two names respectively, by showing how AC generalizes Frege’s views while FC comes closer to his original conceptions. Different authors diverge on the interpretation of FC and on whether it applies to definitions of both natural and real numbers. Our aim is to trace the origins of FC and to explore how different understandings of it can be faithful to Frege’s views about such definitions and to his foundational program. After rehearsing the essential elements of the relevant debate, we appropriately distinguish AC from FC. We discuss six rationales which may motivate the adoption of different instances of AC and FC. We turn to the possible interpretations of FC, and advance a Semantic FC, arguing that while it suits Frege’s definition of natural numbers, it cannot reasonably be imposed on definitions of real numbers, for reasons only partly similar to those offered by Crispin Wright. We then rehearse a recent exchange between Bob Hale and Vadim Batitzky to shed light on Frege’s conception of real numbers and magnitudes. We argue that an Architectonic version of FC is indeed faithful to Frege’s definition of real numbers, and compatible with his views on natural ones. Finally, we consider how attributing different instances of FC to Frege and appreciating the role of the Architectonic FC can provide a more perspicuous understanding of his foundational program, by questioning common pictures of his logicism.
We reconstruct essential features of Lagrange’s theory of analytical functions by exhibiting its structure and basic assumptions, as well as its main shortcomings. We explain Lagrange’s notions of function and algebraic quantity, and we concentrate on power-series expansions, on the algorithm for derivative functions, and the remainder theorem—especially on the role this theorem has in solving geometric and mechanical problems. We thus aim to provide a better understanding of Enlightenment mathematics and to show that the foundations of mathematics did not, for Lagrange, concern the solidity of its ultimate bases, but rather purity of method—the generality and internal organization of the discipline.
A salient feature of Nāʿim ibn Mūsā’s treatise, recently edited and translated into French by Roshdi Rashed and Christian Houzel and connected with Thābit ibn Qurra’s circle, is its large use of a form of inferences that can be said, in a sense that will be explained, to be algebraic. They occur both in proofs of theorems and in solutions of problems. In the latter case, they enter different sorts of problematic analyses that are mainly used to reduce the geometrical problems they are concerned with to al-Khwārizmī’s equations.
The paper argues that Frege’s primary foundational purpose concerning arithmetic was neither that of making natural numbers logical objects, nor that of making arithmetic a part of logic, but rather that of assigning to it an appropriate place in the architectonics of mathematics and knowledge, by immersing it in a theory of numbers of concepts and making truths about natural numbers, and/or knowledge of them, transparent to reason without the medium of the senses and intuition.
We argue that many optimization methods can be viewed as representatives of “forcing”, a methodological approach that attempts to bridge the gap between data and mathematics on the basis of an a priori trust in the power of a mathematical technique, even when detailed, credible models of a phenomenon are lacking or do not justify the use of this technique. In particular, we show that forcing is implied in particle swarm optimization methods, and in the modeling of image processing problems through optimization. From these considerations, we extrapolate a principle for general data analysis methods, which we call ‘Brandt’s principle’: the assumption that an algorithm that approaches a steady state in its output has either found a solution to a problem or needs to be replaced. We finally propose that biological systems, and other phenomena that respect general rules of morphogenesis, are a natural setting for the application of this principle.
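Read operationally, the principle just stated can be sketched in a few lines (a toy illustration under my own assumptions; the coefficients, tolerances, and test function below are arbitrary choices, not taken from the paper). A rudimentary particle swarm is iterated until its best value stops improving; the stalled output is then accepted as a solution, with no model of the underlying phenomenon justifying the stopping rule:

```python
import random

def brandt_stop(objective, n_particles=20, dim=2, tol=1e-9,
                patience=50, max_iter=5000):
    """Minimal particle swarm; stops when the best value reaches a steady
    state (Brandt's principle: a stalled output is taken as a solution)."""
    pos = [[random.uniform(-5, 5) for _ in range(dim)]
           for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    best = [p[:] for p in pos]               # per-particle best positions
    gbest = min(best, key=objective)[:]      # global best position
    stalled = 0
    for _ in range(max_iter):
        prev = objective(gbest)
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * r1 * (best[i][d] - pos[i][d])
                             + 1.5 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if objective(pos[i]) < objective(best[i]):
                best[i] = pos[i][:]
                if objective(best[i]) < objective(gbest):
                    gbest = best[i][:]
        # steady-state test: no significant improvement for `patience` rounds
        stalled = stalled + 1 if prev - objective(gbest) < tol else 0
        if stalled >= patience:
            break
    return gbest

random.seed(0)
sphere = lambda x: sum(xi * xi for xi in x)
print(brandt_stop(sphere))  # a point near the origin, accepted once the output stalls
```

The sketch makes the forcing aspect visible: the algorithm’s own steady state, not an independent model of the problem, is what certifies the answer.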