In this paper we consider the major development of mathematical analysis during the mid-nineteenth century. On the basis of Jahnke’s (Hist Math 20(3):265–284, 1993) distinction between considering mathematics as an empirical science based on time and space and considering mathematics as a purely conceptual science, we discuss the Swedish nineteenth-century mathematician E.G. Björling’s general view of real- and complex-valued functions. We argue that Björling had a tendency to sometimes consider mathematical objects in a naturalistic way. One example, investigated in the paper, is how Björling interprets Cauchy’s definition of the logarithm function with respect to complex variables. Furthermore, in view of an article written by Björling (Kongl Vetens Akad Förh Stockholm 166–228, 1852), we consider Cauchy’s theorem on power series expansions of complex-valued functions. We investigate Björling’s, Cauchy’s and the Belgian mathematician Lamarle’s different conditions for expanding a complex function of a complex variable in a power series. We argue that one reason why Cauchy’s theorem was controversial could be the ambiguities of fundamental concepts in analysis that existed during the mid-nineteenth century. This problem is demonstrated with examples from Björling, Cauchy and Lamarle.
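For orientation, the theorem at issue can be stated in modern terms (terms the mid-nineteenth-century authors did not yet possess): a function f holomorphic on an open disc about a is the sum of its Taylor series there, with coefficients given by Cauchy's integral formula.

```latex
f(z) = \sum_{n=0}^{\infty} \frac{f^{(n)}(a)}{n!}\,(z-a)^n , \qquad |z-a| < R,
\qquad
f^{(n)}(a) = \frac{n!}{2\pi i} \oint_{|\zeta-a|=r} \frac{f(\zeta)}{(\zeta-a)^{n+1}}\, d\zeta .
```

The mid-nineteenth-century controversy concerned precisely which hypotheses on f (continuity, finiteness, differentiability) license this expansion.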
This fundamental and straightforward text addresses a weakness observed among present-day students, namely a lack of familiarity with formal proof. Beginning with the idea of mathematical proof and the need for it, the associated technical and logical skills are developed with care and then brought to bear on the core material of analysis in such a lucid presentation that the development reads naturally and in a straightforward progression. Retaining the core text, the second edition has additional worked examples, which users have indicated a need for, together with more emphasis on how analysis can be used to estimate the accuracy of approximations to quantities of interest that arise as analytical limits.
Interest in the computational aspects of modeling has been steadily growing in philosophy of science. This paper aims to advance the discussion by articulating the way in which modeling and computational errors are related and by explaining the significance of error management strategies for the rational reconstruction of scientific practice. To this end, we first characterize the role and nature of modeling error in relation to a recipe for model construction known as Euler’s recipe. We then describe a general model that allows us to assess the quality of numerical solutions in terms of measures of computational errors that are completely interpretable in terms of modeling error. Finally, we emphasize that this type of error analysis involves forms of perturbation analysis that go beyond the basic model-theoretical and statistical/probabilistic tools typically used to characterize the scientific method; this demands that we revise and complement our reconstructive toolbox in a way that can affect our normative image of science.
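As a minimal sketch of what a measure of computational error looks like (a generic textbook example, not the model discussed in the paper), consider forward Euler applied to y' = -y and its global error against the exact solution:

```python
import numpy as np

# Generic illustration (not the paper's model): integrate y' = -y, y(0) = 1
# with forward Euler and measure the computational error against the exact
# solution y(t) = exp(-t).  Halving the step size roughly halves the maximal
# error, the signature of a first-order method.
def euler(f, y0, t0, t1, h):
    n = int(round((t1 - t0) / h))
    ts = np.linspace(t0, t1, n + 1)
    ys = np.empty(n + 1)
    ys[0] = y0
    for i in range(n):
        ys[i + 1] = ys[i] + h * f(ts[i], ys[i])
    return ts, ys

for h in (0.1, 0.05, 0.025):
    ts, ys = euler(lambda t, y: -y, 1.0, 0.0, 2.0, h)
    err = np.max(np.abs(ys - np.exp(-ts)))
    print(f"h = {h:.3f}   max computational error = {err:.2e}")
```

Here the error is measured against a known exact solution; in realistic settings one works instead with residual-style estimates, which is where the interpretability question raised in the paper becomes pressing.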
George Boole collected ideas for the improvement of his Mathematical analysis of logic (1847) on interleaved copies of that work. Some of the notes on the interleaves are merely minor changes in explanation. Others amount to considerable extension of method in his mathematical approach to logic. In particular, he developed his technique in solving simultaneous elective equations and handling hypotheticals and elective functions. These notes and extensions provided a source for his later book Laws of thought (1854).
A remarkable development in twentieth-century mathematics is smooth infinitesimal analysis ('SIA'), introducing nilsquare and nilpotent infinitesimals, recovering the bulk of scientifically applicable classical analysis ('CA') without resort to the method of limits. Formally, however, unlike Robinsonian 'nonstandard analysis', SIA conflicts with CA, deriving, e.g., 'not every quantity is either = 0 or not = 0.' Internally, consistency is maintained by using intuitionistic logic (without the law of excluded middle). This paper examines problems of interpretation resulting from this 'change of logic', arguing that standard arguments based on 'smoothness' requirements are question-begging. Instead, it is suggested that recent philosophical work on the logic of vagueness is relevant, especially in the context of a Hilbertian structuralist view of mathematical axioms (as implicitly defining structures of interest). The relevance of both topos models for SIA and modal-structuralism as applied to this theory is clarified, sustaining this remarkable instance of mathematical pluralism.
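To convey the flavor of SIA, here is a minimal sketch of the Kock-Lawvere axiom governing nilsquare infinitesimals (a standard formulation, not a quotation from the paper): on Δ = {ε ∈ R : ε² = 0}, every map is affine with a unique slope.

```latex
\Delta = \{ \varepsilon \in R : \varepsilon^2 = 0 \}, \qquad
\forall g\colon \Delta \to R \;\; \exists!\, b \in R \;\;
\forall \varepsilon \in \Delta :\; g(\varepsilon) = g(0) + b\,\varepsilon .
```

Derivatives then come out exactly, without limits: (x+ε)² = x² + 2xε since ε² = 0, so the unique slope is 2x and (x²)' = 2x. Excluded middle must fail, for with it one could define a map on Δ by cases (0 at 0, 1 elsewhere) that violates the axiom.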
A two-strain HIV/AIDS model with treatment, which allows AIDS patients with the sensitive HIV strain to undergo amelioration, is presented as a system of non-linear ordinary differential equations. The disease-free equilibrium is shown to be globally asymptotically stable when the associated epidemic threshold, known as the basic reproduction number for the model, is less than unity. The centre manifold theory is used to show that the sensitive-strain-only and resistant-strain-only endemic equilibria are locally asymptotically stable when the associated reproduction numbers are greater than unity. Qualitative analysis of the model, including positivity, boundedness and persistence of solutions, is presented. The model is numerically analysed to assess the effects of treatment with amelioration on the dynamics of a two-strain HIV/AIDS model. Numerical simulations of the model show that the two strains co-exist whenever the reproduction numbers exceed unity. Further, treatment with amelioration may result in an increase in the total number of infective individuals (asymptomatic) but results in a decrease in the number of AIDS patients. Further, analysis of the reproduction numbers shows that antiretroviral resistance increases with increased antiretroviral use.
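The threshold behaviour described above can be illustrated with a deliberately reduced two-strain sketch; the rates below are hypothetical and the system omits the treatment and amelioration classes of the actual model:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical two-strain SI sketch (not the authors' system): S susceptible,
# I1 infected with the sensitive strain, I2 with the resistant strain.
# beta_i are transmission rates, mu the per-capita removal rate; in this toy
# model the basic reproduction number of strain i is R_i = beta_i / mu.
beta1, beta2, mu = 0.30, 0.25, 0.10   # assumed illustrative values

def rhs(t, y):
    S, I1, I2 = y
    N = S + I1 + I2
    dS  = mu * N - (beta1 * I1 + beta2 * I2) * S / N - mu * S
    dI1 = beta1 * S * I1 / N - mu * I1
    dI2 = beta2 * S * I2 / N - mu * I2
    return [dS, dI1, dI2]

sol = solve_ivp(rhs, (0, 400), [0.98, 0.01, 0.01])
print("R1 =", beta1 / mu, " R2 =", beta2 / mu)
print("final state (S, I1, I2):", sol.y[:, -1].round(4))
```

In this stripped-down system the strain with the larger R_i eventually excludes the other; it is mechanisms like the amelioration pathway studied in the paper that permit the coexistence reported above.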
We give an overview of recent results in ordinal analysis. To this end, we discuss the different frameworks used in mathematical proof theory, namely "subsystems of analysis" including "reverse mathematics", "Kripke-Platek set theory", "explicit mathematics", "theories of inductive definitions", "constructive set theory", and "Martin-Löf's type theory".
Mathematical proofs generally allow for various levels of detail and conciseness, such that they can be adapted for a particular audience or purpose. Using automated reasoning approaches for teaching proof construction in mathematics presupposes that the step size of proofs in such a system is appropriate within the teaching context. This work proposes a framework that supports the granularity analysis of mathematical proofs, to be used in the automated assessment of students' proof attempts and for the presentation of hints and solutions at a suitable pace. Models for granularity are represented by classifiers, which can be generated by hand or inferred from a corpus of sample judgments via machine-learning techniques. This latter procedure is studied by modeling granularity judgments from four experts. The results provide support for the granularity of assertion-level proofs but also illustrate a degree of subjectivity in assessing step size.
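As a toy illustration of the machine-learning route (the features and labels below are invented for the sketch, not taken from the paper's corpus), a granularity model can be a classifier trained on expert judgments of individual proof steps:

```python
from sklearn.tree import DecisionTreeClassifier

# Hypothetical feature vectors for proof steps: (number of underlying
# assertion-level inferences, number of newly used hypotheses, 1 if the
# applied rule is 'familiar' else 0).  An expert labels each step size.
X = [
    [1, 0, 1], [1, 1, 1], [2, 1, 1],   # small, familiar steps
    [4, 2, 0], [5, 3, 0], [6, 2, 0],   # large, unfamiliar jumps
    [1, 0, 0], [2, 0, 1],
]
y = ["appropriate", "appropriate", "appropriate",
     "too-big", "too-big", "too-big",
     "appropriate", "appropriate"]

clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(clf.predict([[3, 1, 0]]))   # classify an unseen step
```

Training one such classifier per expert is one way to make the reported inter-expert subjectivity in step-size judgments visible.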
The unified theory of dose and effect, as indicated by the median-effect equation for single and multiple entities and for first- and higher-order kinetics/dynamics, was established by T.C. Chou and is based on the physical/chemical principle of the mass-action law (J. Theor. Biol. 59: 253-276, 1976) (質量作用中效定理) and (Pharmacological Rev. 58: 621-681, 2006) (普世中效指數定理). The theory was developed by the principle of mathematical induction and deduction (數學演繹歸納法). Rearrangements of the median-effect equation lead to the Michaelis-Menten, Hill, Scatchard, and Henderson-Hasselbalch equations. The “median” serves as the universal reference point and the “common link” for the relationship of all entities and is also the “harmonic mean” of kinetic dissociation constants. Over 300 mechanism-specific equations have been derived and published using the mathematical induction-deduction process. These equations can be deduced into several general equations, including the median-mediated whole/part equation, combination index theorem, isobologram equation, and polygonogram. It is proven that “dose” and “effect” are interchangeable; thus, “substance” and “function” are interchangeable, which leads to “the unity theory” (劑效、心物、知行一元論) in quantitative mathematical philosophy (數學的定量哲學) in a functional context. Therefore, a general theory centered on the “median” and based on equilibrium dynamics has evolved. In other words: [「中」的宇宙觀： 以「中」爲基凖的動力學生態平衡]. Based on the median-effect equation of the mass-action law, the fundamental claim is that we can draw “a specific curve” from only two data points, if they are determined accurately. This claim has far-reaching consequences, since it defies the generally held belief that two points can determine only a straight line. Remarkably, the unity theory (一元論) provides scientific/mathematical interpretation, in equations and in graphics, of ancient Chinese philosophy, including Fu-Si Ba Gua (伏羲八卦), Dao’s Harmony (和諧), the Confucian doctrine of the mean (儒家中庸之道), and Chou Dun-Yi’s (周敦頤, 1017-1073) From Wu-ji to Tai-ji and Taiji Tu Sho (無極而太極及太極圖說). The modern topological analysis for trinity yields an exact correspondence to the Ba-Gua, which was introduced over 4,000 years ago. Furthermore, the median-centered algorithm promotes modern ecological content (生態學) in the equilibrial dynamic state of harmony. It is concluded that Western science and Eastern philosophy are directly linked and complementary to each other. Since the truth in mathematical quantitative philosophy (數學的定量哲學) has no boundaries, East and West philosophies can flourish together for the common goal and ideal in science and in humanity (世界大同).
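For concreteness, the median-effect equation itself (in the notation of Chou's publications; recalled here, not newly derived) is

```latex
\frac{f_a}{f_u} = \left( \frac{D}{D_m} \right)^{m}, \qquad f_u = 1 - f_a ,
```

where D is the dose, f_a and f_u the fractions affected and unaffected, D_m the median-effect dose, and m the sigmoidicity coefficient. With m = 1, f_a = v/V_max and D_m = K_m, the rearrangement f_a = 1/[1 + (D_m/D)^m] returns the Michaelis-Menten form v = V_max D/(K_m + D); the two-drug combination index mentioned above is CI = D_1/(D_x)_1 + D_2/(D_x)_2. The two-point claim also follows: since log(f_a/f_u) = m log D - m log D_m is linear in log D, two accurately determined data points fix m and D_m, and hence the whole dose-effect curve.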
In Pāṇini’s grammar of Sanskrit one finds the Śivasūtras, a table which defines the natural classes of phonological segments in Sanskrit by intervals. We present a formal argument which shows that, using his representation method, Pāṇini’s way of ordering the phonological segments to represent the natural classes is optimal. The argument is based on a strictly set-theoretical point of view depending only on the set of natural classes and does not explicitly take into account the phonological features of the segments, which are, however, implicitly given in the way a language clusters its phonological inventory. The key idea is to link the graph of the Hasse diagram of the set of natural classes closed under intersection to Śivasūtra-style representations of the classes. Moreover, the argument is so general that it allows one to decide for each set of sets whether it can be represented with Pāṇini’s method. Actually, Pāṇini had to modify the set of natural classes to define it by the Śivasūtras (the segment h plays a special role). We show that this modification was necessary and, in fact, the best possible modification. We discuss how every set of classes can be modified in such a way that it can be defined in a Śivasūtra-style representation.
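The construction at the heart of the argument (closing the class system under intersection and reading off its Hasse diagram) can be sketched on a toy inventory; the classes below are invented for illustration, not the Sanskrit data:

```python
from itertools import combinations

# Toy illustration: close a family of 'natural classes' under intersection,
# then list the covering relation of the resulting Hasse diagram under
# set inclusion.
classes = [frozenset("aiu"), frozenset("iue"), frozenset("ueo")]

def intersection_closure(sets):
    closed = set(sets)
    changed = True
    while changed:
        changed = False
        for a, b in combinations(list(closed), 2):
            c = a & b
            if c and c not in closed:
                closed.add(c)
                changed = True
    return closed

closed = intersection_closure(classes)
# x covers y iff y is a proper subset of x with no class strictly between
edges = [(y, x) for x in closed for y in closed
         if y < x and not any(y < z < x for z in closed)]
for y, x in sorted(edges, key=lambda e: (len(e[0]), sorted(e[0]))):
    print("".join(sorted(y)), "<", "".join(sorted(x)))
```

Whether a given family of classes admits a Śivasūtra-style interval representation is then a property of this diagram, which is what makes the decidability result quoted above possible.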
The ancient Greek method of analysis has a rational reconstruction in the form of the tableau method of logical proof. This reconstruction shows that the format of analysis was largely determined by the requirement that proofs could be formulated by reference to geometrical figures. In problematic analysis, it has to be assumed not only that the theorem to be proved is true, but also that it is known. This means using epistemic logic, where instantiations of variables are typically allowed only with respect to known objects. This requirement explains the preoccupation of Greek geometers with questions as to which geometrical objects are 'given', that is, known or 'data', as in the title of Euclid's eponymous book. In problematic analysis, constructions had to rely on objects that are known only hypothetically. This seems strange unless one relies on a robust idea of 'unknown' objects in the same sense as the unknowns of algebra. The Greeks did not have such a concept, which made their grasp of the analytic method shaky.
This paper deals with meta-statistical questions concerning frequentist statistics. In Sections 2 to 4 I analyse the dispute between Fisher and Neyman on the so-called logic of statistical inference, a polemic that has accompanied the development of mathematical statistics. My conclusion is that, whenever mathematical statistics makes it possible to draw inferences, it only uses deductive reasoning. Therefore I reject Fisher's inductive approach to statistical estimation theory and adhere to Neyman's deductive one. On the other hand, I assert that the Neyman-Pearson testing theory, as well as Fisher's tests of significance, properly belongs to decision theory, not to logic, neither deductive nor inductive. I thus also disagree with Costantini's view of Fisher's testing model as a theory of hypothetico-deductive inferences. In Section 5 I reject Hacking1's evidentialist criticisms of the Neyman-Pearson theory of statistics (NPT), as well as Hacking2's interpretation of NPT as a theory of probable inference. In both cases Hacking misses the point. I conclude by claiming that Mayo's conception of the Neyman-Pearson testing theory, as a model of learning from experience, does not afford any advantages over Neyman's behavioristic model.
In this commentary on Napoletani et al. (Found Sci 16:1–20, 2011), we argue that the approach the authors adopt suggests that neural nets are mathematical techniques rather than models of cognitive processing, that the general approach dates as far back as Ptolemy, and that applied mathematics is more than simply applying results from pure mathematics.
Estimation of the distribution of asynchronous cells in the cell cycle rests on two hypotheses: (i) the cells are supposed to be distributed into three groups: cells with a 2c DNA content (G0/1 phase), cells with a 4c DNA content (G2+M phase) and cells with a DNA content ranging from 2c to 4c (S phase); (ii) there is a linear relationship between the amount of fluorescence emitted by the fluorescent probe which reveals the DNA and the DNA content. According to these hypotheses, the cell cycle can be represented by the following equation.
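The abstract breaks off before the equation itself; the second hypothesis, however, amounts to the schematic relation (notation ours, for illustration only)

```latex
F(q) = \alpha\, q + \beta , \qquad q \in [\,2c,\; 4c\,],
```

where F is the measured fluorescence, q the DNA content of the cell (q = 2c in G0/1, q = 4c in G2+M, intermediate values in S phase), and α, β calibration constants.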
The work of Bertrand Russell had a decisive influence on the emergence of analytic philosophy, and on its subsequent development. The essays collected in this volume, by one of the leading authorities on Russell's philosophy, all aim at recapturing and articulating aspects of Russell's philosophical vision during his most influential and important period, the two decades following his break with Idealism in 1899. One theme of the collection concerns Russell's views about propositions and their analysis, and the relation of those ideas to his rejection of Idealism. Another theme is the development of Russell's logicism, culminating in Whitehead's and Russell's Principia Mathematica, and Hylton offers a revealing view of the conception of logic which underlies it. Here again there is an emphasis on Russell's argument against Idealism, on the idea that his logicism was a crucial part of that argument. A further focus of the volume is Russell's views about functions and propositional functions. This theme is part of a contrast that Hylton draws between Russell's general philosophical position and that of Frege; in particular, there is a close parallel with the quite different views that the two philosophers held about the nature of philosophical analysis. Hylton also sheds valuable light on the much-disputed idea of an operation, which Wittgenstein advances in the Tractatus Logico-Philosophicus.
In this paper we study a new approach to classifying mathematical theorems according to their computational content. Basically, we are asking the question: which theorems can be continuously or computably transferred into each other? For this purpose theorems are considered via their realizers, which are operations with certain input and output data. The technical tool to express continuous or computable relations between such operations is Weihrauch reducibility and the partially ordered degree structure induced by it. We have identified certain choice principles such as co-finite choice, discrete choice, interval choice, compact choice and closed choice, which are cornerstones among Weihrauch degrees, and it turns out that certain core theorems in analysis can be classified naturally in this structure. In particular, we study theorems such as the Intermediate Value Theorem, the Baire Category Theorem, the Banach Inverse Mapping Theorem, the Closed Graph Theorem and the Uniform Boundedness Theorem. We also explore how existing classifications of the Hahn-Banach Theorem and Weak Kőnig's Lemma fit into this picture. Well-known omniscience principles from constructive mathematics such as LPO and LLPO can also naturally be considered as Weihrauch degrees, and they play an important role in our classification. Based on this we compare the results of our classification with existing classifications in constructive and reverse mathematics, and we claim that in a certain sense our classification is finer and sheds some new light on the computational content of the respective theorems. Our classification scheme does not require any particular logical framework or axiomatic setting, but it can be carried out in the framework of classical mathematics using tools of topology, computability theory and computable analysis. We develop a number of separation techniques based on a new parallelization principle, on certain invariance properties of Weihrauch reducibility, on the Low Basis Theorem of Jockusch and Soare, and on the Baire Category Theorem. Finally, we present a number of metatheorems that allow us to derive upper bounds for the classification of the Weihrauch degree of many theorems, and we discuss the Brouwer Fixed Point Theorem as an example.
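For readers unfamiliar with the reducibility, the standard definition (recalled here in the usual notation, not quoted from the paper) says that f is Weihrauch reducible to g when realizers of g can be uniformly translated into realizers of f:

```latex
f \le_{\mathrm{W}} g \;:\Longleftrightarrow\;
\exists\, \text{computable } K, H \;\;
\forall G \,\bigl( G \vdash g \;\Rightarrow\; H\langle \mathrm{id},\, G \circ K \rangle \vdash f \bigr),
```

where G ⊢ g means that G realizes g, K pre-processes f's input into an input for g, and H computes f's output from the original input together with G's answer.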
The work of Bertrand Russell had a decisive influence on the emergence of analytic philosophy, and on its subsequent development. The prize-winning Russell scholar Peter Hylton presents here some of his most celebrated essays from the last two decades, all of which strive to recapture and articulate Russell's monumental vision. Relating his work to that of other philosophers, particularly Frege and Wittgenstein, and featuring a previously unpublished essay and a helpful new introduction, the volume will be essential for anyone engaged with the history of twentieth-century ideas.
This paper traces Auguste Comte’s attempts to get clear about the concept of mathematical analysis at various stages in his intellectual development. Comte was especially concerned with distinguishing a method of analysis for the resolution of complex problems from analysis in the sense of a method of drawing inferences. Geometrical analysis serves as his model for the former. In his attempt to get clear about this notion, he discovers an historical succession of different methods, all of which may be labeled “analytic.” In modern terms, Comte reveals how each of these methods of analysis characterizes a research program in mathematics, even showing us how more powerful methods came to supplant less powerful methods of analysis.
A Christian approach to scholarship, directed by the central biblical motive of creation, fall and redemption, and guided by the theoretical idea that God subjected all of creation to His Law-Word, delimiting and determining the cohering diversity we experience within reality, in principle safeguards those in the grip of this ultimate commitment and theoretical orientation from absolutizing or deifying anything within creation. In this article my overall approach is focused on the one-sided legacy of mathematics, starting with Pythagorean arithmeticism (“everything is number”), continuing with the geometrization of mathematics after the discovery of irrational numbers, and once again, during the nineteenth century, returning to an arithmeticistic position. The third option, never explored during the history of mathematics, guides our analysis: instead of reducing space to number or number to space, it is argued that both the uniqueness of these two aspects and their mutual coherence ought to direct mathematics. The presence of different schools of thought is highlighted, and then the argument proceeds by distinguishing numerical and spatial facts, while accounting for the strict correlation of operations on the law side of the numerical aspect and their correlated numerical subjects (numbers). Discussing the examples of 2 + 2 = 4 and the definition of a straight line as the shortest distance between two points provides the background for a brief sketch of the third alternative proposed (inter alia against the background of an assessment of infinity and continuity and the vicious circles present in contemporary mathematical arithmeticistic claims).
The study that George Lakoff and Rafael Núñez call "idea analysis" and begin in their recent book Where mathematics comes from is intended to dissect mathematical concepts into their metaphorical parts, where metaphor is used in the cognitive-science sense promoted by Lakoff and Mark Johnson in Metaphors we live by and subsequent works by each of them and together. Lakoff and Núñez's analysis of the (modern) algebraic concept of group is based on the attribution to contemporary mathematics of what will be widely recognizable by their name for it, the folk theory of essences. I argue that this philosophical basis for their analysis is spurious and supply an alternative analysis of the same concept within their "metaphorical" paradigm but without essences. This analysis, which I hope is more viable than theirs, is intended to support the general applicability of the paradigm by freeing it from outmoded philosophical baggage.
The foundation of Mathematics is both a logico-formal issue and an epistemological one. By the first, we mean the explicitation and analysis of formal proof principles, which, largely a posteriori, ground proof on general deduction rules and schemata. By the second, we mean the investigation of the constitutive genesis of concepts and structures, the aim of this paper. This “genealogy of concepts”, so dear to Riemann, Poincaré and Enriques among others, is necessary both in order to enrich the foundational analysis with an often disregarded aspect (the cognitive and historical constitution of mathematical structures) and because of the provable incompleteness of proof principles also in the analysis of deduction. For the purposes of our investigation, we will hint here at a philosophical frame as well as at some recent experimental studies on numerical cognition that support our claim on the cognitive origin and the constitutive role of mathematical intuition.
Chihara here develops a mathematical system in which there are no existence assertions but only assertions of the constructibility of certain sorts of things. He utilizes this system in the analysis of the nature of mathematics, and discusses many recent works in the philosophy of mathematics from the viewpoint of the constructibility theory developed. This innovative analysis will appeal to mathematicians and philosophers of logic, mathematics, and science.
We now know of a number of ways of developing real analysis on a basis of abstraction principles and second-order logic. One, outlined by Shapiro in his contribution to this volume, mimics Dedekind in identifying the reals with cuts in the series of rationals under their natural order. The result is an essentially structuralist conception of the reals. An earlier approach, developed by Hale in his "Reals by Abstraction" program, differs by placing additional emphasis upon what I here term Frege's Constraint: that a satisfactory foundation for any branch of mathematics should somehow so explain its basic concepts that their applications are immediate. This paper is concerned with the meaning of and motivation for this constraint. Structuralism has to represent the application of a mathematical theory as always posterior to the understanding of it, turning upon the appreciation of structural affinities between the structure it concerns and a domain to which it is to be applied. There is, therefore, a case that Frege's Constraint has bite whenever there is a standing body of informal mathematical knowledge grounded in direct reflection upon sample, or schematic, applications of the concepts of the theory in question. It is argued that this condition is satisfied by simple arithmetic and geometry, but that in view of the gap between its basic concepts (of continuity and of the nature of the distinctions among the individual reals) and their empirical applications, it is doubtful that Frege's Constraint should be imposed on a neo-Fregean construction of analysis.
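In outline, the Dedekindian identification mentioned here takes a real to be a cut in the rationals; one standard rendering (a textbook formulation, not Shapiro's exact clauses) is

```latex
r \subseteq \mathbb{Q} \text{ is a cut} \;:\Longleftrightarrow\;
\emptyset \ne r \ne \mathbb{Q}, \quad
\forall p, q \,\bigl( q \in r \wedge p < q \Rightarrow p \in r \bigr), \quad
r \text{ has no greatest element},
```

with the reals ordered by inclusion and each rational q embedded as the cut {p ∈ ℚ : p < q}. Frege's Constraint asks whether such a construction also explains, rather than merely permits, the empirical applications of the reals as measures of quantity.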
The paper discusses how systems biology is working toward complex accounts that integrate explanation in terms of mechanisms and explanation by mathematical models, which some philosophers have viewed as rival models of explanation. Systems biology is an integrative approach, and it strongly relies on mathematical modeling. Philosophical accounts of mechanisms capture integrative explanations in the sense of multilevel and multifield explanations, yet accounts of mechanistic explanation (as the analysis of a whole in terms of its structural parts and their qualitative interactions) have failed to address how a mathematical model could contribute to such explanations. I discuss how mathematical equations can be explanatorily relevant. Several cases from systems biology are discussed to illustrate the interplay between mechanistic research and mathematical modeling, and I point to questions about qualitative phenomena (rather than the explanation of quantitative details), where quantitative models are still indispensable to the explanation. Systems biology shows that a broader philosophical conception of mechanisms is needed, which takes into account functional-dynamical aspects, interaction in complex networks with feedback loops, system-wide functional properties such as distributed functionality and robustness, and a mechanism’s ability to respond to perturbations (beyond its actual operation). I offer general conclusions for philosophical accounts of explanation.
In this paper it is argued that the fundamental difference between the formal and the informal position in the philosophy of mathematics results from the collision of an object-centric and a process-centric perspective on mathematics. This collision can be overcome by means of dialectical analysis, which shows that both perspectives essentially depend on each other. This is illustrated by the example of mathematical proof and its formal and informal nature. A short overview of the employed materialist dialectical approach is given, which rationalises mathematical development as a process of model production. It aims at placing more emphasis on the application aspects of mathematical results. Moreover, it is shown how such production realises subjective capacities as well as objective conditions, where the latter are mediated by mathematical formalism. The approach is further sustained by Polanyi’s theory of problem solving and Stegmaier’s philosophy of orientation. In particular, the tool and application perspective illuminates which role computer-based proofs can play in mathematics.
It is shown how the historiographic purport of Lakatosian methodology of mathematics is structured on the theme of analysis and synthesis. This theme is explored and extended to the revolutionary phase around 1800. On the basis of this historical investigation it is argued that major innovations, crucial to the appraisal of mathematical progress, defy reconstruction as irreducibly rational processes and should instead essentially be understood as processes of social-cognitive interaction. A model of conceptual change is developed whose essential ingredients are the variability of rational responses to new intellectual and practical challenges arising in the cultural environment of mathematics, and the shifting selective pressure of society. The resulting view of mathematical development is compared with Kuhn's theory of scientific paradigms in the light of some personal communications.