This third edition, now available in paperback, is a follow-up to the author's classic Boolean-Valued Models and Independence Proofs in Set Theory. It provides an exposition of some of the most important results in set theory obtained in the 20th century: the independence of the continuum hypothesis and the axiom of choice.
The usual meaning of the word continuous is “unbroken” or “uninterrupted”: thus a continuous entity—a continuum—has no “gaps.” We commonly suppose that space and time are continuous, and certain philosophers have maintained that all natural processes occur continuously: witness, for example, Leibniz's famous apothegm natura non facit saltus—“nature makes no leaps.” In mathematics the word is used in the same general sense, but has had to be furnished with increasingly precise definitions. So, for instance, in the later 18th century continuity of a function was taken to mean that infinitesimal changes in the value of the argument induced infinitesimal changes in the value of the function. With the abandonment of infinitesimals in the 19th century this definition came to be replaced by one employing the more precise concept of limit.
This book explores and articulates the concepts of the continuous and the infinitesimal from two points of view: the philosophical and the mathematical. The first section covers the history of these ideas in philosophy. Chapter one, entitled ‘The continuous and the discrete in Ancient Greece, the Orient and the European Middle Ages,’ reviews the work of Plato, Aristotle, Epicurus, and other Ancient Greeks; the elements of early Chinese, Indian and Islamic thought; and early Europeans including Henry of Harclay, Nicholas of Autrecourt, Duns Scotus, William of Ockham, Thomas Bradwardine and Nicole Oresme. The second chapter of the book covers European thinkers of the sixteenth and seventeenth centuries: Galileo, Newton, Leibniz, Descartes, Arnauld, Fermat, and more. Chapter three, ‘The age of continuity,’ discusses eighteenth-century mathematicians including Euler and Carnot, and philosophers, among them Hume, Kant and Hegel. Examining the nineteenth and early twentieth centuries, the fourth chapter describes the reduction of the continuous to the discrete, citing the contributions of Bolzano, Cauchy and Riemann. Part one of the book concludes with a chapter on divergent conceptions of the continuum, covering the work of nineteenth- and early twentieth-century philosophers and mathematicians, including Veronese, Poincaré, Brouwer, and Weyl. Part two of this book covers contemporary mathematics, discussing topology and manifolds, categories and functors, Grothendieck topologies, sheaves, and elementary topoi. Among the theories presented in detail are non-standard analysis, constructive and intuitionist analysis, and smooth infinitesimal analysis/synthetic differential geometry. No other book so thoroughly covers the history and development of the concepts of the continuous and the infinitesimal.
The principle of set theory known as the Axiom of Choice has been hailed as “probably the most interesting and, in spite of its late appearance, the most discussed axiom of mathematics, second only to Euclid's axiom of parallels which was introduced more than two thousand years ago” (Fraenkel, Bar-Hillel & Levy 1973, §II.4). The fulsomeness of this description might lead those unfamiliar with the axiom to expect it to be as startling as, say, the Principle of the Constancy of the Velocity of Light or the Heisenberg Uncertainty Principle. But in fact the Axiom of Choice as it is usually stated appears humdrum, even self-evident. For it amounts to nothing more than the claim that, given any collection of mutually disjoint nonempty sets, it is possible to assemble a new set — a transversal or choice set — containing exactly one element from each member of the given collection. Nevertheless, this seemingly innocuous principle has far-reaching mathematical consequences — many indispensable, some startling — and has come to figure prominently in discussions on the foundations of mathematics. It (or its equivalents) has been employed in countless mathematical papers, and a number of monographs have been exclusively devoted to it.
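The claim in the abstract can be made concrete for finite collections, where no axiom is needed because an explicit choice rule exists. A minimal sketch (illustrative only; the function name and the use of `min` as the choice rule are my own, not drawn from the text):

```python
def transversal(collection):
    """Assemble a choice set: exactly one element from each set in a
    collection of mutually disjoint nonempty sets. For finite collections
    an explicit rule (here, taking the least element) suffices; the Axiom
    of Choice is only needed when no such uniform rule can be given."""
    choice_set = set()
    for s in collection:
        choice_set.add(min(s))  # pick the least element of each member
    return choice_set

family = [{3, 1, 2}, {7, 5}, {9}]
print(transversal(family))  # one element drawn from each of the three sets
```

Since the sets are mutually disjoint, the result meets each member in exactly one element; the interest of the axiom lies in infinite collections where no rule like `min` is available.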
Hermann Weyl, one of the twentieth century's greatest mathematicians, was unusual in possessing acute literary and philosophical sensibilities—sensibilities to which he gave full expression in his writings. In this paper I use quotations from these writings to provide a sketch of Weyl's philosophical orientation, following which I attempt to elucidate his views on the mathematical continuum, bringing out the central role he assigned to intuition.
The centrality of the whole/part relation in mathematics is demonstrated through the presentation and analysis of examples from algebra, geometry, functional analysis, logic, topology and category theory.
In their well-known paper, Kochen and Specker (1967) introduce the concept of partial Boolean algebra (pBa) and show that certain (finitely generated) partial Boolean algebras arising in quantum theory fail to possess morphisms to any Boolean algebra (we call such pBa's intractable in the sequel). In this note we begin by discussing partial…
Logical Options introduces the extensions and alternatives to classical logic which are most discussed in the philosophical literature: many-sorted logic, second-order logic, modal logics, intuitionistic logic, three-valued logic, fuzzy logic, and free logic. Each logic is introduced with a brief description of some aspect of its philosophical significance, and wherever possible semantic and proof methods are employed to facilitate comparison of the various systems. The book is designed to be useful for philosophy students and professional philosophers who have learned some classical first-order logic and would like to learn about other logics important to their philosophical work.
Traditionally, expressions in formal systems have been regarded as signifying finite inscriptions which are—at least in principle—capable of actually being written out in primitive notation. However, the fact that (first-order) formulas may be identified with natural numbers (via "Gödel numbering") and hence with finite sets makes it no longer necessary to regard formulas as inscriptions, and suggests the possibility of fashioning "languages" some of whose formulas would be naturally identified as infinite sets. A "language" of this kind is called an infinitary language: in this article I discuss those infinitary languages which can be obtained in a straightforward manner from first-order languages by allowing conjunctions, disjunctions and, possibly, quantifier sequences, to be of infinite length. In the course of the discussion it will be seen that, while the expressive power of such languages far exceeds that of their finitary (first-order) counterparts, very few of them possess the "attractive" features (e.g., compactness and completeness) of the latter. Accordingly, the infinitary languages that do in fact possess these features merit special attention.
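A standard illustration of the expressive gain described above (my example, not drawn from the article): in the infinitary language $L_{\omega_1\omega}$, which allows countable conjunctions and disjunctions, a single sentence characterizes the standard natural numbers, something no finitary first-order sentence can do, by compactness:

```latex
% Every element is the value of some numeral S^n(0); the countable
% disjunction over all n rules out nonstandard elements.
\forall x \,\bigvee_{n<\omega}\, x = S^{n}0
```

Here $S^{n}0$ abbreviates the numeral obtained by applying the successor symbol $S$ to $0$ exactly $n$ times; the disjunction has one disjunct for each natural number $n$.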
This essay is an attempt to sketch the evolution of type theory from its beginnings early in the last century to the present day. Central to the development of the type concept has been its close relationship with set theory to begin with and later its even more intimate relationship with category theory. Since it is effectively impossible to describe these relationships (especially in regard to the latter) with any pretensions to completeness within the space of a comparatively short article, I have elected to offer detailed technical presentations of just a few important instances.
This paper is concerned with Wittgenstein's early doctrine of the independence of elementary propositions. Using the notion of a free generator for a logical calculus–a concept we claim was anticipated by Wittgenstein–we show precisely why certain difficulties associated with his doctrine cannot be overcome. We then show that Russell's version of logical atomism–with independent particulars instead of elementary propositions–avoids the same difficulties.
We investigate Hilbert's ε-calculus in the context of intuitionistic type theories, that is, within certain systems of intuitionistic higher-order logic. We determine the additional deductive strength conferred on an intuitionistic type theory by the adjunction of closed ε-terms. We extend the usual topos semantics for type theories to the ε-operator and prove a completeness theorem. The paper also contains a discussion of the concept of “partially defined” ε-term. MSC: 03B15, 03B20, 03G30.
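For orientation, the standard background (not specific to this paper's results): Hilbert's ε-operator attaches to each formula $\varphi(x)$ a term $\varepsilon_x\,\varphi$, read as a chosen witness for $\varphi$ if one exists, governed by the critical axiom scheme, from which the existential quantifier becomes definable:

```latex
% Critical axiom: if any term t satisfies phi, so does the epsilon-term.
\varphi(t) \;\rightarrow\; \varphi(\varepsilon_x\,\varphi(x))
% Resulting definability of the existential quantifier:
\exists x\,\varphi(x) \;\leftrightarrow\; \varphi(\varepsilon_x\,\varphi(x))
```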
is a presentation of mathematics in terms of the fundamental concepts of transformation, and composition of transformations. While the importance of these concepts had long been recognized in algebra (for example, by Galois through the idea of a group of permutations) and in geometry (for example, by Klein in his Erlanger Programm), the truly universal role they play in mathematics did not really begin to be appreciated until the rise of abstract algebra in the 1930s. In abstract algebra the idea of transformation of structure (homomorphism) was central from the beginning, and it soon became apparent to algebraists that its most important concepts and constructions were in fact formulable in terms of that idea alone. Thus emerged the view that the essence of a mathematical structure is to be sought not in its internal constitution, but rather in the nature of its relationships with other structures of the same kind, as manifested through the network of transformations. This idea has achieved its fullest expression in category theory, an axiomatic framework within which the notions of transformation (as morphism or arrow) and composition (and also structure, as object) are fundamental, that is, are not defined in terms of anything else.
then E has a subset which is the domain of a model of Peano's axioms for the natural numbers. (This result is proved explicitly, using classical reasoning, in section 3 of .) My purpose in this note is to strengthen this result in two directions: first, the premise will be weakened so as to require only that the map ν be defined on the family of (Kuratowski) finite subsets of the set E, and secondly, the argument will be constructive, i.e., will involve no use of the law of excluded middle. To be precise, we will prove, in constructive (or intuitionistic) set theory, the following…
A weak form of intuitionistic set theory WST lacking the axiom of extensionality is introduced. While WST is too weak to support the derivation of the law of excluded middle from the axiom of choice, we show that beefing up WST with moderate extensionality principles or quotient sets enables the derivation to go through.
If we imagine a chessboard with alternate blue and red squares, then this is something in which the individual red and blue areas allow themselves to be distinguished from each other in juxtaposition, and something similar holds also if we imagine each of the squares divided into four smaller squares also alternating between these two colours. If, however, we were to continue with such divisions until we had exceeded the boundary of noticeability for the individual small squares which result, then it would no longer be possible to apprehend the individual red and blue areas in their respective positions. But would we then see nothing at all? Not in the least; rather we would see the whole chessboard as violet, i.e. apprehend it as something that participates simultaneously in red and blue.
ABSTRACT: It is characteristic of a continuum that it be “all of one piece”, in the sense of being inseparable into two (or more) disjoint nonempty parts. By taking “part” to mean open (or closed) subset of the space, one obtains the usual topological concept of connectedness. Thus a space S is defined to be connected if it cannot be partitioned into two disjoint nonempty open (or closed) subsets – or equivalently, given any partition of S into two open (or closed) subsets, one of the members of the partition must be empty. This holds, for example, for the space R of real numbers and for all of its open or closed intervals. Now a truly radical condition results from taking the idea of being “all of one piece” literally, that is, if it is taken to mean inseparability into any disjoint nonempty parts, or subsets, whatsoever. A space S satisfying this condition is called cohesive or indecomposable. While the law of excluded middle of classical logic reduces indecomposable spaces to the trivial empty space and one-point spaces, the use of intuitionistic logic makes it possible not only for nontrivial cohesive spaces to exist, but for every connected space to be cohesive. In this paper I describe the philosophical background to cohesiveness as well as some of the ways in which the idea is modelled in contemporary mathematics.
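The classical notion of connectedness defined in the abstract can be checked mechanically on finite topological spaces. A small sketch (my own illustration; cohesiveness itself, being intrinsically intuitionistic, is not captured by this classical test):

```python
def is_connected(space, opens):
    """A space is connected iff it admits no partition into two disjoint
    nonempty open subsets (equivalently: in any cover of the space by two
    disjoint open sets, one member must be empty)."""
    for u in opens:
        for v in opens:
            if u and v and not (u & v) and (u | v) == space:
                return False
    return True

S = frozenset({0, 1})
# A topology in which {1} alone is not open: connected.
opens_conn = [frozenset(), frozenset({0}), S]
# The discrete topology, where {0} and {1} split the space: disconnected.
opens_disc = [frozenset(), frozenset({0}), frozenset({1}), S]
print(is_connected(S, opens_conn))   # True
print(is_connected(S, opens_disc))   # False
```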
In this paper the view is developed that classes should not be understood as individuals, but, rather, as "classes as many" of individuals. To correlate classes with individuals "labelling" and "colabelling" functions are introduced and sets identified with a certain subdomain of the classes on which the labelling and colabelling functions are mutually inverse. A minimal axiomatization of the resulting system is formulated and some of its extensions are related to various systems of set theory, including nonwellfounded set theories.
It is shown by Parsons that the first-order fragment of Frege's logical system in the Grundgesetze der Arithmetik is consistent. In this note we formulate and prove a stronger version of this result for arbitrary first-order theories. We also show that a natural attempt to further strengthen our result runs afoul of Tarski's theorem on the undefinability of truth.
Full proofs of the Gödel incompleteness theorems are highly intricate affairs. Much of the intricacy lies in the details of setting up and checking the properties of a coding system representing the syntax of an object language (typically, that of arithmetic) within that same language. These details are seldom illuminating and tend to obscure the core of the argument. For this reason a number of efforts have been made to present the essentials of the proofs of Gödel's theorems without getting mired in syntactic or computational details. One of the most important of these efforts was made by Löb in connection with his analysis of sentences asserting their own provability. Löb formulated three conditions (now known as the Hilbert-Bernays-Löb derivability conditions) on the provability predicate in a formal system which are jointly sufficient to yield Gödel's second incompleteness theorem for it. A key role in Löb's analysis is played by (a special case of) what later became known as the diagonalization or fixed point property of formal systems, a property which had already, in essence, been exploited by Gödel in his original proofs of the incompleteness theorems. The fixed point property plays a central role in Lawvere's category-theoretic account of incompleteness phenomena. Incompleteness theorems have also been subjected to intensive investigation within the framework of modal logic. In this formulation the modal operator takes up the role previously played by the provability predicate, and the derivability conditions on the latter are translated into algebraic conditions (the so-called GL, i.e., Gödel–Löb, conditions) on the former. My purpose here is to present a framework for incompleteness phenomena, fully compatible with intuitionistic or constructive principles, in which the idea of a coding system is retained, only in a simple, but very general form, a form wholly free of syntactical notions. As codes we shall take the elements of an arbitrary given nonempty set, possibly, but not necessarily, the set of natural numbers…
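For reference, the three derivability conditions mentioned above, together with the fixed-point property, in their standard form (standard material; the predicate symbol $\mathrm{Pr}$ and corner quotes are the usual conventions, not this paper's notation):

```latex
\begin{aligned}
&\text{(D1)}\;\; \vdash \varphi \;\Longrightarrow\; \vdash \mathrm{Pr}(\ulcorner\varphi\urcorner)\\
&\text{(D2)}\;\; \vdash \mathrm{Pr}(\ulcorner\varphi\rightarrow\psi\urcorner)
      \rightarrow \bigl(\mathrm{Pr}(\ulcorner\varphi\urcorner)
      \rightarrow \mathrm{Pr}(\ulcorner\psi\urcorner)\bigr)\\
&\text{(D3)}\;\; \vdash \mathrm{Pr}(\ulcorner\varphi\urcorner)
      \rightarrow \mathrm{Pr}(\ulcorner\mathrm{Pr}(\ulcorner\varphi\urcorner)\urcorner)\\
&\text{(FP)}\;\; \text{for each formula } \psi(x) \text{ there is a sentence }
      \sigma \text{ with } \vdash \sigma \leftrightarrow \psi(\ulcorner\sigma\urcorner)
\end{aligned}
```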
In this paper I reflect on the nature of mathematical beauty, and examine the connections between mathematics and the arts. I employ Plutarch’s distinction between the intelligible and the sensible, to compare the beauty of mathematics with the beauties of music, poetry and painting. While the beauty of mathematics is almost exclusively intelligible, and the beauties of these arts primarily sensible, it is pointed out that the latter share with mathematics a certain kind of intelligible beauty. The paper also contains reflections on the formal beauty and timelessness of mathematics, beauty as richness flowing from simplicity, form and content in mathematics, and mathematics and fiction. It concludes with some remarks on the question of why mathematical beauty is so little appreciated by non-mathematicians.
Why should one believe that conscious awareness is solely the result of organizational complexity? What is the connection between consciousness and combinatorics: transformation of quantity into quality? The claim that the former is reducible to the latter seems unconvincing—as unlike as chalk and cheese! In his book Penrose is at least attempting to compare like with like: the enigma of consciousness with the progress of physics.
This book is written for those who are in sympathy with its spirit. This spirit is different from the one which informs the vast stream of European and American civilization in which all of us stand. That spirit expresses itself in an onwards movement, in building ever larger and more complicated structures; the other in striving for clarity and perspicuity in no matter what structure. The first tries to grasp the world by way of its periphery—in its variety; the second at its centre—in its essence. And so the first adds one construction to another, moving on and up, as it were, from one thing to the next, while the other remains where it is and what it tries to grasp is always the same.
Some aspects of the theory of Boolean algebras and distributive lattices–in particular, the Stone Representation Theorems and the properties of filters and ideals–are analyzed in a constructive setting.
This Element is an exposition of second- and higher-order logic and type theory. It begins with a presentation of the syntax and semantics of classical second-order logic, pointing up the contrasts with first-order logic. This leads to a discussion of higher-order logic based on the concept of a type. The second section contains an account of the origins and nature of type theory, and its relationship to set theory. Section 3 introduces Local Set Theory, an important form of type theory based on intuitionistic logic. In Section 4 a number of contemporary forms of type theory are described, all of which are based on the so-called 'doctrine of propositions as types'. We conclude with an Appendix in which the semantics for Local Set Theory - based on category theory - is outlined.
One of the most familiar uses of the Russell paradox, or, at least, of the idea underlying it, is in proving Cantor's theorem that the cardinality of any set is strictly less than that of its power set. The other method of proving Cantor's theorem – employed by Cantor himself in showing that the set of real numbers is uncountable – is that of diagonalization. Typically, diagonalization arguments are used to show that function spaces are "large" in a suitable sense. Classically, these two methods are equivalent. But constructively they are not: while the argument for Russell's paradox is perfectly constructive (i.e., employs intuitionistically acceptable principles of logic), the method of diagonalization fails to be so. I describe the ways in which these two methods…
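The diagonal argument mentioned above can be run on a finite example. A small sketch (my illustration, with an arbitrarily chosen map `f`): for any f : A → P(A), the set D = {x ∈ A : x ∉ f(x)} differs from every f(a) at the element a itself, so no such f is surjective.

```python
def diagonal_set(A, f):
    """Return D = {x in A : x not in f(x)}. For any a in A, D differs
    from f(a) on the element a, so D lies outside the range of f:
    this is the diagonalization proof of Cantor's theorem."""
    return frozenset(x for x in A if x not in f(x))

A = frozenset({0, 1, 2})
f = {0: frozenset(), 1: frozenset({0, 1}), 2: frozenset({1, 2})}
D = diagonal_set(A, f.__getitem__)
print(D)                               # frozenset({0})
print(all(D != f[a] for a in A))       # True: D escapes the range of f
```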
Axioms for the continuum, or smooth real line R. These include the usual axioms for a commutative ring with unit, expressed in terms of two operations + and · and two distinguished elements 0 ≠ 1. In addition we stipulate that R is a local ring, i.e., satisfies the following axiom: ∃y (x · y = 1) ∨ ∃y ((1 – x) · y = 1). Axioms for the strict order relation < on R. These are: 1. a < b and b < c implies a < c. 2. ¬(a < a). 3. a < b implies a + c < b + c for any c. 4. a < b and 0 < c implies ac < bc…
Bernard Bolzano, one of the leading figures of the Bohemian Enlightenment, made important contributions both to mathematics and philosophy which were virtually unknown in his lifetime and are still largely unacknowledged today. As a mathematician, he was a pioneer in the clarification and rigorization of mathematical analysis; as a philosopher, he may be considered a forerunner of the analytic movement later to emerge with Frege and Russell. Rusnock's account of Bolzano's work is laid out in five chapters and two appendices. The introductory first chapter consists of an overview of Bolzano's philosophy and his account of scientific theories. For Bolzano, the importance of such theories—as in mathematics—is their formal structure. Unlike his philosophical predecessors, Bolzano held that the goal of scientific presentation is not to maximize certainty, but rather to set out the objective order of truth. Chapter 2 is an analysis of Bolzano's early work, Contributions to a Better Founded Presentation of Mathematics, in which a number of his characteristic doctrines concerning mathematics are adumbrated. In sharp contradistinction to Kant, Bolzano thought that the methodology of mathematics is logic, and that mathematics should be founded independently of epistemology. In the Contributions Bolzano formulates a ‘thoroughgoing refutation of Kant's account of mathematical knowledge’, repudiating in particular…
I describe two approaches to modelling the universe, the one having its origin in topos theory and differential geometry, the other in set theory. The first is synthetic differential geometry. Traditionally, there have been two methods of deriving the theorems of geometry: the analytic and the synthetic. While the analytical method is based on the introduction of numerical coordinates, and so on the theory of real numbers, the idea behind the synthetic approach is to furnish the subject of geometry with a purely geometric foundation in which the theorems are then deduced by purely logical means from an initial body of postulates. The most familiar examples of the synthetic geometry are classical Euclidean geometry and the synthetic projective geometry introduced by Desargues in the 17th century and revived and developed by Carnot, Poncelet, Steiner and others during the 19th century. The power of analytic geometry derives very largely from the fact that it permits the methods of the calculus, and, more generally, of mathematical analysis, to be introduced into geometry, leading in particular to differential geometry (a term, by the way, introduced in 1894 by the Italian geometer Luigi Bianchi). That being the case, the idea of a “synthetic” differential geometry seems elusive: how can differential geometry be placed on a “purely geometric” or “axiomatic” foundation when the apparatus of the calculus seems inextricably involved? To my knowledge there have been two attempts to develop a synthetic differential geometry. The first was initiated by Herbert Busemann in the 1940s, building on earlier work of Paul Finsler. Here the idea was to build a differential geometry that, in its author’s words, “requires no derivatives”: the basic objects in Busemann’s approach are not differentiable manifolds, but metric spaces of a certain type in which the notion of a geodesic can be defined in an intrinsic manner. I shall not have anything more to say about this approach.
The second approach, that with which I shall be concerned here, was originally proposed in the 1960s by F. W. Lawvere, who was in fact striving to fashion a decisive axiomatic framework for continuum mechanics.
In this paper a number of oppositions which have haunted mathematics and philosophy are described and analyzed. These include the Continuous and the Discrete, the One and the Many, the Finite and the Infinite, the Whole and the Part, and the Constant and the Variable.
We write a ∈ A to indicate that the object a is an element or member of the class A. We assume that every member of a class is an object. Lower-case letters a, b, c, x, y, z, … will always denote objects, and later, sets. Equality between classes is governed by the Axiom of Extensionality.
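The Axiom of Extensionality just mentioned takes its usual form (standard formulation, with capital letters ranging over classes as in the text): classes with exactly the same members are equal.

```latex
% Axiom of Extensionality for classes:
\forall x\,(x \in A \leftrightarrow x \in B) \;\rightarrow\; A = B
```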
On the contrary, I find nothing in logistic but shackles. It does not help us at all in the direction of conciseness, far from it; and if it requires 27 equations to establish that 1 is a number, how many will it require to demonstrate a real theorem?