In this paper, we explore Fregean metatheory, what Frege called the New Science. The New Science arises in the context of Frege’s debate with Hilbert over independence proofs in geometry and we begin by considering their dispute. We propose that Frege’s critique rests on his view that language is a set of propositions, each immutably equipped with a truth value (as determined by the thought it expresses), so to Frege it was inconceivable that axioms could even be considered to be other than true. Because of his adherence to this view, Frege was precluded from the sort of metatheoretical considerations that were available to Hilbert; but from this, we shall argue, it does not follow that Frege was blocked from metatheory in toto. Indeed, Frege suggests in Die Grundlagen der Geometrie a metatheoretical method for establishing independence proofs in the context of the New Science. Frege had reservations about the method, however, primarily because of the apparent need to stipulate the logical terms, those terms that must be held invariant to obtain such proofs. We argue that Frege’s skepticism on this score is not warranted, by showing that within the New Science a characterization of logical truth and logical constant can be obtained by a suitable adaptation of the permutation argument Frege employs in indicating how to prove independence. This establishes a foundation for Frege’s metatheoretical method of which he himself was unsure, and allows us to obtain a clearer understanding of Frege’s conception of logic, especially in relation to contemporary conceptions.
The term "non-monotonic logic" covers a family of formal frameworks devised to capture and represent defeasible inference, i.e., that kind of inference of everyday life in which reasoners draw conclusions tentatively, reserving the right to retract them in the light of further information. Such inferences are called "non-monotonic" because the set of conclusions warranted on the basis of a given knowledge base does not increase (in fact, it can shrink) with the size of the knowledge base itself. This is in contrast to classical (first-order) logic, whose inferences, being deductively valid, can never be "undone" by new information.
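A minimal sketch of this behavior, with an invented `flies` rule and knowledge base (illustrative only, not drawn from any particular non-monotonic formalism):

```python
# Toy illustration of non-monotonic inference: the default
# "birds normally fly" licenses a tentative conclusion that is
# retracted once more is learned. All names are invented.

def flies(kb, animal):
    """Apply the default 'birds fly' unless the KB blocks it."""
    if (animal, "bird") not in kb:
        return None                      # default not applicable
    if (animal, "penguin") in kb:
        return False                     # explicit exception blocks the default
    return True                          # tentative, defeasible conclusion

kb = {("tweety", "bird")}
print(flies(kb, "tweety"))               # the default fires: True

kb.add(("tweety", "penguin"))            # the knowledge base grows...
print(flies(kb, "tweety"))               # ...and the conclusion is retracted: False
```

Note that adding information never adds conclusions here; it can only remove them, which is precisely the failure of monotonicity described above.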
While second-order quantifiers have long been known to admit nonstandard, or “general,” interpretations, first-order quantifiers (when properly viewed as predicates of predicates) also allow a kind of interpretation that does not presuppose the full power-set of the first-order domain. This paper explores such “general” interpretations for (unary) first-order quantifiers in a general setting, emphasizing the effects of imposing various further constraints that the interpretation is to satisfy.
This paper presents a formalization of first-order arithmetic characterizing the natural numbers as abstracta of the equinumerosity relation. The formalization turns on the interaction of a nonstandard cardinality quantifier with an abstraction operator assigning objects to predicates. The project draws its philosophical motivation from a nonreductionist conception of logicism, a deflationary view of abstraction, and an approach to formal arithmetic that emphasizes the cardinal properties of the natural numbers over the structural ones.
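The shape of the abstraction principle at issue can be illustrated as follows (an illustrative rendering, not a quotation from the paper: here § stands for the abstraction operator assigning objects to predicates, ≈ for equinumerosity, and 𝖥 for the cardinality quantifier):

```latex
% Illustrative only: numbers as abstracta of equinumerosity.
% \S is the abstraction operator, \approx equinumerosity.
\[
  \S x\,F(x) = \S x\,G(x) \;\longleftrightarrow\; F \approx G
\]
% With a (nonstandard) cardinality quantifier \mathsf{F} expressing
% "there are exactly as many Fs as Gs," equinumerosity is itself
% directly expressible at first order:
\[
  F \approx G \;:\equiv\; \mathsf{F}x\,\bigl(F(x), G(x)\bigr)
\]
```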
In this paper we apply the idea of Revision Rules, originally developed within the framework of the theory of truth and later extended to a general mode of definition, to the analysis of the arithmetical hierarchy. This is also intended as an example of how ideas and tools from philosophical logic can provide a different perspective on mathematically more “respectable” entities. Revision Rules were first introduced by A. Gupta and N. Belnap as tools in the theory of truth, and they have been further developed to provide the foundations for a general theory of (possibly circular) definitions. Revision Rules are non-monotonic inductive operators that are iterated into the transfinite beginning with some given “bootstrapper” or “initial guess.” Since their iteration need not give rise to an increasing sequence, Revision Rules require a particular kind of operation of “passage to the limit,” which is a variation on the idea of the inferior limit of a sequence. We then define a sequence of sets of strictly increasing arithmetical complexity, and provide a representation of these sets by means of an operator G(x, φ) whose “revision” is carried out over ω2 beginning with any total function satisfying certain relatively simple conditions. Even this relatively simple constraint is later lifted, in a theorem whose proof is due to Anil Gupta.
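The flavor of finite-stage revision can be sketched as follows (a toy example with an invented operator `delta`, far simpler than the operator G(x, φ) of the paper, and showing only finitely many stages with no transfinite limit step):

```python
# Toy revision sequence for the circular clause
#   n is G  iff  n = 0 or n-1 is not G
# over a small finite universe. delta recomputes the extension of G
# from the previous hypothesis; revision starts from an "initial guess."

UNIVERSE = range(6)

def delta(hypothesis):
    """One revision step: re-evaluate the circular clause."""
    return {n for n in UNIVERSE if n == 0 or (n - 1) not in hypothesis}

def revise(initial, stages):
    """Iterate the revision operator, recording each hypothesis."""
    h, history = set(initial), [set(initial)]
    for _ in range(stages):
        h = delta(h)
        history.append(h)
    return history

# Starting from the empty initial guess, the sequence is not increasing,
# but it stabilizes on the set the definition intuitively picks out:
# the even numbers in the universe.
history = revise(set(), 10)
print(sorted(history[-1]))               # [0, 2, 4]
```

Because delta is non-monotonic, intermediate hypotheses overshoot and are pruned back; stabilization, not monotone growth, is what singles out the definition's extension.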
A propositional system of modal logic is second-order if it contains quantifiers ∀p and ∃p, which, in the standard interpretation, are construed as ranging over sets of possible worlds (propositions). Most second-order systems of modal logic are highly intractable; for instance, when augmented with propositional quantifiers, K, B, T, K4 and S4 all become effectively equivalent to full second-order logic. An exception is S5, which, being interpretable in monadic second-order logic, is decidable.
The purpose of this note is to present a simplification of the system of arithmetical axioms given in previous work; specifically, it is shown how the induction principle can in fact be obtained from the remaining axioms, without the need of explicit postulation. The argument might be of more general interest, beyond the specifics of the proposed axiomatization, as it highlights the interaction of the notion of Dedekind-finiteness and the induction principle.
This paper presents a bivalent extensional semantics for positive free logic without resorting to the philosophically questionable device of using models endowed with a separate domain of “non-existing” objects. The models here introduced have only one domain, and a partial reference function for the singular terms. Such an approach provides a solution to an open problem put forward by Lambert, and can be viewed as supplying a version of parametrized truth not unlike the notion of “truth at a world” found in modal logic. A model theory is developed, establishing compactness, interpolation, and completeness.
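The one-domain idea can be illustrated with a toy model (a deliberately crude sketch with invented names, not the paper's parametrized construction): reference is a partial function, and atomic truth for non-denoting terms is settled by stipulation, as positive free logic permits.

```python
# Toy single-domain model with a partial reference function.
# "vulcan" fails to denote; yet some atomic sentences about it
# are (stipulated to be) true, as positive free logic allows.

domain = {"venus", "mars"}
ref = {"hesperus": "venus", "phosphorus": "venus"}   # partial: no entry for "vulcan"

extension = {("orbits_sun", "venus"), ("orbits_sun", "mars")}
stipulated = {("fictional", "vulcan")}               # atomic truths with empty terms

def holds(pred, term):
    """Bivalent evaluation of an atomic sentence pred(term)."""
    if term in ref:                                  # denoting term: consult the extension
        return (pred, ref[term]) in extension
    return (pred, term) in stipulated                # empty term: consult the stipulation

print(holds("orbits_sun", "hesperus"))               # True
print(holds("fictional", "vulcan"))                  # True, though "vulcan" is empty
print(holds("orbits_sun", "vulcan"))                 # False -- bivalence preserved
```

Every atomic sentence comes out either true or false, with no second domain of "non-existents" anywhere in the model.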
This paper introduces a generalization of Reiter’s notion of “extension” for default logic. The main difference from the original version lies in the way conflicts among defaults are handled: in particular, this notion of “general extension” allows defaults not explicitly triggered to pre-empt other defaults. A consequence of the adoption of such a notion of extension is that the collection of all the general extensions of a default theory turns out to have a nontrivial algebraic structure. This fact has two major technical fall-outs: first, it turns out that every default theory has a general extension; second, general extensions allow one to define a well-behaved, skeptical relation of defeasible consequence for default theories, satisfying the principles of Reflexivity, Cut, and Cautious Monotonicity formulated by D. Gabbay.
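For contrast with the generalization described, the original Reiter-style construction can be sketched in miniature (a toy over atomic literals, with entailment and consistency reduced to set membership; invented names throughout, and not the paper's "general extension"):

```python
# Miniature Reiter-style extension for normal defaults over literals.
# A default (pre, just, concl) fires when pre has been derived and the
# negation of just has not been derived.

def neg(lit):
    """Negate a literal, written with a '~' prefix."""
    return lit[1:] if lit.startswith("~") else "~" + lit

def extension(facts, defaults):
    """Closure of the facts under the applicable defaults."""
    e = set(facts)
    changed = True
    while changed:
        changed = False
        for pre, just, concl in defaults:
            if pre in e and neg(just) not in e and concl not in e:
                e.add(concl)
                changed = True
    return e

# "Birds normally fly": the default fires...
print(sorted(extension({"bird"}, [("bird", "flies", "flies")])))
# ...unless blocked by contrary information already among the facts.
print(sorted(extension({"bird", "~flies"}, [("bird", "flies", "flies")])))
```

In this toy setting a default can only be blocked by information already derived; allowing defaults that are not themselves triggered to pre-empt others is exactly where the general extensions described above depart from it.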
Henry Leonard and Karel Lambert first introduced so-called presupposition-free (or simply: free) logics in the 1950s in order to provide a logical framework allowing for non-denoting singular terms (be they descriptions or constants) such as “the largest prime” or “Pegasus” (see Leonard and Lambert). Of course, ever since Russell’s paradigmatic treatment of definite descriptions (Russell), philosophers have had a way to deal with such terms. A sentence such as “the…
In §21 of Grundgesetze der Arithmetik, Frege asks us to consider the quantified forms “∀a (a² = 4)” and “∀a (a > 0)” (in Frege’s concavity notation) and notices that they can be obtained from “∀a φ(a)” by replacing the function-name placeholder “φ(ξ)” with names for the functions ξ² = 4 and ξ > 0 (the placeholder cannot be replaced by names of objects or by names of functions of two arguments).
This paper is concerned with the way different axiom systems for set theory can be justified by appeal to such intuitions as limitation of size, predicativity, stratification, etc. While none of the different conceptions historically resulting from the impetus to provide a solution to the paradoxes turns out to rest on an intuition providing an unshakeable foundation, each supplies a picture of the set-theoretic universe that is both useful and internally well motivated. The same is true of more recently proposed axiom systems for non-well-founded universes, and an attempt is made to motivate such axiom systems on the basis of an old and respected ‘algebraic’ intuition.
Many different modes of definition have been proposed over time, but none of them allows for circular definitions, since, according to the prevalent view, the term defined would then be lacking a precise signification. I argue that although circular definitions may at times fail uniquely to pick out a concept or an object, sense can still be made of them by using a rule of revision in the style adopted by Anil Gupta and Nuel Belnap in the theory of truth.
In this paper we introduce the notion of a set algebra S satisfying a system E of equations. After defining a notion of freeness for such algebras, we show that, for any system E of equations, set algebras that are free in the class of structures satisfying E exist and are unique up to a bisimulation. Along the way, analogues of classical set-theoretic and algebraic properties are investigated.
Logic is an ancient discipline that, ever since its inception some 2500 years ago, has been concerned with the analysis of patterns of valid reasoning. Aristotle first developed the theory of the syllogism (a valid argument form involving predicates and quantifiers), and later the Stoics singled out patterns of propositional argumentation (involving sentential connectives). The study of logic flourished in ancient times and during the middle ages, when logic was regarded, together with grammar and rhetoric (the other two disciplines of the trivium), as the foundation of humanistic education.
One of the most important developments over the last twenty years, both in logic and in Artificial Intelligence, is the emergence of so-called non-monotonic logics. These logics were initially developed by McCarthy, McDermott & Doyle, and Reiter. Part of the original motivation was to provide a formal framework within which to model cognitive phenomena such as defeasible inference and defeasible knowledge representation, i.e., to provide a formal account of the fact that reasoners can reach conclusions tentatively, reserving the right to retract them in the light of further information.
The emergence, over the last twenty years or so, of so-called “non-monotonic” logics represents one of the most significant developments both in logic and artificial intelligence. These logics were devised in order to represent defeasible reasoning, i.e., that kind of inference in which reasoners draw conclusions tentatively, reserving the right to retract them in the light of further evidence.
Like Elvis, logical empiricism has been officially dead for decades. But just like Elvis, it stubbornly keeps resurfacing at one juncture or another in our philosophical landscape. In fact, the more the main characters of logical empiricism recede in the distance, the more frequently they reappear, to the point that it’s fair to say that we are witnessing a veritable renaissance in studies leading to the historical appraisal of the import and influence of the logical empiricist movement.