The Bounds of Logic presents a new philosophical theory of the scope and nature of logic based on critical analysis of the principles underlying modern Tarskian logic and inspired by mathematical and linguistic developments. Extracting central philosophical ideas from Tarski’s early work in semantics, Sher questions whether these are fully realized by the standard first-order system. The answer lays the foundation for a new, broader conception of logic. By generally characterizing logical terms, Sher establishes a fundamental result in semantics. Her development of the notion of logicality for quantifiers and her work on branching are of great importance for linguistics. Sher outlines the boundaries of the new logic and points out some of the philosophical ramifications of the new view of logic for such issues as the logicist thesis, ontological commitment, the role of mathematics in logic, and the metaphysical underpinning of logic. She proposes a constructive definition of logical terms, reexamines and extends the notion of branching quantification, and discusses various linguistic issues and applications.
Gila Sher approaches knowledge from the perspective of the basic human epistemic situation—the situation of limited yet resourceful beings, living in a complex world and aspiring to know it in its full complexity. What principles should guide them? Two fundamental principles of knowledge are epistemic friction and freedom. Knowledge must be substantially constrained by the world (friction), but without active participation of the knower in accessing the world (freedom) theoretical knowledge is impossible. This requires a grounding of all knowledge, empirical and abstract, in both mind and world, but the fall of traditional foundationalism has led many to doubt the viability of this ‘classical’ project. Sher challenges this skepticism, charting a new foundational methodology, foundational holism, that differs from others in being holistic, world-oriented, and universal (i.e., applicable to all fields of knowledge). Using this methodology, Epistemic Friction develops an integrated theory of knowledge, truth, and logic. This includes (i) a dynamic model of knowledge, incorporating some of Quine’s revolutionary ideas while rejecting his narrow empiricism, (ii) a substantivist, non-traditional correspondence theory of truth, and (iii) an outline of a joint grounding of logic in mind and world. The model of knowledge subjects all disciplines to demanding norms of both veridicality and conceptualization. The correspondence theory is robust and universal yet not simplistic or naive, admitting diverse forms of correspondence. Logic’s grounding in the world brings it in line with other disciplines while preserving, and explaining, its strong formality, necessity, generality, and normativity.
Properties and relations in general have a certain degree of invariance, and some types of properties/relations have a stronger degree of invariance than others. In this paper I will show how the degrees of invariance of different types of properties are associated with, and explain, the modal force of the laws governing them. This explains differences in the modal force of laws/principles of different disciplines, starting with logic and mathematics and proceeding to physics and biology.
The problem that motivates me arises from a constellation of factors pulling in different, sometimes opposing directions. Simplifying, they are: (1) The complexity of the world; (2) Humans’ ambitious project of theoretical knowledge of the world; (3) The severe limitations of humans’ cognitive capacities; (4) The considerable intricacy of humans’ cognitive capacities. Given these circumstances, the question arises whether a serious notion of truth is applicable to human theories of the world. In particular, I am interested in the questions: (a) Is a substantive standard of truth for human theories of the world possible? (b) What kind of standard would that be?
The construction of a systematic philosophical foundation for logic is a notoriously difficult problem. In Part One I suggest that the problem is in large part methodological, having to do with the common philosophical conception of “providing a foundation”. I offer an alternative to the common methodology which combines a strong foundational requirement with the use of non-traditional, holistic tools to achieve this result. In Part Two I delineate an outline of a foundation for logic, employing the new methodology. The outline is based on an investigation of why logic requires a veridical justification, i.e., a justification which involves the world and not just the mind, and what features or aspect of the world logic is grounded in. Logic, the investigation suggests, is grounded in the formal aspect of reality, and the outline proposes an account of this aspect, the way it both constrains and enables logic, logic's role in our overall system of knowledge, the relation between logic and mathematics, the normativity of logic, the characteristic traits of logic, and error and revision in logic.
The paper delineates a new approach to truth that falls under the category of “Pluralism within the bounds of correspondence”, and illustrates it with respect to mathematical truth. Mathematical truth, like all other truths, is based on correspondence, but the route of mathematical correspondence differs from other routes of correspondence in (i) connecting mathematical truths to a special aspect of reality, namely, its formal aspect, and (ii) doing so in a complex, indirect way, rather than in a simple and direct way. The underlying idea is that an intricate mind is capable of creating intricate routes from language to reality, and this enables it to apply correspondence principles in areas for which correspondence is traditionally thought to be problematic.
Substantivism is a general philosophical methodology advocating a substantive approach to philosophical theorizing. In this article, I present an overview of this methodology with a special emphasis on the field of truth. I begin with a framework for understanding what is at stake in the substantivist–deflationist debate and describe the substantivist critique of deflationism. I then proceed to discuss contemporary substantivism as a positive methodology, present examples of recent substantivist theories of truth, delineate several principles of philosophical substantivism, and connect it to contemporary thought about the nature and methods of philosophy. Due to limitations of space, I am unable to discuss all the forms contemporary substantivism has taken. But I try to give a clear sense of the central principles, challenges, and promise of this methodology.
The paper offers a new analysis of the difficulties involved in the construction of a general and substantive correspondence theory of truth and delineates a solution to these difficulties in the form of a new methodology. The central argument is inspired by Kant, and the proposed methodology is explained and justified both in general philosophical terms and by reference to a particular variant of Tarski's theory. The paper begins with general considerations on truth and correspondence and concludes with a brief outlook on the "family" of theories of truth generated by the new methodology.
Gila Sher interviewed by Chen Bo:

I. Academic Background and Earlier Research: 1. Sher’s early years. 2. Intellectual influences: Kant, Quine, and Tarski. 3. Origin and main ideas of The Bounds of Logic. 4. Branching quantifiers and IF logic. 5. Preparation for the next step.

II. Foundational Holism and a Post-Quinean Model of Knowledge: 1. General characterization of foundational holism. 2. Circularity, infinite regress, and philosophical arguments. 3. Comparing foundational holism and foundherentism. 4. A post-Quinean model of knowledge. 5. Intellect and figuring out. 6. Comparing foundational holism with Quine’s holism. 7. Evaluation of Quine’s philosophy.

III. Substantive Theory of Truth and Relevant Issues: 1. Outline of Sher’s substantive theory of truth. 2. Criticism of deflationism and treatment of the Liar. 3. Comparing Sher’s substantive theory of truth with Tarski’s theory of truth.

IV. A New Philosophy of Logic and Comparison with Other Theories: 1. Foundational account of logic. 2. Standard of logicality, set theory and logic. 3. Psychologism, Hanna’s and Maddy’s conceptions of logic. 4. Quine’s theses about the revisability of logic.

V. Epilogue.
The paper presents an outline of a unified answer to five questions concerning logic: (1) Is logic in the mind or in the world? (2) Does logic need a foundation? What is the main obstacle to a foundation for logic? Can it be overcome? (3) How does logic work? What does logical form represent? Are logical constants referential? (4) Is there a criterion of logicality? (5) What is the relation between logic and mathematics?
This is a critique of Michael P. Lynch’s functional pluralism with respect to truth. The paper is sympathetic to Lynch’s overall approach to truth, but is critical of (i) his platitudinous characterization of the general principles of truth, (ii) his excessive pluralism with respect to the “realizers” of truth, (iii) his treatment of atomic truth, and (iv) his analysis of “mixed” logical inferences. The paper concludes with a proposal for a functional pluralism that puts greater emphasis on the unity of truth. For example: while Lynch regards truth as based on correspondence principles in some domains and on coherence principles in others, the current proposal regards truth as based on correspondence principles in all domains, restricting the plurality of truth to a plurality of correspondence principles.
Although the invariance criterion of logicality first emerged as a criterion of a purely mathematical interest, it has developed into a criterion of considerable linguistic and philosophical interest. In this paper I compare two different perspectives on this criterion. The first is the perspective of natural language. Here, the invariance criterion is measured by its success in capturing our linguistic intuitions about logicality and explaining our logical behavior in natural-linguistic settings. The second perspective is more theoretical. Here, the invariance criterion is used as a tool for developing a theoretical foundation of logic, focused on a critical examination, explanation, and justification of its veridicality and modal force.
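The invariance criterion referred to here is standardly formulated as invariance under bijections of the domain (isomorphisms of structures). A minimal sketch for the simplest case, a 1-place (type ⟨1⟩) quantifier, under that standard formulation:

```latex
% A type <1> quantifier Q assigns to each domain D a family Q_D of subsets of D.
% Invariance criterion (sketch): Q is logical iff for all domains D, D',
% every bijection f between them, and every A \subseteq D:
\forall D, D' \;\; \forall f\colon D \xrightarrow{\ \sim\ } D' \;\; \forall A \subseteq D :
\quad A \in Q_D \;\Longleftrightarrow\; f[A] \in Q_{D'}
% Example: "most" is invariant, since |A| > |D \setminus A| is preserved by
% bijections; a predicate like "is red" is not, since bijections need not
% preserve redness.
```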
Branching quantifiers were first introduced by L. Henkin in his 1959 paper ‘Some Remarks on Infinitely Long Formulas’. By ‘branching quantifiers’ Henkin meant a new, non-linearly structured quantifier-prefix whose discovery was triggered by the problem of interpreting infinitistic formulas of a certain form. The branching (or partially-ordered) quantifier-prefix is, however, not essentially infinitistic, and the issues it raises have largely been discussed in the literature in the context of finitistic logic, as they will be here. Our discussion transcends, however, the resources of standard 1st-order languages, and we will consider the new form in the context of 1st-order logic with 1- and 2-place ‘Mostowskian’ generalized quantifiers.
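Henkin's simplest branching prefix, and its standard second-order (Skolem-function) reading, can be sketched as follows; this is the textbook formulation of the construction the abstract describes:

```latex
% Henkin's branching (partially ordered) quantifier prefix:
\begin{pmatrix}
  \forall x \, \exists y \\
  \forall z \, \exists w
\end{pmatrix}
\varphi(x, y, z, w)
% Standard Skolem-function reading: y depends only on x, and w only on z,
% so the two rows are informationally independent:
\exists f \, \exists g \, \forall x \, \forall z \; \varphi(x, f(x), z, g(z))
% No linear first-order prefix captures this dependency pattern, which is
% why the branching form properly extends standard 1st-order logic.
```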
Philosophers are divided on whether the proof- or truth-theoretic approach to logic is more fruitful. The paper demonstrates the considerable explanatory power of a truth-based approach to logic by showing that and how it can provide (i) an explanatory characterization (both semantic and proof-theoretical) of logical inference, (ii) an explanatory criterion for logical constants and operators, (iii) an explanatory account of logic’s role (function) in knowledge, as well as explanations of (iv) the characteristic features of logic (formality, strong modal force, generality, topic neutrality, basicness, and (quasi-)apriority), (v) the veridicality of logic and its applicability to science, (vi) the normativity of logic, (vii) error, revision, and expansion in/of logic, and (viii) the relation between logic and mathematics. The high explanatory power of the truth-theoretic approach does not rule out an equal or even higher explanatory power of the proof-theoretic approach. But to the extent that the truth-theoretic approach is shown to be highly explanatory, it sets a standard for other approaches to logic, including the proof-theoretic approach.
The viability of metaphysics as a field of knowledge has been challenged time and again. But in spite of the continuing tendency to dismiss metaphysics, there has been considerable progress in this field in the 20th and 21st centuries. One of the newest (though, in a sense, also oldest) frontiers of metaphysics is the grounding project. In this paper I raise a methodological challenge to the new grounding project and propose a constructive solution. Both the challenge and its solution apply to metaphysics in general, but grounding theory puts the challenge in an especially sharp focus. The solution consists of a new methodology, holistic grounding or holistic metaphysics. This methodology is modeled after a recent epistemic methodology, foundational holism, that enables us to pursue the foundational project of epistemology without being hampered by the problems associated with foundationalism.
Attention to the conversational role of alethic terms seems to dominate, and even sometimes exhaust, many contemporary analyses of the nature of truth. Yet, because truth plays a role in judgment and assertion regardless of whether alethic terms are expressly used, such analyses cannot be comprehensive or fully adequate. A more general analysis of the nature of truth is therefore required – one which continues to explain the significance of truth independently of the role alethic terms play in discourse. We undertake such an analysis in this paper; in particular, we start with certain elements from Kant and Frege, and develop a construct of truth as a normative modality of cognitive acts (e.g., thought, judgment, assertion). Using the various biconditional T-schemas to sanction the general passage from assertions to (equivalent) assertions of truth, we then suggest that an illocutionary analysis of truth can contribute to its locutionary analysis as well, including the analysis of diverse constructions involving alethic terms that have been largely overlooked in the philosophical literature. Finally, we briefly indicate the importance of distinguishing between alethic and epistemic modalities.
The paper argues that a philosophically informative and mathematically precise characterization of logical constants is possible by (i) describing a particular proposal for such a characterization, (ii) showing that certain criticisms of this proposal are incorrect, and (iii) discussing the general issue of what a characterization of logical constants aims at achieving.
In this paper I investigate Putnam’s model-theoretic argument from a transcendent standpoint, in spite of Putnam’s well-known objections to such a standpoint. This transcendence, however, requires ascent to something more like a Tarskian meta-level than what Putnam regards as a “God’s eye view”. Still, it is methodologically quite powerful, leading to a significant increase in our investigative tools. The result is a shift from Putnam’s skeptical conclusion to a new understanding of realism, truth, correspondence, knowledge, and theories, or certain aspects thereof, based on, among other things, a better understanding of what models are designed (and not designed) to do.
In a recent paper, “The Concept of Logical Consequence,” W. H. Hanson criticizes a formal-structural characterization of logical consequence in Tarski and Sher. Hanson accepts many principles of the formal-structural view. Relating to Sher 1991 and 1996a, he says.
Knowledge requires both freedom and friction. Freedom to set up our epistemic goals, choose the subject matter of our investigations, espouse cognitive norms, design research programs, etc., and friction (constraint) coming from two directions: the object or target of our investigation, i.e., the world in a broad sense, and our mind as the sum total of constraints involving the knower. My goal is to investigate the problem of epistemic friction, the relation between epistemic friction and freedom, the viability of foundationalism as a solution to the problem of friction, an alternative solution in the form of a neo-Quinean model, and the possibility of solving the problem of friction as it applies to logic and the philosophy of logic within that model.
In the early part of the 20th century the logical positivists launched a powerful attack on traditional philosophy, rejecting the very idea of philosophy as a substantive discipline and replacing it with a practical, conventionalist, meta-theoretical view of philosophy. The positivist critique was based on a series of dichotomies: the analytic vs. the synthetic, the external vs. the internal, the apriori vs. the empirical, the meta-theoretical vs. the object-theoretical, the conventional vs. the factual. Quine's attack on the positivists' dichotomies was, by extension (if not by intention), also an attack on their critique of philosophy. Quine's own theory, however, in time took an extreme naturalistic turn that, if anything, deepened the schism between philosophy and knowledge.

In this paper I show that many of Quine's early philosophical ideas - his denial of the analytic-synthetic distinction, his thesis of the interconnectedness of knowledge, his universal revisability thesis, his principle of the inseparability of language and theory, his methodological pragmatism, and his realism - are in fact compatible with a substantive philosophy. Moreover, certain inner tensions in Quine's theory naturally lead to a new model of knowledge in which philosophy plays a substantive role: not as a "first philosophy", or as a "meta-science", or as a "chapter in psychology", but as an independent discipline in its own right, alongside, continuous with, and complementary to, science.
In this paper I reconstruct David Foster Wallace’s argument against fatalism in his undergraduate honors thesis, “Richard Taylor’s ‘Fatalism’ and the Semantics of Physical Modality”. My goal is to present the argument in a clear and concise way, so that it is easy to see its main line of reasoning and potential power. A secondary goal is to offer clarificatory and critical notes on some of the issues at stake. The reconstruction reveals interesting connections between Wallace’s argument and John MacFarlane’s recent work on relative truth.
Confronting the Liar Paradox is commonly viewed as a prerequisite for developing a theory of truth. In this paper I turn the tables on this traditional conception of the relation between the two. The theorist of truth need not constrain his search for a “material” theory of truth, i.e., a theory of the philosophical nature of truth, by committing himself to one solution or another to the Liar Paradox. If he focuses on the nature of truth (leaving issues of formal consistency for a later stage), he can arrive at material principles that prevent the Liar Paradox from arising in the first place. I argue for this point both on general methodological grounds and by example. The example is based on a substantivist theory of truth that emphasizes the role of truth in human cognition. The key point is that truth requires a certain complementarity of “immanence” and “transcendence”, and this means that some hierarchical structure is inherent in truth. Approaching the Liar Paradox from this perspective throws new light on its existent solutions: their differences and commonalities, their purported ad-hocness, and the relevance of natural language and bivalence to truth and the Liar.
Jennifer Hornsby’s 1997 paper, ‘Truth: The Identity Theory’, has been highly influential in making the identity theory of truth a viable option in contemporary philosophy. In this introduction and commentary I focus on what distinguishes her theory and its methodology from the correspondence theory and the ‘substantivist’ methodology, and on other issues that have not been widely discussed in earlier commentaries yet are central to the current debate on truth.
Following Henkin's discovery of partially-ordered (branching) quantification (POQ) with standard quantifiers in 1959, philosophers of language have attempted to extend his definition to POQ with generalized quantifiers. In this paper I propose a general definition of POQ with 1-place generalized quantifiers of the simplest kind: namely, predicative, or "cardinality" quantifiers, e.g., "most", "few", "finitely many", "exactly α", where α is any cardinal, etc. The definition is obtained in a series of generalizations, extending the original, Henkin definition first to a general definition of monotone-increasing (M↑) POQ and then to a general definition of generalized POQ, regardless of monotonicity. The extension is based on (i) Barwise's 1979 analysis of the basic case of M↑ POQ and (ii) my 1990 analysis of the basic case of generalized POQ. POQ is a non-compositional 1st-order structure, hence the problem of extending the definition of the basic case to a general definition is not trivial. The paper concludes with a sample of applications to natural and mathematical languages.
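The basic case of M↑ POQ that the abstract attributes to Barwise's 1979 analysis is commonly rendered as follows; the sketch below gives that standard rendering for a branching pair of monotone-increasing 1-place quantifiers:

```latex
% Branching pair of monotone-increasing quantifiers Q_1, Q_2 (Barwise-style reading):
\begin{pmatrix} Q_1 x \\ Q_2 y \end{pmatrix} \varphi(x, y)
\;\equiv\;
\exists X \, \exists Y \,
  \bigl[\, Q_1 x \; X(x)
  \;\wedge\; Q_2 y \; Y(y)
  \;\wedge\; \forall x \, \forall y \, \bigl( X(x) \wedge Y(y) \rightarrow \varphi(x, y) \bigr) \,\bigr]
% i.e., there are witness sets X and Y, each "large enough" for its quantifier,
% whose full Cartesian product satisfies phi. Monotonicity (M-up) ensures that
% passing to the witness sets does not change the quantifiers' truth conditions.
```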
The paper seeks to answer two new questions about truth and scientific change: What lessons does the phenomenon of scientific change teach us about the nature of truth? What light do recent developments in the theory of truth, incorporating these lessons, throw on problems arising from the prevalence of scientific change, specifically, the problem of pessimistic meta-induction?
Kant is known for having said relatively little about truth in Critique of Pure Reason. Nevertheless, there are important lessons to be learned from this work about truth, lessons that apply to the contemporary debate on the nature and structure of truth and its theory. In this paper I suggest two such lessons. The first lesson concerns the structure of a substantive theory of truth as contrasted with a deflationist theory; the second concerns the structure of a correspondence theory of truth. The first lesson warns us against conceiving of a substantive theory of truth in a way that led Kant to conclude that such a theory is unviable. In so doing it indirectly suggests what a viable substantive theory of truth would be like. The second lesson teaches us that a correspondence theory of truth need not be as naive and overly simplistic as it is usually thought to be, but can, and should be, far more complex. This lesson is based on a correspondence theory of truth incipient in the Critique, one whose structure reflects the complexity of the relation between mind and world in Kant’s theory.
This collection of new essays offers a 'state-of-the-art' conspectus of major trends in the philosophy of logic and philosophy of mathematics. A distinguished group of philosophers addresses issues at the centre of contemporary debate: semantic and set-theoretic paradoxes, the set/class distinction, foundations of set theory, mathematical intuition and many others. The volume includes Hilary Putnam's 1995 Alfred Tarski lectures, published here for the first time.
In this paper I examine a cluster of concepts relevant to the methodology of truth theories: 'informative definition', 'recursive method', 'semantic structure', 'logical form', 'compositionality', etc. The interrelations between these concepts, I will try to show, are more intricate and multi-dimensional than commonly assumed.
In this paper I develop a new conception of Tarskian logic based on Tarski’s intuitive characterization of logical consequence as formal and necessary in his 1936 paper. Special emphasis is placed on the role of logic in our system of knowledge, the origins of semantics, the semantic definition of logical consequence, and the role of logical and non-logical terms in a logical system. The paper offers a new definition of logical terms based on the question: what division of terms into logical and extra-logical would yield a logical system that satisfies Tarski’s intuitive characterization of logical consequence in complete generality? I discuss the consequences of the new conception for revision in logic, the logicist thesis, and the relation between logic and mathematics. I offer a proof-theoretic perspective on the semantic conception delineated in this paper. And I conclude with a postscript on Tarski’s lecture, “What are Logical Notions?”, which was published shortly after the present conception of logic was developed.
The question motivating my investigation is: Are the basic philosophical principles underlying the "core" system of contemporary logic exhausted by the standard version? In particular, is the accepted narrow construal of the notion "logical term" justified?

As a point of comparison I refer to systems of 1st-order logic with generalized quantifiers developed by mathematicians and linguists. Based on an analysis of the Tarskian conception of the role of logic I show that the standard division of terms into logical and extra-logical is partly arbitrary. I argue that the semantic principles of Tarskian logic allow any higher-order mathematical predicate or relation to function as a logical term in a 1st-order system, provided it is introduced in the right way into the syntactic-semantic apparatus.

Formally, I propose a new, "constructive", semantic characterization of logical terms. A logical term is defined by a function which, given a model, "shows" how to construct relations, sets, or n-tuples of individuals which satisfy it. I discuss the linguistic applicability of the extended logic and bring numerous examples.

One chapter is devoted to branching quantification, a non-standard logico-linguistic construction obtained by affixing a non-linear, partially-ordered quantifier-prefix to a well-formed formula. Although standard branching quantifiers were given a complete account by Henkin and Walkoe, it is not altogether clear what the meaning of branching generalized quantifiers is. I propose a new analysis of the generalized branching prefix which extends the existent, partial definitions due to Barwise, and I develop a generalization leading to a "family" of branching structures.

In conclusion I discuss the philosophical impact of the generalized conception of logic on such issues as the logicist thesis, the interaction between mathematics and logic, ontological commitment via logic, metaphysics and logical semantics.
In this paper I present an outline of a model of knowledge that complements, and is complemented by, the conception of logic delineated in The Bounds of Logic. The Bounds of Logic had as its goal a critical, systematic and constructive understanding of logic. As such it aimed at maximum neutrality vis-a-vis epistemic, metaphysical and meta-mathematical controversies. But a conception of logic does not exist in a vacuum. Eventually our goal is to produce an account of logic that answers the needs of, contributes to the development of, and is supported by, a broader epistemology. In this paper I make first steps in this direction. I begin with an outline of a model of knowledge whose basic principles are based on the early Quine. I identify, and offer independent justification for, the special requirements this model sets on an adequate conception of logic. Finally, I show how, by satisfying these requirements, the conception of logic delineated in The Bounds of Logic can naturally be incorporated in this epistemic model. Logic, in this model, is both central and peripheral, is guided by both veridical and pragmatic norms, is both grounded in the world and assigned a central role in unifying our body of knowledge.
Logic and mathematics are abstract disciplines par excellence. What is the nature of truth and knowledge in these disciplines? In this paper I investigate the possibility of a new approach to this question. The underlying idea is that knowledge qua knowledge, including logical and mathematical knowledge, has a dual grounding in mind and reality, and the standard of truth applicable to all knowledge is a correspondence standard. This applies to logic and mathematics as much as to other disciplines; i.e., logical and mathematical truth are based on correspondence. But the view that logical and mathematical truth are (i) based on correspondence and (ii) require a grounding in reality demands a change in the common conception of both correspondence and epistemic grounding.