In this book Terence Parsons revives the older tradition of taking such objects at face value. Using various modern techniques from logic and the philosophy of language, he formulates a metaphysical theory of nonexistent objects. The theory is given a formalization in symbolism rich enough to contain definite descriptions, modal operators, and epistemic contexts, and the book includes a discussion which relates the formalized theory explicitly to English.
This extended investigation of the semantics of event (and state) sentences in their various forms is a major contribution to the semantics of natural language, simultaneously encompassing important issues in linguistics, philosophy, and logic. It develops the view that the logical forms of simple English sentences typically contain quantification over events or states and shows how this view can account for a wide variety of semantic phenomena. Focusing on the structure of meaning in English sentences at a "subatomic" level, that is, a level below the one most theories accept as basic or "atomic", Parsons asserts that the semantics of simple English sentences require logical forms somewhat more complex than is normally assumed in natural language semantics. His articulation of underlying event theory explains a wide variety of apparently diverse semantic characteristics of natural language, and his development of the theory shows the importance of the distinction between events and states. Parsons also demonstrates that verbs indicate kinds of actions rather than specific, individual actions, and he argues that verb phrases depend on modifiers to make their function and meaning in a sentence specific. An appendix gives many of the details needed to formalize the theory discussed in the body of the text and provides a series of templates that permit the generation of atomic formulas of English.
Terence Parsons presents a lively and controversial study of philosophical questions about identity. Because many puzzles about identity remain unsolved, some people believe that such questions have no answers and that there is a problem with the language used to formulate them. Parsons explores a different possibility: that such puzzles lack answers because of the way the world is (or because of the way the world is not). He claims that there is genuine indeterminacy of identity in the world. He articulates such a view in detail and defends it from a host of criticisms.
Terence Parsons presents a new study of the development and continuing value of medieval logic, which expanded Aristotle's basic principles of logic in important ways. Parsons argues that the resulting system is as rich as contemporary first-order symbolic logic.
This entry traces the historical development of the Square of Opposition, a collection of logical relationships traditionally embodied in a square diagram. This body of doctrine provided a foundation for work in logic for over two millennia. For most of this history, logicians assumed that negative particular propositions ("Some S is not P") are vacuously true if their subjects are empty. This assumption validates the logical laws embodied in the diagram and preserves the doctrine against modern criticisms. Certain additional principles ("contraposition" and "obversion") were sometimes adopted along with the Square, and they genuinely yielded inconsistency. By the nineteenth century an inconsistent set of doctrines was widely adopted. Strawson's 1952 attempt to rehabilitate the Square does not apply to the traditional doctrine; it does salvage the nineteenth-century version, but at the cost of yielding inferences that lead from truth to falsity when strung together.
In In Contradiction, Graham Priest shows, as clearly as anything like this can be shown, that it is coherent to maintain that some sentences can be both true and false at the same time. As a consequence, some contradictions are true, and an appreciation of this possibility advances our understanding of the nature of logic and language.
This paper has two goals. The first is to formulate an adequate account of the semantics of the progressive aspect in English: the semantics of Agatha is making a cake, as opposed to Agatha makes a cake. This account presupposes a version of the so-called Aristotelian classification of verbs in English into EVENT, PROCESS and STATE verbs. The second goal of this paper is to refine this classification so as to account for the infamous category-switch problem: the problem of how it is that modification of a verb like run by an adverbial like to the store can turn a PROCESS phrase (run) into an EVENT phrase (run to the store). Views discussed include those of Åqvist, Bach, Bennett, Bennett and Partee, Dowty, Montague and Scott, and Vendler.
This paper consists principally of selections from a much longer work on the semantics of English. It discusses some problems concerning how to represent grammatical modifiers (e.g. slowly in x drives slowly) in a logically perspicuous notation. A proposal of Reichenbach's is given and criticized; then a new theory (apparently discovered independently by myself, Romane Clark, and Richard Montague and Hans Kamp) is given, in which grammatical modifiers are represented by operators added to a first-order predicate calculus. Finally, some problems concerning applications of adjectives to that-clauses and gerundive-clauses are discussed.
This paper explores the view that there are such things as (nonexistent) fictional objects, and that we refer to such objects when we say things like "Sherlock Holmes is a fictional detective", or "Conan Doyle wrote about Sherlock Holmes". A theory of such objects is developed as a special application of a Meinongian Ontology.
An introductory text in linguistic semantics, uniquely balancing empirical coverage and formalism with development of intuition and methodology.

This introductory textbook in linguistic semantics for undergraduates features a unique balance between empirical coverage and formalism on the one hand and development of intuition and methodology on the other. It will equip students to form intuitions about a set of data, explain how well an analysis of the data accords with their intuitions, and extend the analysis or seek an alternative. No prior knowledge of linguistics is required. After mastering the material, students will be able to tackle some of the most difficult questions in the field even if they have never taken a linguistics course before.

After introducing such concepts as truth conditions and compositionality, the book presents a basic symbolic logic with negation, conjunction, and generalized quantifiers, to serve as the basis for translation throughout the book. It then develops a detailed compositional semantics, covering quantification (scope and binding), adverbial modification, relative clauses, event semantics, tense and aspect, as well as pragmatic phenomena, notably deictic pronouns and narrative progression.

A Course in Semantics offers a large and diverse set of exercises, interspersed throughout the text; those labeled "Important practice and looking ahead" prepare students for material to come; those labeled "Thinking about " invite students to think beyond the content of the book.
Frege held various views about language and its relation to non-linguistic things. These views led him to the paradoxical-sounding conclusion that "the concept horse is NOT a concept." A key assumption that led him to say this is the assumption that phrases beginning with the definite article "the" denote objects, not concepts. In Sections I-III this issue is explained. In Sections IV-V Frege's theory is articulated, and it is shown that he was incorrect in thinking that this theory led to the conclusion that "the concept horse is not a concept." Section VI goes on to show that his strict theory about the functioning of ordinary language is inconsistent. Sections VII-VIII investigate Frege's reasons for thinking that "the concept horse" must denote an object; these reasons are not adequate on Frege's own grounds. Section IX sketches a systematic way to allow such phrases to denote concepts (not objects) within the framework of Frege's main views about language. Section X comments briefly on the consequences of this idea for his logistic program.
Methods of representing sentences containing mass terms (e.g. "gold") and amount terms (e.g. "three gallons") within the predicate calculus are given, and the semantics of the resulting sentences is discussed. The appendix sketches a way to systematically translate English sentences into the logical notation, exploiting some results of transformational grammar.
The truth conditions that Aristotle attributes to the propositions making up the traditional square of opposition have as a consequence that a particular negative proposition such as 'Some A is not B' is vacuously true if there are no As. Although this is a different convention from the modern one, the assumption remained part of centuries of work in logic that was coherent and logically fruitful.
I begin by sketching a theory about the semantics of verbs in event sentences, and the evidence on which that theory is based. In the second section, I discuss the evidence for extending that theory to state sentences, including copulative sentences with adjectives and nouns; the evidence for this extension of the theory is not very good. In the third section, I discuss new evidence based on considerations of talk about time travel; that evidence is apparently quite good. I conclude with a problem about formulating default knowledge.
This paper follows up a suggestion by Paul Vincent Spade that there were two Medieval theories of the modes of personal supposition. I suggest that early work by Sherwood and others was a study of quantifiers: their semantics and the effects of context on inferences that can be made from quantified terms. Later, in the hands of Burley and others, it changed into a study of something else, a study of what I call global quantificational effect. For example, although the quantifier in ‘¬∀xPx’ is universal, it can be seen globally as having an existential effect; this is because the formula containing it is equivalent to ‘∃x¬Px’. The notion of global effect can be explained in terms of the modern theory of normal forms. I suggest that early authors were studying quantifiers, and the terminology of the theory of personal supposition is a classification of kinds of quantifiers. In this theory, to say that a term has distributive supposition is to say, roughly, that it is quantified by a universal quantifying sign. Later authors turned this into a theory of global quantificational effect. In the later theory, to say that a term has distributive supposition is to say that the overall effect is as if the term were universally quantified with a quantifier taking (relatively) wide scope. The difference between these two approaches is illustrated by the fact that the term ‘man’ is classified as having distributive ("universal") supposition in ‘Not every man is running’ in the earlier theory, whereas in the later theory that term does not have distributive supposition; it has determinate ("existential") supposition. In the paper I explain these options, and I argue from several texts that the earlier and later medieval theories actually worked like this. In an appendix I make further efforts to clarify the obscure early accounts, as well as the nineteenth-century "doctrine of distribution". The last section of the paper discusses the "purpose(s)" of supposition theory.
For nearly four centuries Peter of Spain's influential Summaries of Logic was the basis for teaching logic; few university texts were read by more people. This new translation presents the Latin and English on facing pages, and comes with an extensive introduction, chapter-by-chapter analysis, notes, and a full bibliography.
Peter Geach describes the 'doctrine of distribution' as the view that a term is distributed if it refers to everything that it denotes, and undistributed if it refers to only some of the things that it denotes. He argues that the notion, so explained, is incoherent. He claims that the doctrine of distribution originates from a degenerate use of the notion of 'distributive supposition' in medieval supposition theory sometime in the 16th century. This paper proposes instead that the doctrine of distribution occurs at least as early as the 12th century, and that it originates from a study of Aristotle's notion of a term's being 'taken universally', and not from the much later theory of distributive supposition. A detailed version of the doctrine found in the Port Royal Logic is articulated, and compared with a slightly different modern version. Finally, Geach's arguments for the incoherence of the doctrine are discussed and rejected.