This is the first book to approach depictive secondary predication - a hot topic in syntax and semantics research - from a crosslinguistic perspective. It maps out all the relevant phenomena and brings together critical surveys and new contributions on their morphosyntactic and semantic properties.
Truth, falsity, and unity -- Sentences, lists, and collections -- Declarative and other kinds of sentence -- Declarative sentences and propositions -- Sentences, propositions, and truth-values -- Sentences, propositions, and unity -- Unity and complexity -- Reference and supposition -- Reference and signification -- Linguistic idealism and empirical realism -- Russell on truth, falsity, and unity (I) : 1903 -- Russell on truth, falsity, and unity (II) : 1910-13 -- Russell on truth, falsity, and unity (III) : 1918 -- Sense, reference, and propositions -- Russellian propositions, Fregean thoughts, and facts -- The location of propositions -- Proper names, concept-expressions, and definite descriptions -- Concept-expressions and Carnapian intensions -- Carnapian intensions and understanding -- Carnapian intensions and Russellian propositions -- Russellian propositions and functionality -- A revised semantic map -- Sentences as referring expressions -- False propositions at the level of reference -- The world's own language -- Signification and supposition revisited -- Frege and Russell on unity -- Saturatedness and unsaturatedness -- The copula as secundum adiacens and as tertium adiacens -- Frege and the copula -- The paradox of the concept horse -- Russell on unity and the paradox -- An unsuccessful attempt to avoid the paradox -- The paradox and the level of language -- Reforming Frege's treatment of concept-expressions -- Concepts and functions -- The reformed Frege : refinements and objections -- Frege, Russell, and the anti-Fregean strategy -- The anti-Fregean strategy : the case of names -- Disquotation and propositional form -- The context principle -- Prabhakara semantics and the related designation theory -- For that is not a word which is not the name of a thing -- The impartial strategy -- Secundum and tertium adiacens, matter and form -- The hierarchy of levels and the syntactic priority thesis -- Fregean and anti-Fregean strategies -- The anti-Fregean strategy and relations (I) -- Interlude: The subject-predicate distinction -- The anti-Fregean strategy and relations (II) -- The reality of relations -- Polyadicity, monadicity, and identity -- The anti-Fregean strategy and Montague grammar -- Fregean and anti-Fregean strategies : further comparison -- Ramsey on the subject-predicate distinction -- Dummett's attack on the anti-Fregean strategy -- Linguistic idealism revisited -- Alternative hierarchies and the context principle -- The linguistic hierarchy and categorial nonsense -- Logical syntax and the context principle -- Proper names, singular terms, and the identity test -- Proper names, Leibniz's law, and the identity of indiscernibles -- The negation asymmetry test -- Dummett's tests for singular termhood -- Discarding the syntactic priority thesis -- Logical predication, logical form, and Bradley's regress -- Names, verbs, and the replacement test -- Analysis and paradox -- Simple, complex, and logical predicates -- The grammatical copula and the logical copula -- Predication in Frege -- Two exegetical problems in Frege -- Inference and the logical predicate -- Unity and the logical predicate -- Bradley's regress and the tradition -- Russell and the general form of the proposition -- Wittgenstein's criticism of Russell -- Logical form in the Tractatus -- Bradley's regress and the unity of the proposition -- The logical copula and theories of meaning -- Reference and the logical copula -- Bradley's regress and the analysis of meaning -- Vicious practical regresses -- Bradley's regress and the solution to the unity problem -- Propositions, sets, sums, and the objects themselves -- Bradley's regress and the infinite -- Vallicella's onto-theology -- A comparison with other innocent regresses -- Truth, falsity, and unity revisited -- Bradley's regress, realism, and states of affairs -- Unity and use -- The unity of sentences and the unity of complex names (I) -- The unity of sentences and the unity of complex names (II) -- Congruence, functionality, and propositional unity -- Davidson on predication -- Epilogue: The limits of language.
In The Dynamics of Meaning, Gennaro Chierchia tackles central issues in dynamic semantics and extends the general framework. Chapter 1 introduces the notion of dynamic semantics and discusses in detail the phenomena that have been used to motivate it, such as "donkey" sentences and adverbs of quantification. The second chapter explores in greater depth the interpretation of indefinites and issues related to presuppositions of uniqueness and the "E-type strategy." In Chapter 3, Chierchia extends the dynamic approach to the domain of syntactic theory, considering a range of empirical problems that includes backwards anaphora, reconstruction effects, and weak crossover. The final chapter develops the formal system of dynamic semantics to deal with central issues of definites and presupposition. Chierchia shows that an approach based on a principled enrichment of the mechanisms dealing with meaning is to be preferred on empirical grounds over approaches that depend on an enrichment of the syntactic apparatus. The Dynamics of Meaning illustrates how seemingly abstract stances on the nature of meaning can have significant and far-reaching linguistic consequences, leading to the detection of new facts and influencing our understanding of the syntax/semantics/pragmatics interface.
During the last thirty years, most linguists and philosophers have assumed that meaning can be represented symbolically and that the mental processing of language involves the manipulation of symbols. Scholars have assembled strong evidence that there must be linguistic representations at several abstract levels--phonological, syntactic, and semantic--and that those representations are related by a describable system of rules. Because meaning is so complex, linguists often posit an equally complex relationship between semantic and other levels of grammar. The Semantics of Syntax is an elegant and powerful analysis of the relationship between syntax and semantics. Noting that meaning is underdetermined by form even in simple cases, Denis Bouchard argues that it is impossible to build knowledge of the world into grammar and still have a describable grammar. He thus proposes simple semantic representations and simple rules to relate linguistic levels. Focusing on a class of French verbs, Bouchard shows how multiple senses can be accounted for by the assumption of a single abstract core meaning along with background information about how objects behave in the world. He demonstrates that this move simplifies the syntax at no cost to the descriptive power of the semantics. In two important final chapters, he examines the consequences of his approach for standard syntactic theories.
This book is concerned with the relationship between semantics and surface structure and in particular with the way in which each is mapped into the other. Jim Miller argues that semantic and syntactic structure require different representations and that semantic structure is far more complex than many analysts realise. He argues further that semantic structure should be based on notions of location and movement. The need for a semantic component of greater complexity is demonstrated by an examination of prepositions, particles, adverbs and verb-prefixes, and is shown to accord with cross-language and historical facts. The volume goes on to consider the sort of rules that are required to map semantic structures onto syntax. Semantics and Syntax tackles fundamental issues and draws together many of the key concepts of traditional grammar and formal linguistics. The general framework for handling syntax, semantics and morphology that it outlines is perhaps a controversial one, but it will be recognized as challenging and original.
Split constructions are widespread in natural languages. The separation of the semantic restriction of a quantifier from that quantifier is a typical example of such a construction. This study addresses the problem that such discontinuous strings exhibit--namely, a number of locality constraints, including intervention effects. These are shown to follow from the interaction of a minimalist syntax with a semantics that directly assigns a model-theoretic interpretation to syntactic logical forms. The approach is shown to have wide empirical coverage and conceptual simplicity. The book will be of interest to scholars and advanced students of syntax and semantics.
Why do speakers of all languages use different grammatical structures under different communicative circumstances to express the same idea? In this comprehensive study, Professor Lambrecht explores the relationship between the structure of sentences and the linguistic and extra-linguistic contexts in which they are used. His analysis is based on the observation that the structure of a sentence reflects a speaker's assumptions about the hearer's state of knowledge and consciousness at the time of the utterance. This relationship between speaker assumptions and formal sentence structure is governed by rules and conventions of grammar, in a component called 'information structure'. Four independent but interrelated categories are analysed: presupposition and assertion, identifiability and activation, topic, and focus.
This book offers a new approach to the analysis of the multiple meanings of English modals, conjunctions, conditionals, and perception verbs. Although such ambiguities cannot easily be accounted for by feature-analyses of word meaning, Eve Sweetser's argument shows that they can be analyzed both readily and systematically. Meaning relationships in general cannot be understood independently of human cognitive structure, including the metaphorical and cultural aspects of that structure. Sweetser shows that both lexical polysemy and pragmatic ambiguity are shaped by our metaphorical folk understanding of epistemic processes and of speech interaction. Similar regularities can be shown to structure the contrast among root, epistemic and speech act uses of modal verbs, multiple uses of conjunctions and conditionals, and certain processes of historical change observed in Indo-European languages. Since polysemy is typically the intermediate step in semantic change, the same regularities observable in polysemy can be extended to an analysis of semantic change.
This book examines the hypothesis of "direct compositionality", which requires that semantic interpretation proceed in tandem with syntactic combination. Although associated with the dominant view in formal semantics of the 1970s and 1980s, the feasibility of direct compositionality remained unsettled, and more recently the discussion as to whether or not this view can be maintained has receded. The syntax-semantics interaction is now often seen as a process in which the syntax builds representations which, at the abstract level of logical form, are sent for interpretation to the semantics component of the language faculty. In the first extended discussion of the hypothesis of direct compositionality for twenty years, this book considers whether its abandonment might have been premature and whether in fact direct compositionality is not after all a simpler and more effective conception of the grammar than the conventional account of the syntax-semantics interface in generative grammar. It contains contributions from both sides of the debate, locates the debate in the setting of a variety of formal theories, and draws on examples from a range of languages and a range of empirical phenomena.
This volume contains twelve chapters on the derivation of and the correlates to verb initial word order. The studies in this volume cover such widely divergent languages as Irish, Welsh, Scots Gaelic, Old Irish, Biblical Hebrew, Jakaltek, Mam, Lummi (Straits Salish), Niuean, Malagasy, Palauan, K'echi', and Zapotec, from a wide variety of theoretical perspectives, including Minimalism, information structure, and sentence processing. The first book to take a crosslinguistic comparative approach to verb initial syntax, this volume provides new data on some old problems and debates and explores some innovative approaches to the derivation of verb initial order.
One of the major arenas for debate within generative grammar is the nature of paradigmatic relations among words. Intervening in key debates at the interface between syntax and semantics, this book examines the relation between structure and meaning, and analyses how it affects the internal properties of words and corresponding syntactic manifestations. Adapting notions from the Evo-Devo project in biology (the idea of 'co-linearity' between structural units and behavioural manifestations) Juan Uriagereka addresses a major puzzle: how words can be both decomposable so as to be acquired by children, and atomic, so that they do not manifest themselves as modular to adults.
This book attempts to marry truth-conditional semantics with cognitive linguistics in the church of computational neuroscience. To this end, it examines the truth-conditional meanings of coordinators, quantifiers, and collective predicates as neurophysiological phenomena that are amenable to a neurocomputational analysis. Drawing inspiration from work on visual processing, and especially the simple/complex cell distinction in early vision (V1), we claim that a similar two-layer architecture is sufficient to learn the truth-conditional meanings of the logical coordinators and logical quantifiers. As a prerequisite, much discussion is given over to what a neurologically plausible representation of the meanings of these items would look like. We eventually settle on a representation in terms of correlation, so that, for instance, the semantic input to the universal operators (e.g. and, all) is represented as maximally correlated, while the semantic input to the universal negative operators (e.g. nor, no) is represented as maximally anticorrelated. On the basis of this representation, the hypothesis can be offered that the function of the logical operators is to extract an invariant feature from natural situations, that of degree of correlation between parts of the situation. This result sets up an elegant formal analogy to recent models of visual processing, which argue that the function of early vision is to reduce the redundancy inherent in natural images. Computational simulations are designed in which the logical operators are learned by associating their phonological form with some degree of correlation in the inputs, so that the overall function of the system is as a simple kind of pattern recognition. Several learning rules are assayed, especially those of the Hebbian sort, which are the ones with the most neurological support. Learning vector quantization (LVQ) is shown to be a perspicuous and efficient means of learning the patterns that are of interest.
We draw a formal parallelism between the initial, competitive layer of LVQ and the simple cell layer in V1, and between the final, linear layer of LVQ and the complex cell layer in V1, in that the initial layers are both selective, while the final layers both generalize. It is also shown how the representations argued for can be used to draw the traditionally recognized inferences arising from coordination and quantification, and why the inference of subalternacy breaks down for collective predicates. Finally, the analogies between early vision and the logical operators allow us to advance the claim of cognitive linguistics that language is not processed by proprietary algorithms, but rather by algorithms that are general to the entire brain. Thus in the debate between objectivist and experiential metaphysics, this book falls squarely into the camp of the latter. Yet it does so by means of a rigorous formal, mathematical, and neurological exposition, in contradiction of the experiential claim that formal analysis has no place in the understanding of cognition. To make our own counter-claim as explicit as possible, we present a sketch of the LVQ structure in terms of mereotopology, in which the initial layer of the network performs topological operations, while the final layer performs mereological operations. The book is meant to be self-contained, in the sense that it does not assume any prior knowledge of any of the many areas that are touched upon. It therefore contains mini-summaries of: biological visual processing, especially the retinocortical and ventral ('what') parvocellular pathways; computational models of neural signaling, in particular the reduction of the Hodgkin-Huxley equations to the connectionist and integrate-and-fire neurons; Hebbian learning rules and the elaboration of learning vector quantization; the linguistic pathway in the left hemisphere; memory and the hippocampus; truth-conditional vs. image-schematic semantics; objectivist vs. experiential metaphysics; and mereotopology. All of the simulations are implemented in MATLAB, and the code is available from the book's website. • The discovery of several algorithmic similarities between vision and semantics. • The support of all of this by means of simulations, and the packaging of all of this in a coherent theoretical framework.
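The LVQ setup this blurb describes can be illustrated in miniature. The following sketch is not the book's MATLAB code; it is a toy LVQ1 example in Python, with invented function names and toy data, showing the idea the blurb relies on: a competitive (selective) step picks the nearest prototype, which is then nudged toward same-labelled inputs and away from others, so that maximally correlated input pairs come to be classified as "all"-like and anticorrelated pairs as "no"-like.

```python
# Toy LVQ1 sketch (illustrative only; names and data are hypothetical).
# Inputs are two-element "situation" vectors; same-sign components stand
# in for correlated input, opposite-sign for anticorrelated input.

def lvq1_train(data, labels, prototypes, proto_labels, lr=0.1, epochs=50):
    """Move the winning prototype toward same-class inputs, away otherwise."""
    for _ in range(epochs):
        for x, y in zip(data, labels):
            # Competitive (selective) step: nearest prototype wins.
            dists = [sum((a - b) ** 2 for a, b in zip(x, p)) for p in prototypes]
            w = dists.index(min(dists))
            # Attract on a label match, repel on a mismatch.
            sign = 1.0 if proto_labels[w] == y else -1.0
            prototypes[w] = [p + sign * lr * (a - p)
                             for a, p in zip(x, prototypes[w])]
    return prototypes

def lvq_classify(x, prototypes, proto_labels):
    """Label an input with the class of its nearest prototype."""
    dists = [sum((a - b) ** 2 for a, b in zip(x, p)) for p in prototypes]
    return proto_labels[dists.index(min(dists))]

# Correlated inputs labelled "all", anticorrelated inputs labelled "no".
data = [[1.0, 1.0], [0.9, 0.8], [1.0, -1.0], [0.8, -0.9]]
labels = ["all", "all", "no", "no"]
protos = [[0.5, 0.5], [0.5, -0.5]]
proto_labels = ["all", "no"]
lvq1_train(data, labels, protos, proto_labels)
print(lvq_classify([0.95, 0.9], protos, proto_labels))  # a correlated pair
```

The two stages mirror the parallelism drawn in the text: the distance competition is the selective step, and the labelling over the winning prototype is the generalizing step.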
This is an introduction to the structure of sentences in human languages. It assumes no prior knowledge of linguistic theory and little familiarity with elementary grammar. It will suit students coming to syntactic theory for the first time, whether as graduates or undergraduates. It will also be useful for those in fields such as computational science, artificial intelligence, or cognitive psychology who need a sound knowledge of current syntactic theory.
Building on the success of the bestselling first edition, the second edition of this textbook provides a comprehensive and accessible introduction to the major issues in Principles and Parameters syntactic theory, including phrase structure, the lexicon, case theory, movement, and locality conditions. Includes new and extended problem sets in every chapter, all of which have been annotated for level and skill type. Features three new chapters on advanced topics, including vP shells and object shells, control, and gapping and ellipsis, and an additional chapter on advanced topics in binding. Offers a brief survey of both Lexical-Functional Grammar and Head-Driven Phrase Structure Grammar. Succeeds in strengthening the reader's foundational knowledge and preparing them for more advanced study. Supported by an instructor's manual and online resources for students and instructors, available at www.blackwellpublishing.com/carnie.
Is native speaker variation in understanding complex sentences due to individual differences in working memory capacity or in syntactic competence? The answer to this question has very important consequences for both theoretical and applied concerns in linguistics and education. This book is distinctive in giving an historical and interdisciplinary perspective on the rule-based and experience-based debate and in supporting an integrated account. In the study reported here, variation was found to be due to differences in syntactic competence, and the author argues that sentence comprehension is a learned skill, displaying many of the general characteristics of cognitive skills. The book will be stimulating reading for psycholinguists, theoretical linguists, applied linguists and educators.
This collection covers the fundamental concepts and analytic tools of generative transformational syntax of the last half century, from Chomsky's Morphophonemics of Modern Hebrew (1951) to the present day. It makes available, in one place, key published material on important areas such as phrase structure, transformations, and conditions on rules and representations. Presenting articles by leading contributors to the field such as Baltin, Bošković, Bresnan, Chomsky, Cinque, Emonds, Freidin, Hale, Higginbotham, Huang, Kayne, Lasnik, McCawley, Pollock, Postal, Reinhart, Rizzi, Ross, Stowell, Torrego, Travis, Vergnaud, and Williams, this fascinating collection also includes a general introduction by the editors and an index, thus providing a comprehensive single reference resource for students and researchers alike.
Negative polarity is one of the more elusive aspects of linguistics and a subject which has been gaining in importance in recent years. Written from within the well-defined theoretical framework of Generalized Quantifiers, the three main areas considered in this study are collocations, polarity items and multiple negations. In this mature piece of research, van der Wouden takes into account not only semantic and syntactic considerations but also, to a large extent, pragmatic ones, illustrating a wide array of linguistic approaches.
Understanding Minimalist Syntax introduces the logic of the Minimalist Program by analyzing well-known descriptive generalizations about long-distance dependencies. It proposes a new theory of how long-distance dependencies are formed, with implications for theories of locality and for the Minimalist Program as a whole. Rich in empirical coverage, which will be welcomed by experts in the field, it is nonetheless accessible enough for students looking for an introduction to the Minimalist Program.
In recent years the idea that an adequate semantics of ordinary language calls for some theory of events has sparked considerable debate among linguists and philosophers. Speaking of Events offers a vivid and up-to-date indication of this debate, with emphasis precisely on the interplay between linguistic applications and philosophical implications. Each chapter has been written expressly for this volume by leading authors in the field, including Nicholas Asher, Pier Marco Bertinetto, Johannes Brandl, Denis Delfitto, Regine Eckardt, James Higginbotham, Alessandro Lenci, Terence Parsons, Alice ter Meulen, and Henk Verkuyl.