Already hailed as a masterpiece, Foundations of Language offers a brilliant overhaul of the last thirty-five years of research in generative linguistics and related fields. "Few books really deserve the cliché 'this should be read by every researcher in the field'," writes Steven Pinker, author of The Language Instinct, "but Ray Jackendoff's Foundations of Language does." Foundations of Language offers a radically new understanding of how language, the brain, and perception intermesh. The book renews the promise of early generative linguistics: that language can be a valuable entrée into understanding the human mind and brain. The approach is remarkably interdisciplinary. Behind its innovations is Jackendoff's fundamental proposal that the creativity of language derives from multiple parallel generative systems linked by interface components. This shift in basic architecture makes possible a radical reconception of mental grammar and how it is learned. As a consequence, Jackendoff is able to reintegrate linguistics with philosophy of mind, cognitive and developmental psychology, evolutionary biology, neuroscience, and computational linguistics. Among the major topics treated are language processing, the relation of language to perception, the innateness of language, and the evolution of the language capacity, as well as more standard issues in linguistic theory such as the roles of syntax and the lexicon. In addition, Jackendoff offers a sophisticated theory of semantics that incorporates insights from philosophy of language, logic and formal semantics, lexical semantics of various stripes, cognitive grammar, psycholinguistic and neurolinguistic approaches, and the author's own conceptual semantics.
This book emphasizes the role of semantics as a bridge between the theory of language and the theories of other cognitive capacities such as visual perception...
Semantic Structures is a large-scale study of conceptual structure and its lexical and syntactic expression in English that builds on the theory of Conceptual...
Presenting a landmark in linguistics and cognitive science, Ray Jackendoff proposes a new holistic theory of the relation between the sounds, structure, and meaning of language and their relation to mind and brain. Foundations of Language exhibits the most fundamental new thinking in linguistics since Noam Chomsky's Aspects of the Theory of Syntax in 1965—yet is readable, stylish, and accessible to a wide readership. Along the way it provides new insights on the evolution of language, thought, and communication.
We examine the question of which aspects of language are uniquely human and uniquely linguistic in light of recent suggestions by Hauser, Chomsky, and Fitch that the only such aspect is syntactic recursion, the rest of language being either specific to humans but not to language (e.g. words and concepts) or not specific to humans (e.g. speech perception). We find the hypothesis problematic. It ignores the many aspects of grammar that are not recursive, such as phonology, morphology, case, agreement, and many properties of words. It is inconsistent with the anatomy and neural control of the human vocal tract. And it is weakened by experiments suggesting that speech perception cannot be reduced to primate audition, that word learning cannot be reduced to fact learning, and that at least one gene involved in speech and language was evolutionarily selected in the human lineage but is not specific to recursion. The recursion-only claim, we suggest, is motivated by Chomsky's recent approach to syntax, the Minimalist Program, which de-emphasizes the same aspects of language. The approach, however, is sufficiently problematic that it cannot be used to support claims about evolution. We contest related arguments that language is not an adaptation, namely that it is "perfect," non-redundant, unusable in any partial form, and badly designed for communication.
The science of linguistics is made accessible by the author of Consciousness and the Computational Mind, who demonstrates evidence for an innate Universal Grammar that provides the building blocks for all human languages.
Fundamental to spatial knowledge in all species are the representations underlying object recognition, object search, and navigation through space. But what sets humans apart from other species is our ability to express spatial experience through language. This target article explores the language of objects and places, asking what geometric properties are preserved in the representations underlying object nouns and spatial prepositions in English. Evidence from these two aspects of language suggests there are significant differences in the geometric richness with which objects and places are encoded. When an object is named, detailed geometric properties – principally the object's shape – are represented. In contrast, when an object plays the role of either "figure" or "ground" in a locational expression, only very coarse geometric object properties are represented, primarily the main axes. In addition, the spatial functions encoded by spatial prepositions tend to be nonmetric and relatively coarse, for example, "containment," "contact," "relative distance," and "relative direction." These properties are representative of other languages as well. The striking differences in the way language encodes objects versus places lead us to suggest two explanations: First, there is a tendency for languages to level out geometric detail from both object and place representations. Second, a nonlinguistic disparity between the representations of "what" and "where" underlies how language represents objects and places. The language of objects and places converges with and enriches our understanding of corresponding spatial representations.
A profoundly arresting integration of the faculties of the mind - of how we think, speak, and see the world. Written with an informality that belies the originality of its insights and the radical nature of its conclusions, this is the author's most important book since his groundbreaking Foundations of Language in 2002.
On formal and empirical grounds, the overt form of language cannot be the vehicle that the mind uses for reasoning. Nevertheless, we most frequently experience our thought as "inner speech". It is argued that inner speech aids thought by providing a "handle" for attention, making it possible to pay attention to relational and abstract aspects of thought, and thereby to process them with greater richness. Organisms lacking language have no modality of experience that provides comparable articulation of thought; hence certain kinds of thought very important for human intelligence are simply unavailable to them.
In a continuation of the conversation with Fitch, Chomsky, and Hauser on the evolution of language, we examine their defense of the claim that the uniquely human, language-specific part of the language faculty (the "narrow language faculty") consists only of recursion, and that this part cannot be considered an adaptation to communication. We argue that their characterization of the narrow language faculty is problematic for many reasons, including its dichotomization of cognitive capacities into those that are utterly unique and those that are identical to nonlinguistic or nonhuman capacities, omitting capacities that may have been substantially modified during human evolution. We also question their dichotomy of the current utility versus original function of a trait, which omits traits that are adaptations for current use, and their dichotomy of humans and animals, which conflates similarity due to common function and similarity due to inheritance from a recent common ancestor. We show that recursion, though absent from other animals' communication systems, is found in visual cognition, hence cannot be the sole evolutionary development that granted language to humans. Finally, we note that despite Fitch et al.'s denial, their view of language evolution is tied to Chomsky's conception of language itself, which identifies combinatorial productivity with a core of "narrow syntax." An alternative conception, in which combinatoriality is spread across words and constructions, has both empirical advantages and greater evolutionary plausibility.
This groundbreaking book offers a new and compelling perspective on the structure of human language. The fundamental issue it addresses is the proper balance between syntax and semantics, between structure and derivation, and between rule systems and lexicon. It argues that the balance struck by mainstream generative grammar is wrong. It puts forward a new basis for syntactic theory, drawing on a wide range of frameworks, and charts new directions for research. In the past four decades, theories of syntactic structure have become more abstract, and syntactic derivations have become ever more complex. Peter Culicover and Ray Jackendoff trace this development through the history of contemporary syntactic theory, showing how much it has been driven by theory-internal rather than empirical considerations. They develop an alternative that is responsive to linguistic, cognitive, computational, and biological concerns. At the core of this alternative is the Simpler Syntax Hypothesis: the most explanatory syntactic theory is one that imputes the minimum structure necessary to mediate between phonology and meaning. A consequence of this hypothesis is a far richer mapping between syntax and semantics than is generally assumed. Through concrete analyses of numerous grammatical phenomena, some well studied and some new, the authors demonstrate the empirical and conceptual superiority of the Simpler Syntax approach. Simpler Syntax is addressed to linguists of all persuasions. It will also be of central interest to those concerned with language in psychology, human biology, evolution, computational science, and artificial intelligence.
This article sketches the Parallel Architecture, an approach to the structure of grammar that contrasts with mainstream generative grammar (MGG) in that (a) it treats phonology, syntax, and semantics as independent generative components whose structures are linked by interface rules; (b) it uses a parallel constraint-based formalism that is nondirectional; (c) it treats words and rules alike as pieces of linguistic structure stored in long-term memory.
In addition to providing an account of the empirical facts of language, a theory that aspires to account for language as a biologically based human faculty should seek a graceful integration of linguistic phenomena with what is known about other human cognitive capacities and about the character of brain computation. The present article compares the theoretical stance of biolinguistics (Chomsky 2005, Di Sciullo and Boeckx 2011) with a constraint-based Parallel Architecture approach to the language faculty (Jackendoff 2002, Culicover and Jackendoff 2005). The issues considered include the necessity of redundancy in the lexicon and the rule system, the ubiquity of recursion in cognition, derivational vs. constraint-based formalisms, the relation between lexical items and grammatical rules, the roles of phonology and semantics in the grammar, the combinatorial character of thought in humans and nonhumans, the interfaces between language, thought, and vision, and the possible course of evolution of the language faculty. In each of these areas, the Parallel Architecture offers a superior account both of linguistic facts and of the relation of language to the rest of the mind/brain.
The approach can be characterized at two somewhat independent levels. The first is the overall framework for the theory of meaning, and how this framework is integrated into linguistics, philosophy of language, and cognitive science (section 1). The second is the formal machinery that has been developed to achieve the goals of this framework (sections 2 and 3). The general framework might be realized in terms of other formal approaches, and many aspects of the formal machinery can be empirically motivated within any framework for studying meaning.
The goal of this study is to reintegrate the theory of generative grammar into the cognitive sciences. Generative grammar was right to focus on the child's acquisition of language as its central problem, leading to the hypothesis of an innate Universal Grammar. However, generative grammar was mistaken in assuming that the syntactic component is the sole source of combinatoriality, and that everything else is "interpretive." The proper approach is a parallel architecture, in which phonology, syntax, and semantics are autonomous generative systems linked by interface components. The parallel architecture leads to an integration within linguistics, and to a far better integration with the rest of cognitive neuroscience. It fits naturally into the larger architecture of the mind/brain and permits a properly mentalistic theory of semantics. It results in a view of linguistic performance in which the rules of grammar are directly involved in processing. Finally, it leads to a natural account of the incremental evolution of the language capacity.
There is ample evidence that speakers' linguistic knowledge extends well beyond what can be described in terms of rules of compositional interpretation stated over combinations of single words. We explore a range of multiword constructions (MWCs) to get a handle both on the extent of the phenomenon and on the grammatical constraints that may govern it. We consider idioms of various sorts, collocations, compounds, light verbs, syntactic nuts, and assorted other constructions, as well as morphology. Our conclusion is that MWCs highlight the central role that grammar plays in licensing MWCs in the lexicon and the creation of novel MWCs, and they help to clarify how the lexicon articulates with the rest of the grammar.
While endorsing Evans & Levinson's (E&L's) call for rigorous documentation of variation, we defend the idea of Universal Grammar as a toolkit of language acquisition mechanisms. The authors exaggerate diversity by ignoring the space of conceivable but nonexistent languages, trivializing major design universals, conflating quantitative with qualitative variation, and assuming that the utility of a linguistic feature suffices to explain how children acquire it.
This paper presents a phenomenon of colloquial English that we call Contrastive Reduplication (CR), involving the copying of words and sometimes phrases as in It's tuna salad, not SALAD-salad, or Do you LIKE-HIM-like him? Drawing on a corpus of examples gathered from natural speech, written texts, and television scripts, we show that CR restricts the interpretation of the copied element to a 'real' or prototypical reading. Turning to the structural properties of the construction, we show that CR is unusual among reduplication phenomena in that whole idioms can be copied, object pronouns are often copied (as in the second example above), and inflectional morphology need not be copied. Thus the 'scope' of CR cannot be defined in purely phonological terms; rather, a combination of phonological, morphosyntactic, syntactic, and lexical factors is involved. We develop an analysis within the parallel architecture framework of Jackendoff (1997, 2002), whereby CR is treated as a lexical item with syntactic and semantic content and reduplicative phonology. We then sketch an alternative analysis, based on current assumptions within the Minimalist Program, which involves movement into a focus-like position with both the head and the tail of the resulting chain spelled out.
I want to ask what there is for a theory of the evolution of language to explain, and I want to show how this depends on what you think language is. So, what is language? Everybody recognizes that language is partly culturally dependent: there is a huge variety of disparate languages in the world, passed down through cultural transmission. If that's all there is to language, a theory of the evolution of language has nothing at all to explain. We need only explain the cultural evolution of languages: English, Dutch, Mandarin, Hausa, etc. are products of cultural history. However, most readers of the present volume probably subscribe to the contemporary scientific view of language, which goes beneath the cultural differences among languages. It focuses on individual language users and asks…
English resultative expressions have been a major focus of research on the syntax-semantics interface. We argue in this article that a family of related constructions is required to account for their distribution. We demonstrate that a number of generalizations follow from the semantics of the constructions we posit: the syntactic argument structure of the sentence is predicted by general principles of argument linking; and the aspectual structure of the sentence is determined by the aspectual structure of the constructional subevent, which is in turn predictable from general principles correlating event structure with change, extension, motion, and paths. Finally, the semantics and syntax of resultatives explain the possibilities for temporal relations between the two subevents. While these generalizations clearly exist, there is also a great deal of idiosyncrasy involved in resultatives. Many idiosyncratic instances and small subclasses of the construction must be…
The English NPN construction, exemplified by construction after construction, is productive with five prepositions (by, for, to, after, and upon) with a variety of meanings, including succession, juxtaposition, and comparison; it also has numerous idiomatic cases. This mixture of regularity and idiosyncrasy lends itself to an account in the spirit of construction grammar, in which the…