Already hailed as a masterpiece, Foundations of Language offers a brilliant overhaul of the last thirty-five years of research in generative linguistics and related fields. "Few books really deserve the cliché 'this should be read by every researcher in the field'," writes Steven Pinker, author of The Language Instinct, "but Ray Jackendoff's Foundations of Language does." Foundations of Language offers a radically new understanding of how language, the brain, and perception intermesh. The book renews the promise of early generative linguistics: that language can be a valuable entrée into understanding the human mind and brain. The approach is remarkably interdisciplinary. Behind its innovations is Jackendoff's fundamental proposal that the creativity of language derives from multiple parallel generative systems linked by interface components. This shift in basic architecture makes possible a radical reconception of mental grammar and how it is learned. As a consequence, Jackendoff is able to reintegrate linguistics with philosophy of mind, cognitive and developmental psychology, evolutionary biology, neuroscience, and computational linguistics. Among the major topics treated are language processing, the relation of language to perception, the innateness of language, and the evolution of the language capacity, as well as more standard issues in linguistic theory such as the roles of syntax and the lexicon. In addition, Jackendoff offers a sophisticated theory of semantics that incorporates insights from philosophy of language, logic and formal semantics, lexical semantics of various stripes, cognitive grammar, psycholinguistic and neurolinguistic approaches, and the author's own conceptual semantics.
We examine the question of which aspects of language are uniquely human and uniquely linguistic in light of recent suggestions by Hauser, Chomsky, and Fitch that the only such aspect is syntactic recursion, the rest of language being either specific to humans but not to language (e.g. words and concepts) or not specific to humans (e.g. speech perception). We find the hypothesis problematic. It ignores the many aspects of grammar that are not recursive, such as phonology, morphology, case, agreement, and many properties of words. It is inconsistent with the anatomy and neural control of the human vocal tract. And it is weakened by experiments suggesting that speech perception cannot be reduced to primate audition, that word learning cannot be reduced to fact learning, and that at least one gene involved in speech and language was evolutionarily selected in the human lineage but is not specific to recursion. The recursion-only claim, we suggest, is motivated by Chomsky's recent approach to syntax, the Minimalist Program, which de-emphasizes the same aspects of language. The approach, however, is sufficiently problematic that it cannot be used to support claims about evolution. We contest related arguments that language is not an adaptation, namely that it is "perfect," non-redundant, unusable in any partial form, and badly designed for…
In a continuation of the conversation with Fitch, Chomsky, and Hauser on the evolution of language, we examine their defense of the claim that the uniquely human, language-specific part of the language faculty (the "narrow language faculty") consists only of recursion, and that this part cannot be considered an adaptation to communication. We argue that their characterization of the narrow language faculty is problematic for many reasons, including its dichotomization of cognitive capacities into those that are utterly unique and those that are identical to nonlinguistic or nonhuman capacities, omitting capacities that may have been substantially modified during human evolution. We also question their dichotomy of the current utility versus original function of a trait, which omits traits that are adaptations for current use, and their dichotomy of humans and animals, which conflates similarity due to common function and similarity due to inheritance from a recent common ancestor. We show that recursion, though absent from other animals' communication systems, is found in visual cognition, hence cannot be the sole evolutionary development that granted language to humans. Finally, we note that despite Fitch et al.'s denial, their view of language evolution is tied to Chomsky's conception of language itself, which identifies combinatorial productivity with a core of "narrow syntax." An alternative conception, in which combinatoriality is spread across words and constructions, has both empirical advantages and greater evolutionary plausibility.
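The syntactic recursion at issue in the two abstracts above can be made concrete with a toy illustration (not drawn from any of the works listed here; all grammar rules and vocabulary are invented for the example): a minimal context-free grammar in which the NP rule reinvokes its own category via a relative clause, so that noun phrases of unbounded depth can in principle be generated.

```python
import random

# Toy context-free grammar illustrating syntactic recursion:
# the second NP rule contains NP itself, licensing center-embedding
# ("the mouse that the cat chased ran", and so on, without bound).
GRAMMAR = {
    "S":  [["NP", "VP"]],
    "NP": [["Det", "N"], ["Det", "N", "that", "NP", "V"]],  # recursive case
    "VP": [["V"]],
}
LEXICON = {"Det": ["the"], "N": ["cat", "mouse", "dog"], "V": ["chased", "ran", "saw"]}

def generate(symbol, depth=0, max_depth=3):
    """Expand a symbol into a list of words, capping self-embedding depth
    so the (in-principle unbounded) recursion terminates in practice."""
    if symbol in LEXICON:
        return [random.choice(LEXICON[symbol])]
    if symbol not in GRAMMAR:                  # bare words like "that"
        return [symbol]
    rules = GRAMMAR[symbol]
    if depth >= max_depth:                     # force the non-recursive rule
        rules = [r for r in rules if symbol not in r]
    return [w for s in random.choice(rules) for w in generate(s, depth + 1, max_depth)]

print(" ".join(generate("S")))
```

The depth cap is only a practical convenience for the demo; the grammar itself places no bound on embedding, which is exactly the property the recursion-only hypothesis singles out.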
In addition to providing an account of the empirical facts of language, a theory that aspires to account for language as a biologically based human faculty should seek a graceful integration of linguistic phenomena with what is known about other human cognitive capacities and about the character of brain computation. The present article compares the theoretical stance of biolinguistics (Chomsky 2005, Di Sciullo and Boeckx 2011) with a constraint-based Parallel Architecture approach to the language faculty (Jackendoff 2002, Culicover and Jackendoff 2005). The issues considered include the necessity of redundancy in the lexicon and the rule system, the ubiquity of recursion in cognition, derivational vs. constraint-based formalisms, the relation between lexical items and grammatical rules, the roles of phonology and semantics in the grammar, the combinatorial character of thought in humans and nonhumans, the interfaces between language, thought, and vision, and the possible course of evolution of the language faculty. In each of these areas, the Parallel Architecture offers a superior account both of linguistic facts and of the relation of language to the rest of the mind/brain.
This article sketches the Parallel Architecture, an approach to the structure of grammar that contrasts with mainstream generative grammar (MGG) in that (a) it treats phonology, syntax, and semantics as independent generative components whose structures are linked by interface rules; (b) it uses a parallel constraint-based formalism that is nondirectional; (c) it treats words and rules alike as pieces of linguistic structure stored in long-term memory. (Accepted 29 August 2006; available online 13 October 2006.)
On formal and empirical grounds, the overt form of language cannot be the vehicle that the mind uses for reasoning. Nevertheless, we most frequently experience our thought as "inner speech". It is argued that inner speech aids thought by providing a "handle" for attention, making it possible to pay attention to relational and abstract aspects of thought, and thereby to process them with greater richness. Organisms lacking language have no modality of experience that provides comparable articulation of thought; hence certain kinds of thought very important for human intelligence are simply unavailable to them.
While endorsing Evans & Levinson's (E&L's) call for rigorous documentation of variation, we defend the idea of Universal Grammar as a toolkit of language acquisition mechanisms. The authors exaggerate diversity by ignoring the space of conceivable but nonexistent languages, trivializing major design universals, conflating quantitative with qualitative variation, and assuming that the utility of a linguistic feature suffices to explain how children acquire it.
A profoundly arresting integration of the faculties of the mind - of how we think, speak, and see the world. Written with an informality that belies the originality of its insights and the radical nature of its conclusions, this is the author's most important book since his groundbreaking Foundations of Language in 2002.
The goal of this study is to reintegrate the theory of generative grammar into the cognitive sciences. Generative grammar was right to focus on the child's acquisition of language as its central problem, leading to the hypothesis of an innate Universal Grammar. However, generative grammar was mistaken in assuming that the syntactic component is the sole source of combinatoriality, and that everything else is "interpretive." The proper approach is a parallel architecture, in which phonology, syntax, and semantics are autonomous generative systems linked by interface components. The parallel architecture leads to an integration within linguistics, and to a far better integration with the rest of cognitive neuroscience. It fits naturally into the larger architecture of the mind/brain and permits a properly mentalistic theory of semantics. It results in a view of linguistic performance in which the rules of grammar are directly involved in processing. Finally, it leads to a natural account of the incremental evolution of the language capacity. Keywords: evolution of language; generative grammar; parallel architecture; semantics; syntax.
The approach can be characterized at two somewhat independent levels. The first is the overall framework for the theory of meaning, and how this framework is integrated into linguistics, philosophy of language, and cognitive science (section 1). The second is the formal machinery that has been developed to achieve the goals of this framework (sections 2 and 3). The general framework might be realized in terms of other formal approaches, and many aspects of the formal machinery can be empirically motivated within any framework for studying meaning.
The English NPN construction, exemplified by construction after construction, is productive with five prepositions (by, for, to, after, and upon) with a variety of meanings, including succession, juxtaposition, and comparison; it also has numerous idiomatic cases. This mixture of regularity and idiosyncrasy lends itself to an account in the spirit of construction grammar, in which the…
language to explain, and I want to show how this depends on what you think language is. So, what is language? Everybody recognizes that language is partly culturally dependent: there is a huge variety of disparate languages in the world, passed down through cultural transmission. If that's all there is to language, a theory of the evolution of language has nothing at all to explain. We need only explain the cultural evolution of languages: English, Dutch, Mandarin, Hausa, etc. are products of cultural history. However, most readers of the present volume probably subscribe to the contemporary scientific view of language, which goes beneath the cultural differences among languages. It focuses on individual language users and asks…
Presenting a landmark in linguistics and cognitive science, Ray Jackendoff proposes a new holistic theory of the relation between the sounds, structure, and meaning of language and their relation to mind and brain. Foundations of Language exhibits the most fundamental new thinking in linguistics since Noam Chomsky's Aspects of the Theory of Syntax in 1965—yet is readable, stylish, and accessible to a wide readership. Along the way it provides new insights on the evolution of language, thought, and communication.
In asking about the origins of human language, we first have to make clear what the question is. The question is not how languages gradually developed over time into the languages of the world today. Rather, it is how the human species developed over time so that we–and not our closest relatives, the chimpanzees and bonobos–became capable of using language.
There is ample evidence that speakers' linguistic knowledge extends well beyond what can be described in terms of rules of compositional interpretation stated over combinations of single words. We explore a range of multiword constructions (MWCs) to get a handle both on the extent of the phenomenon and on the grammatical constraints that may govern it. We consider idioms of various sorts, collocations, compounds, light verbs, syntactic nuts, and assorted other constructions, as well as morphology. Our conclusion is that MWCs highlight the central role that grammar plays both in licensing MWCs in the lexicon and in the creation of novel MWCs, and they help to clarify how the lexicon articulates with the rest of the grammar.
Any theory of how language is internally organized and how it interacts with other mental capacities must address the fundamental question of how syntactic and lexico-semantic information interact at one central linguistic compositional level, the sentence level. With this general objective in mind, we examine "light verbs", so called because the main thrust of the semantic relations of the predicate that they denote is found not in the predicate itself, but in the argument structure of the syntactic object that such a predicate licenses. For instance, in the sentence…