The English NPN construction, exemplified by construction after construction, is productive with five prepositions (by, for, to, after, and upon) and carries a variety of meanings, including succession, juxtaposition, and comparison; it also has numerous idiomatic cases. This mixture of regularity and idiosyncrasy lends itself to an account in the spirit of construction grammar, in which the…
English resultative expressions have been a major focus of research on the syntax-semantics interface. We argue in this article that a family of related constructions is required to account for their distribution. We demonstrate that a number of generalizations follow from the semantics of the constructions we posit: the syntactic argument structure of the sentence is predicted by general principles of argument linking; and the aspectual structure of the sentence is determined by the aspectual structure of the constructional subevent, which is in turn predictable from general principles correlating event structure with change, extension, motion, and paths. Finally, the semantics and syntax of resultatives explain the possibilities for temporal relations between the two subevents. While these generalizations clearly exist, there is also a great deal of idiosyncrasy involved in resultatives. Many idiosyncratic instances and small subclasses of the construction must be…
…explored only in the context of (a) the differences between them, and (b) those parallels that are also shared with other cognitive capacities. The two differ in many aspects of structure and function, and, with the exception of the metrical grid, all aspects they share appear to be instances of more general capacities.
In addition to providing an account of the empirical facts of language, a theory that aspires to account for language as a biologically based human faculty should seek a graceful integration of linguistic phenomena with what is known about other human cognitive capacities and about the character of brain computation. The present article compares the theoretical stance of biolinguistics (Chomsky 2005, Di Sciullo and Boeckx 2011) with a constraint-based Parallel Architecture approach to the language faculty (Jackendoff 2002, Culicover and Jackendoff 2005). The issues considered include the necessity of redundancy in the lexicon and the rule system, the ubiquity of recursion in cognition, derivational vs. constraint-based formalisms, the relation between lexical items and grammatical rules, the roles of phonology and semantics in the grammar, the combinatorial character of thought in humans and nonhumans, the interfaces between language, thought, and vision, and the possible course of evolution of the language faculty. In each of these areas, the Parallel Architecture offers a superior account both of linguistic facts and of the relation of language to the rest of the mind/brain.
…language to explain, and I want to show how this depends on what you think language is. So, what is language? Everybody recognizes that language is partly culturally dependent: there is a huge variety of disparate languages in the world, passed down through cultural transmission. If that’s all there is to language, a theory of the evolution of language has nothing at all to explain. We need only explain the cultural evolution of languages: English, Dutch, Mandarin, Hausa, etc. are products of cultural history. However, most readers of the present volume probably subscribe to the contemporary scientific view of language, which goes beneath the cultural differences among languages. It focuses on individual language users and asks…
Any theory of how language is internally organized and how it interacts with other mental capacities must address the fundamental question of how syntactic and lexico-semantic information interact at one central linguistic compositional level, the sentence level. With this general objective in mind, we examine “light verbs”, so called because the main thrust of the semantic relations of the predicate that they denote is found not in the predicate itself, but in the argument structure of the syntactic object that such a predicate licenses. For instance, in the sentence…
The primary goal of modern linguistic theory (at least in the circles I inhabit) is an explanation of the human language capacity and how it enables the child to acquire adult competence in language. Adult competence in turn is understood as the ability (or knowledge) to creatively map between sound and meaning, using a rich combinatorial system – the lexicon and grammar of the language. An adequate theory must satisfy at least three crucial constraints, which I will call the Descriptive Constraint, the Learnability Constraint, and the Evolutionary Constraint.
Article history: accepted 29 August 2006; available online 13 October 2006. This article sketches the Parallel Architecture, an approach to the structure of grammar that contrasts with mainstream generative grammar (MGG) in that (a) it treats phonology, syntax, and semantics as independent generative components whose structures are linked by interface rules; (b) it uses a parallel constraint-based formalism that is nondirectional; (c) it treats words and rules alike as pieces of linguistic structure stored in long-term memory.
This paper presents a phenomenon of colloquial English that we call Contrastive Reduplication (CR), involving the copying of words and sometimes phrases as in It’s tuna salad, not SALAD-salad, or Do you LIKE-HIM-like him? Drawing on a corpus of examples gathered from natural speech, written texts, and television scripts, we show that CR restricts the interpretation of the copied element to a ‘real’ or prototypical reading. Turning to the structural properties of the construction, we show that CR is unusual among reduplication phenomena in that whole idioms can be copied, object pronouns are often copied (as in the second example above), and inflectional morphology need not be copied. Thus the ‘scope’ of CR cannot be defined in purely phonological terms; rather, a combination of phonological, morphosyntactic, syntactic, and lexical factors is involved. We develop an analysis within the parallel architecture framework of Jackendoff (1997, 2002), whereby CR is treated as a lexical item with syntactic and semantic content and reduplicative phonology. We then sketch an alternative analysis, based on current assumptions within the Minimalist Program, which involves movement into a focus-like position with both the head and the tail of the resulting chain spelled out.
The basic premise of the Parallel Architecture (Jackendoff 1997, 2002) is that phonology, syntax, and semantics are independent generative components in language, each with its own primitives and principles of combination. The theory builds on insights about linguistic structure that emerged in the 1970s. First, phonology was demonstrated to have highly articulated structure that cannot be derived directly from syntax: structured units such as syllables and prosodic constituents do not correspond one-to-one with syntactic units. Moreover, phonological structure includes several independent substructures or tiers, each with its own type of generative structure: segmental-syllabic structure, the metrical grid, intonation contour, and (in tone languages) the tone tier. The tiers are correlated with each other by interface rules: principles that establish optimal correspondence between structures of two independent types. Such rules are not derivational. Since these phonological structures cannot be derived from syntactic structures, the connection between syntax and phonology must also be mediated not by derivations, but by a component of interface rules.
Within cognitive science, language is often set apart (as it is in the present volume) from perception, action, learning, memory, concepts, and reasoning. Yet language is intertwined with all of them. Language perception is a kind of perception; language production is a kind of action. Vocabulary and grammar are learned and stored in long-term memory. As novel utterances are perceived or produced, they are built up in working memory. Concepts are most often studied in the context of word meanings; reasoning is most often studied in the context of inferring one sentence from another.
It has become fashionable recently to speak of linguistic inquiry as biolinguistics, an attempt to frame questions of linguistic theory in a biological context. The Minimalist Program (Chomsky 1995, 2001) is of course the most prominent stream of research in this paradigm. However, an alternative stream within the paradigm, the Parallel Architecture, has been developing in my own work over the past 30 years; it includes two important subcomponents, Conceptual Structure and Simpler Syntax (Jackendoff 2002, 2007b; Culicover and Jackendoff 2005). The present article will show how the Parallel Architecture is in many ways a more promising realization of biolinguistic goals than the Minimalist Program, and that it is more conducive to integration with both the rest of linguistic theory and the rest of cognitive science.
…not only the ‘tryer’ but also the ‘drinker’, even though the noun phrase Ozzie is not overtly an argument of the verb drink. …grammar of a language? What are the consequences of these roles for syntactic structure, and why does it matter? We sketch the Simpler Syntax Hypothesis, which holds that…
This paper is more about the questions for a theory of language evolution than about the answers. I’d like to ask what there is for a theory of the evolution of language to explain, and I want to show how this depends on what you think language is.
The approach can be characterized at two somewhat independent levels. The first is the overall framework for the theory of meaning, and how this framework is integrated into linguistics, philosophy of language, and cognitive science (section 1). The second is the formal machinery that has been developed to achieve the goals of this framework (sections 2 and 3). The general framework might be realized in terms of other formal approaches, and many aspects of the formal machinery can be empirically motivated within any framework for studying meaning.
A User's Guide to Thought and Meaning presents a profound and arresting integration of the faculties of the mind – of how we think, speak, and see the world. Ray Jackendoff starts out by looking at languages and what the meanings of words and sentences actually do. He shows that meanings are more adaptive and complicated than they're commonly given credit for, and he is led to some basic questions: How do we perceive and act in the world? How do we talk about it? And how can the collection of neurons in the brain give rise to conscious experience? As it turns out, the organization of language, thought, and perception does not look much like the way we experience things, and only a small part of what the brain does is conscious. Jackendoff concludes that thought and meaning must be almost completely unconscious. What we experience as rational conscious thought – which we prize as setting us apart from the animals – in fact rides on a foundation of unconscious intuition. Rationality amounts to intuition enhanced by language. Written with an informality that belies both the originality of its insights and the radical nature of its conclusions, A User's Guide to Thought and Meaning is the author's most important book since the groundbreaking Foundations of Language in 2002.
In a continuation of the conversation with Fitch, Chomsky, and Hauser on the evolution of language, we examine their defense of the claim that the uniquely human, language-specific part of the language faculty (the “narrow language faculty”) consists only of recursion, and that this part cannot be considered an adaptation to communication. We argue that their characterization of the narrow language faculty is problematic for many reasons, including its dichotomization of cognitive capacities into those that are utterly unique and those that are identical to nonlinguistic or nonhuman capacities, omitting capacities that may have been substantially modified during human evolution. We also question their dichotomy of the current utility versus original function of a trait, which omits traits that are adaptations for current use, and their dichotomy of humans and animals, which conflates similarity due to common function and similarity due to inheritance from a recent common ancestor. We show that recursion, though absent from other animals’ communication systems, is found in visual cognition, hence cannot be the sole evolutionary development that granted language to humans. Finally, we note that despite Fitch et al.’s denial, their view of language evolution is tied to Chomsky’s conception of language itself, which identifies combinatorial productivity with a core of “narrow syntax.” An alternative conception, in which combinatoriality is spread across words and constructions, has both empirical advantages and greater evolutionary plausibility. © 2005 Elsevier B.V. All rights reserved.
We examine the question of which aspects of language are uniquely human and uniquely linguistic in light of recent suggestions by Hauser, Chomsky, and Fitch that the only such aspect is syntactic recursion, the rest of language being either specific to humans but not to language (e.g. words and concepts) or not specific to humans (e.g. speech perception). We find the hypothesis problematic. It ignores the many aspects of grammar that are not recursive, such as phonology, morphology, case, agreement, and many properties of words. It is inconsistent with the anatomy and neural control of the human vocal tract. And it is weakened by experiments suggesting that speech perception cannot be reduced to primate audition, that word learning cannot be reduced to fact learning, and that at least one gene involved in speech and language was evolutionarily selected in the human lineage but is not specific to recursion. The recursion-only claim, we suggest, is motivated by Chomsky’s recent approach to syntax, the Minimalist Program, which de-emphasizes the same aspects of language. The approach, however, is sufficiently problematic that it cannot be used to support claims about evolution. We contest related arguments that language is not an adaptation, namely that it is “perfect,” non-redundant, unusable in any partial form, and badly designed for…
The goal of this study is to reintegrate the theory of generative grammar into the cognitive sciences. Generative grammar was right to focus on the child's acquisition of language as its central problem, leading to the hypothesis of an innate Universal Grammar. However, generative grammar was mistaken in assuming that the syntactic component is the sole source of combinatoriality, and that everything else is “interpretive.” The proper approach is a parallel architecture, in which phonology, syntax, and semantics are autonomous generative systems linked by interface components. The parallel architecture leads to an integration within linguistics, and to a far better integration with the rest of cognitive neuroscience. It fits naturally into the larger architecture of the mind/brain and permits a properly mentalistic theory of semantics. It results in a view of linguistic performance in which the rules of grammar are directly involved in processing. Finally, it leads to a natural account of the incremental evolution of the language capacity. Key Words: evolution of language; generative grammar; parallel architecture; semantics; syntax.
The commentaries show the wide variety of incommensurable viewpoints on language that Foundations of Language attempts to integrate. In order to achieve a more comprehensive framework that preserves genuine insights coming from all sides, everyone will have to give a little.
In asking about the origins of human language, we first have to make clear what the question is. The question is not how languages gradually developed over time into the languages of the world today. Rather, it is how the human species developed over time so that we – and not our closest relatives, the chimpanzees and bonobos – became capable of using language.