This is an introduction to the structure of sentences in human languages. It assumes no prior knowledge of linguistic theory and only a little elementary grammar. It will suit students coming to syntactic theory for the first time, whether as graduates or undergraduates. It will also be useful for those in fields such as computational science, artificial intelligence, or cognitive psychology who need a sound knowledge of current syntactic theory.
W. Labov and T. Labov's findings concerning their child's acquisition of grammar ("Learning the Syntax of Questions", in Recent Advances in the Psychology of Language, R. Campbell & P. Smith, eds., New York: Plenum Press, 1978) are interpreted in terms of the different semantics of why and other wh-questions. Z. Dubiel.
Much of the best contemporary work in the philosophy of language and content makes appeal to the theories developed in generative syntax. In particular, there is a presumption that—at some level and in some way—the structures provided by syntactic theory mesh with or support our conception of content/linguistic meaning as grounded in our first-person understanding of our communicative speech acts. This paper will suggest that there is no such tight fit. Its claim will be that, if recent generative theories are on the right lines, syntactic structure provides both too much and too little to serve as the structural partner for content, at least as that notion is generally understood in philosophy. The paper will substantiate these claims by an assessment of the recent work of King, Stanley, and others.
It is commonly argued that the rules of language, as distinct from its semantic features, are the characteristics which most clearly distinguish language from the communication systems of other species. A number of linguists (e.g., Chomsky 1972, 1980; Pinker 1994) have suggested that the universal features of grammar (UG) are unique human adaptations showing no evolutionary continuities with any other species. However, recent summaries of the substantive features of UG are quite remarkable in the very general nature of the features proposed. While the syntax of any given language can be quite complex, the specific rules vary so much between languages that the truly universal (i.e. innate) aspects of grammar are not complex at all. In fact, these features most closely resemble a set of general descriptions of our richly complex semantic cognition, and not a list of specific rules. General principles of the evolutionary process suggest that syntax is more properly understood as an emergent characteristic of the explosion of semantic complexity that occurred during hominid evolution. It is argued that grammatical rules used in given languages are likely to be simply conventionalized, invented features of language, and not the result of an innate, grammar-specific module. The grammatical and syntactic regularities that are found across languages occur simply because all languages attempt to communicate the same sorts of semantic information.
I critically examine some provocative arguments that John Searle presents in his book The Rediscovery of the Mind to support the claim that the syntactic states of a classical computational system are "observer relative" or "mind dependent" or otherwise less than fully and objectively real. I begin by explaining how this claim differs from Searle's earlier and better-known claim that the physical states of a machine, including the syntactic states, are insufficient to determine its semantics. In contrast, his more recent claim concerns the syntax, in particular, whether a machine actually has symbols to underlie its semantics. I then present and respond to a number of arguments that Searle offers to support this claim, including whether machine symbols are observer relative because the assignment of syntax is arbitrary, or linked to universal realizability, or linked to the sub-personal interpretive acts of a homunculus, or linked to a person's consciousness. I conclude that a realist about the computational model need not be troubled by such arguments. Their key premises need further support.
Building on the success of the bestselling first edition, the second edition of this textbook provides a comprehensive and accessible introduction to the major issues in Principles and Parameters syntactic theory, including phrase structure, the lexicon, case theory, movement, and locality conditions. Includes new and extended problem sets in every chapter, all of which have been annotated for level and skill type. Features three new chapters on advanced topics, including vP shells, object shells, control, gapping and ellipsis, and an additional chapter on advanced topics in binding. Offers a brief survey of both Lexical-Functional Grammar and Head-Driven Phrase Structure Grammar. Succeeds in strengthening the reader's foundational knowledge, and prepares them for more advanced study. Supported by an instructor's manual and online resources for students and instructors, available at www.blackwellpublishing.com/carnie.
Proponents of the language of thought (LOT) thesis are realists when it comes to syntactically structured representations, and must defend their view against instrumentalists, who would claim that syntactic structures may be useful in describing cognition, but have no more causal powers in governing cognition than do the equations of physics in guiding the planets. This paper explores what it will take to provide an argument for LOT that can defend its conclusion from instrumentalism. I illustrate a difficulty in this project by discussing arguments for LOT put forward by Horgan and Tienson. When their evidence is viewed in the light of results in connectionist research, it is hard to see how a realist conception of syntax can be formulated and defended.
Three studies provided evidence that syntax influences intentionality judgments. In Experiment 1, participants made either speeded or unspeeded intentionality judgments about ambiguously intentional subjects or objects. Participants were more likely to judge grammatical subjects as acting intentionally in the speeded condition than in the reflective (unspeeded) condition (thus showing an intentionality bias), but grammatical objects revealed the opposite pattern of results (thus showing an unintentionality bias). In Experiment 2, participants made an intentionality judgment about one of the two actors in a partially symmetric sentence (e.g., "John exchanged products with Susan"). The results revealed a tendency to treat the grammatical subject as acting more intentionally than the grammatical object. In Experiment 3, participants were encouraged to think about the events that such sentences typically refer to, and the tendency was significantly reduced. These results suggest a privileged relationship between language and central theory-of-mind concepts. More specifically, there may be two ways of determining intentionality judgments: (1) an automatic verbal bias to treat grammatical subjects (but not objects) as intentional, and (2) a deeper, more careful consideration of the events typically described by a sentence.
I. Categories and Principles. Introductory Remarks. The value of linguistics as a cognitive science lies largely in its potential for providing insights ...
Turner argues that computer programs must have purposes, that implementation is not a kind of semantics, and that computers might need to understand what they do. I respectfully disagree: Computer programs need not have purposes, implementation is a kind of semantic interpretation, and neither human computers nor computing machines need to understand what they do.
One of the most important discoveries of the last thirty years is the extent to which the pattern of anaphoric interpretations is determined by the geometry of syntactic structure. As our understanding of these phenomena has steadily grown, the theory of syntax has often been driven by discoveries in this domain, and it is no accident that Chomsky's Binding Theory was a centerpiece of the principles and parameters approach of the 1980s. However, what remained accidental in Chomsky's theory, and in most of the theories that have followed it, is the apparently complementary distribution of forms that support anaphora for a given antecedent. This book argues not only that the complementary distribution in question is robust empirically, but that its existence is derived by a competitive theory of anaphora. It is demonstrated in detail that the competitive theory provides a far better explanation of anti-locality, anti-subject orientation and the range of apparently exceptional distributions that have long been problematic for other approaches, such as Chomsky's Binding Theory and the influential predication-based theory of Reinhart and Reuland.
The discrepancy between syntax and semantics is a thorny issue that hinders a better comprehension of the underlying neuronal processes in the human brain. In order to tackle the issue, we first describe a striking correlation between Wittgenstein's Tractatus, which addresses the syntactic relationships between language and world, and Perlovsky's joint language-cognitive computational model, which addresses the semantic relationships between emotions and the "knowledge instinct". Having established a correlation between a purely logical approach to language and computable psychological activities, we aim to find the neural correlates of syntax and semantics in the human brain. Starting from topological arguments, we suggest that the semantic properties of a proposition are processed in higher functional dimensions of the brain than the syntactic ones. In a fully reversible process, the syntactic elements embedded in Broca's area project onto multiple scattered semantic cortical zones. The presence of higher functional dimensions gives rise to the increase in informational content that takes place in semantic expressions. Therefore, diverse features of human language and the cognitive world can be assessed in terms of both the logical armor described by the Tractatus and the neurocomputational techniques at hand. One of our motivations is to build a neuro-computational framework able to provide a feasible explanation of the brain's semantic processing, in preparation for novel computers with nodes built into higher dimensions.
This groundbreaking book offers a new and compelling perspective on the structure of human language. The fundamental issue it addresses is the proper balance between syntax and semantics, between structure and derivation, and between rule systems and lexicon. It argues that the balance struck by mainstream generative grammar is wrong. It puts forward a new basis for syntactic theory, drawing on a wide range of frameworks, and charts new directions for research. In the past four decades, theories of syntactic structure have become more abstract, and syntactic derivations have become ever more complex. Peter Culicover and Ray Jackendoff trace this development through the history of contemporary syntactic theory, showing how much it has been driven by theory-internal rather than empirical considerations. They develop an alternative that is responsive to linguistic, cognitive, computational, and biological concerns. At the core of this alternative is the Simpler Syntax Hypothesis: the most explanatory syntactic theory is one that imputes the minimum structure necessary to mediate between phonology and meaning. A consequence of this hypothesis is a far richer mapping between syntax and semantics than is generally assumed. Through concrete analyses of numerous grammatical phenomena, some well studied and some new, the authors demonstrate the empirical and conceptual superiority of the Simpler Syntax approach. Simpler Syntax is addressed to linguists of all persuasions. It will also be of central interest to those concerned with language in psychology, human biology, evolution, computational science, and artificial intelligence.
Recently, several philosophers of science have proposed what has come to be known as the semantic account of scientific theories. It is presented as an improvement on the positivist account, which is now called the syntactic account of scientific theories. Bas van Fraassen claims that the syntactic account does not give a satisfactory definition of "empirical adequacy" and "empirical equivalence". He contends that his own semantic account does define these notions acceptably, through the concept of "embeddability", a concept which he claims cannot be defined syntactically. Here, I define a syntactic relation which corresponds to the semantic relation of "embeddability". I suggest that the critical differences between the positivist account and van Fraassen's account have nothing to do with the distinction between semantics and syntax.
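For orientation, the semantic notion at issue can be glossed in the standard model-theoretic way (this is a textbook rendering, not the paper's own syntactic reconstruction, which the abstract only announces):

% Hedged gloss of embeddability in the model-theoretic sense.
\[
  \mathcal{A} \hookrightarrow \mathcal{B}
  \;\iff\;
  \exists\, \mathcal{B}' \subseteq \mathcal{B}
  \ \text{such that}\ \mathcal{A} \cong \mathcal{B}' ,
\]
i.e. a structure $\mathcal{A}$ is embeddable in a structure $\mathcal{B}$ just in case $\mathcal{A}$ is isomorphic to a substructure of $\mathcal{B}$. On van Fraassen's account, a theory is empirically adequate just in case the appearances are embeddable in the empirical substructures of some model of the theory.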
P.M.S. Hacker has argued that there are numerous misconceptions in James Conant's account of Wittgenstein's views and of those of Carnap. I discuss only Hacker's treatment of Conant on logical syntax in the Tractatus. I try to show that passages in the Tractatus which Hacker takes to count strongly against Conant's view do no such thing, and that he himself has not explained how he can account for a significant passage which certainly appears to support Conant's reading.
The ‘syntax’ and ‘combinatorics’ of my title are what Curry (1961) referred to as phenogrammatics and tectogrammatics respectively. Tectogrammatics is concerned with the abstract combinatorial structure of the grammar and directly informs semantics, while phenogrammatics deals with concrete operations on syntactic data structures such as trees or strings. In a series of previous papers (Muskens, 2001a; Muskens, 2001b; Muskens, 2003) I have argued for an architecture of the grammar in which finite sequences of lambda terms are the basic data structures, pairs of terms ⟨syntax, semantics⟩ for example. These sequences then combine with the help of simple generalizations of the usual abstraction and application operations. This theory, which I call Lambda Grammars and which is closely related to the independently formulated theory of Abstract Categorial Grammars (de Groote, 2001; de Groote, 2002), is in fact an implementation of Curry's ideas: the level of tectogrammar is encoded by the sequences of lambda terms and their ways of combination, while the syntactic terms in those sequences constitute the phenogrammatical level. In de Groote's formulation of the theory, tectogrammar is the level of abstract terms, while phenogrammar is the level of object terms.
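As a purely illustrative sketch of the pairing idea (not Muskens's or de Groote's own definitions; all names here, Sem, Verb, NP, combine, barks, fido, are hypothetical), a sign can be modeled as a phenogrammatical component plus a semantic lambda term, with signs combining by applying both components in parallel:

-- Hypothetical Haskell sketch of a sign as a ⟨pheno, sem⟩ pair.
data Sem = Var String | Lam String Sem | App Sem Sem
  deriving Show

-- Phenogrammar: an intransitive verb builds a sentence string from its
-- subject string; a noun phrase is just a string.
data Verb = Verb { vPheno :: String -> String, vSem :: Sem }
data NP   = NP   { nPheno :: String,           nSem :: Sem }

-- Tectogrammatical combination: apply the pheno and sem components in parallel.
combine :: Verb -> NP -> (String, Sem)
combine v n = (vPheno v (nPheno n), App (vSem v) (nSem n))

barks :: Verb
barks = Verb (++ " barks") (Lam "x" (App (Var "bark'") (Var "x")))

fido :: NP
fido = NP "Fido" (Var "fido'")

main :: IO ()
main = print (combine barks fido)
-- ("Fido barks", App (Lam "x" (App (Var "bark'") (Var "x"))) (Var "fido'"))

The point of the sketch is only the division of labor: the combination step encodes the tectogrammar, while the string-building functions and the semantic terms live at the pheno and semantic levels respectively.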
Dynamic Syntax (DS) is an action-based grammar formalism which models the process of natural language understanding as monotonic tree growth. This paper presents an introduction to the notions of incrementality, underspecification, and update, drawing on the assumptions made by DS. It lays out the tools of the theoretical framework that are necessary to understand the accounts developed in the other contributions to the Special Issue. It also represents an up-to-date account of the framework, combining developments that have previously remained distributed across a diverse body of literature.
Like '&', '=' is no term; it represents no extrasentential property. It marks an atomic, nonpredicative, declarative structure, sentences true solely by codesignation. Identity (its necessity and total reflexivity, its substitution rule, its metaphysical vacuity) is the objectual face of codesignation. The syntax demands pure reference, without predicative import for the asserted fact. 'Twain is Clemens' is about Twain, but nothing is predicated of him. Its informational value is in its 'metailed' semantic content: the fact of codesignation (that 'Twain' names Clemens) that explains what fact it asserts and why it is necessary. Critiques of concepts of rigidity and elimination of singular terms result.
This collection covers the fundamental concepts and analytic tools of generative transformational syntax of the last half century, from Chomsky's Morphophonemics of Modern Hebrew (1951) to the present day. It makes available, in one place, key published material on important areas such as phrase structure, transformations, and conditions on rules and representations. Presenting articles by leading contributors to the field such as Baltin, Bošković, Bresnan, Chomsky, Cinque, Emonds, Freidin, Hale, Higginbotham, Huang, Kayne, Lasnik, McCawley, Pollock, Postal, Reinhart, Rizzi, Ross, Stowell, Torrego, Travis, Vergnaud, and Williams, this fascinating collection also includes a general introduction by the editors and an index, thus providing a comprehensive single reference resource for students and researchers alike.
The aim of this paper is to provide context for and historical exegesis of Carnap's alleged move from syntax to semantics. The Orthodox Received View states that there was a radical break, while the Unorthodox Received View holds that Carnap's syntactical period already had many significant semantical elements. I will argue that both are partly right and that each contains a kernel of truth: it is true that Carnap's semantical period started after his Logical Syntax of Language, in one sense of semantics. But it is also true that Carnap had already included semantical ideas in LSL, though not in the sense that the URV maintains. This latter sense of semantics is related to what is usually called inferentialism, and by getting a clearer picture of Carnap's original aims, context, and concept usage, we might be in a better position to approach his alleged inferentialism.
The only obligatory temporal expression in English is tense, yet Hans Reichenbach (1947) has argued convincingly that the simplest sentence is understood in terms of three temporal notions. Additional possibilities for a simple sentence are limited: English sentences have one time adverbial each. It is not immediately clear how to resolve these matters, that is, how (if at all) Reichenbach's account can be reconciled with the facts of English. This paper attempts to show that they can be reconciled, and presents an analysis of temporal specification that is based directly on Reichenbach's account. Part I is devoted to a study of the way the three times—speech time, reference time, event time—are realized and interpreted. The relevant syntactic structures and their interaction and interpretation are examined in detail. Part II discusses how a grammar should deal with time specification, and proposes a set of interpretive rules. The study offers an analysis of simple sentences, sentences with complements, and habitual sentences. It is shown that tense and adverbials function differently, depending on the structure in which they appear. The temporal system is relational: the orientation and values of temporal expressions are not fixed, but their relational values are consistent. This consistency allows the statement of principles of interpretation.
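For readers unfamiliar with the three notions, the standard textbook rendering of Reichenbach's configurations (not quoted from the paper itself) uses speech time S, reference time R, and event time E, with "<" marking temporal precedence and "=" simultaneity:

\begin{align*}
\text{Simple past (``saw''):}          &\quad E = R < S \\
\text{Present perfect (``has seen''):} &\quad E < R = S \\
\text{Past perfect (``had seen''):}    &\quad E < R < S \\
\text{Simple future (``will see''):}   &\quad S < R = E
\end{align*}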
Linear Syntax makes a case for a critical reassessment of the widespread view that syntax can be reduced to tree structures. It argues that a crucial part of the description of German clausal syntax should instead be based on concepts that are defined in terms of linear order. By connecting the descriptive tools of modern phrase-structure grammar with traditional descriptive scholarship, Andreas Kathol offers a new perspective on many long-standing problems in syntactic theory.
Language—often said to set human beings apart from other animals—has resisted explanation in terms of evolution. Language has—among others—two fundamental and distinctive features: syntax and the ability to express non-present actions and events. We suggest that the relation between this representation (of non-present action) and syntax can be analyzed as a relation between a function and a structure to fulfill this function. The strategy of the paper is to ask if there is any evidence of pre-linguistic communication that fulfills the function of communicating an absent action. We identify a structural similarity between understanding indexes of past actions of conspecifics (who did what to whom) and one of the simplest and most paradigmatic linguistic syntactic patterns, that of the simple transitive sentence. When a human being infers past events from an index (i.e., a trace, the condition of a conspecific or an animal, a constellation, or an object), the interpreter's comprehension must rely on concepts similar in structure and function to the 'thematic roles' believed to underpin the comprehension of linguistic syntax: in his or her mind the idea of a past action or event emerges along with thematic role-like concepts; in the case of the presentation of, e.g., a hunting trophy, the presenter could be understood to be an agent (subject) and the trophy a patient (direct object), while the past action ('killed') is implied by the condition of the object and its possession by the presenter. We discuss whether both the presentation of a trophy and linguistic syntax might have emerged independently while having the same function (to represent a past action) or whether the presentation of an index of a deed could constitute a precursor of language. Both possibilities shed new light on early, and maybe first, language use.
The relation between linguistics and logic has been discussed in a recent paper by Bar-Hillel, where it is argued that a disregard for work in logical syntax and semantics has caused linguists to limit themselves too narrowly in their inquiries, and to fall into several errors. In particular, Bar-Hillel asserts, they have attempted to derive relations of synonymy and so-called 'rules of transformation', such as the active-passive relation, from distributional studies alone, and they have hesitated to rely on considerations of meaning in linguistic analysis. No one can quarrel with the suggestion that linguists interest themselves in meaning or transformation rules, but the relevance of logical syntax and semantics (at least as we now know them) to this study is very dubious. I think that a closer investigation of the assumptions and concerns of logical syntax and semantics will show that the hope of applying the results which have been achieved in these fields to the solution of linguistic problems is illusory.
This book presents and exemplifies the theory of grammar called Semantic Syntax. The grammar, which offers a syntactic theory closely connected with semantic analyses, is a direct continuation of Generative Semantics; it will re-ignite interest in that framework which flourished and promised so much in the 1960s and 1970s.
Split constructions are widespread in natural languages. The separation of the semantic restriction of a quantifier from that quantifier is a typical example of such a construction. This study addresses the problems that such discontinuous strings exhibit, namely a number of locality constraints, including intervention effects. These are shown to follow from the interaction of a minimalist syntax with a semantics that directly assigns a model-theoretic interpretation to syntactic logical forms. The approach is shown to have wide empirical coverage and conceptual simplicity. The book will be of interest to scholars and advanced students of syntax and semantics.
This book is a collection of key readings on Minimalist Syntax, the most recent, and arguably most important, theoretical development within the Principles and Parameters approach to syntactic theory. It brings together in one volume the key readings on Minimalist Syntax; includes an introduction and overview of the Minimalist Program written by two prominent researchers; and excerpts crucial pieces from the beginning of Minimalism to the most recent work, providing invaluable coverage of the most important topics.
The Marxian thesis about the role of violence in history, as enunciated in Capital, is investigated through an analysis of the Hegelian character of its syntax and of the way Engels develops it; a non-teleological interpretation of the thesis is then defended, one that understands violence as presenting a plurality of forms, a pervasive character, and a heavy materiality.
Available for the first time in 20 years, here is Rudolf Carnap's famous principle of tolerance, by which everyone is free to mix and match the rules of ...
It is widely assumed that the meaning of at least some types of expressions involves more than their reference to objects, and hence that there may be co-referential expressions which differ in meaning. It is also widely assumed that 'syntax does not suffice for semantics', i.e. that we cannot account for the fact that expressions have semantic properties in purely syntactical or computational terms. The main goal of the paper is to argue against a third related assumption, namely that what is responsible for a difference in meaning between co-referential expressions is the computational difference in the cognitive functioning of the expressions. 'Intentional aspects' of expressions (those features which their meanings involve in addition to reference) cannot be syntacticized, since they are individuated not in terms of any cognitive feature, but rather in terms of those properties of the referents through which the expressions refer to them, and cognitive features cannot determine such properties in exactly the same sense as they cannot determine reference.
Following Aristotle (who himself was following Parmenides), philosophers have appealed to the distributional reflexes of expressions in determining their semantic status and, ultimately, the nature of the extra-linguistic world. This methodology has been practiced throughout the history of philosophy; it was clarified and made popular by the likes of Zeno Vendler and J.L. Austin, and is realized today in the toolbox of linguistically minded philosophers. The study of the syntax of natural language was fueled by the belief that there is a conceptually tight connection between the syntax of our language and its semantics, and the belief that there is a similarly tight connection between the semantics of our language and metaphysical facts about the world. We are less confident than our colleagues about the relation syntax has to semantics and metaphysics. In particular, we do not believe that the current status of theoretical syntax (or semantics or metaphysics) provides much support for either of the above two beliefs. We will illustrate our view with a case study regarding the status of complex demonstratives. We will show that a recent and particularly subtle syntactically based argument for the semantic/metaphysical status of complex demonstratives does not in fact establish what semantic category complex demonstratives belong to.
This volume contains twelve chapters on the derivation of, and the correlates to, verb-initial word order. The studies in this volume cover such widely divergent languages as Irish, Welsh, Scots Gaelic, Old Irish, Biblical Hebrew, Jakaltek, Mam, Lummi (Straits Salish), Niuean, Malagasy, Palauan, K'echi', and Zapotec, from a wide variety of theoretical perspectives, including Minimalism, information structure, and sentence processing. The first book to take a crosslinguistic comparative approach to verb-initial syntax, this volume brings new data to some old problems and debates and explores some innovative approaches to the derivation of verb-initial order.