Search results for 'grammars'

1000+ found
  1. J. Lambek (2008). Pregroup Grammars and Chomsky's Earliest Examples. Journal of Logic, Language and Information 17 (2):141-160.
    Pregroups are partially ordered monoids in which each element has two “adjoints”. Pregroup grammars provide a computational approach to natural languages by assigning to each word in the mental dictionary a type, namely an element of the pregroup freely generated by a partially ordered set of basic types. In this expository article, the attempt is made to introduce linguists to a pregroup grammar of English by looking at Chomsky’s earliest examples.
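The type assignment and contraction mechanism described in entry 1 can be made concrete with a toy reduction checker. This is an illustrative sketch only, not taken from Lambek's paper: the three-word lexicon, the encoding of adjoints as integer exponents, and the names `lexicon` and `reduce_types` are all my own inventions. A string of words is accepted as a sentence when its concatenated types contract, via a·a^r → 1 and a^l·a → 1, to the single sentence type s.

```python
# Toy sketch of free-pregroup reduction (hypothetical lexicon, for illustration).
# A simple type is a pair (base, z): z = 0 is the basic type, z = -1 its left
# adjoint, z = +1 its right adjoint. Adjacent (a, z)(a, z+1) contracts to 1,
# which covers both a a^r (z = 0) and a^l a (z = -1).

def reduce_types(types):
    """Greedy left-to-right contraction using a stack.

    Adequate for this toy lexicon; a full pregroup parser must also explore
    alternative contraction choices when they overlap.
    """
    stack = []
    for t in types:
        if stack and stack[-1][0] == t[0] and stack[-1][1] + 1 == t[1]:
            stack.pop()          # contraction a^(z) a^(z+1) -> 1
        else:
            stack.append(t)
    return stack

# Hypothetical lexicon: n = noun phrase, s = sentence; "likes" gets n^r s n^l.
lexicon = {
    "John":  [("n", 0)],
    "Mary":  [("n", 0)],
    "likes": [("n", 1), ("s", 0), ("n", -1)],
}

def is_sentence(words):
    types = [t for w in words for t in lexicon[w]]
    return reduce_types(types) == [("s", 0)]

print(is_sentence(["John", "likes", "Mary"]))   # True
print(is_sentence(["likes", "John"]))           # False
```

Here "John likes Mary" types out as n · n^r s n^l · n, which contracts in two steps to s, while the ill-formed "likes John" leaves uncancelled material on the stack.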
  2. Anne Preller & Mehrnoosh Sadrzadeh (2011). Semantic Vector Models and Functional Models for Pregroup Grammars. Journal of Logic, Language and Information 20 (4):419-443.
    We show that vector space semantics and functional semantics in two-sorted first order logic are equivalent for pregroup grammars. We present an algorithm that translates functional expressions to vector expressions and vice-versa. The semantics is compositional, variable free and invariant under change of order or multiplicity. It includes the semantic vector models of Information Retrieval Systems and has an interior logic admitting a comprehension schema. A sentence is true in the interior logic if and only if the ‘usual’ first (...)
  3. Daniel Feinstein & Shuly Wintner (2008). Highly Constrained Unification Grammars. Journal of Logic, Language and Information 17 (3):345-381.
    Unification grammars are widely accepted as an expressive means for describing the structure of natural languages. In general, the recognition problem is undecidable for unification grammars. Even with restricted variants of the formalism, off-line parsable grammars, the problem is computationally hard. We present two natural constraints on unification grammars which limit their expressivity and allow for efficient processing. We first show that non-reentrant unification grammars generate exactly the class of context-free languages. We then relax the (...)
  4. Philippe de Groote & Sylvain Pogodalla (2004). On the Expressive Power of Abstract Categorial Grammars: Representing Context-Free Formalisms. [REVIEW] Journal of Logic, Language and Information 13 (4):421-438.
    We show how to encode context-free string grammars, linear context-free tree grammars, and linear context-free rewriting systems as Abstract Categorial Grammars. These three encodings share the same constructs, the only difference being the interpretation of the composition of the production rules. It is interpreted as a first-order operation in the case of context-free string grammars, as a second-order operation in the case of linear context-free tree grammars, and as a third-order operation in the case of (...)
  5. Peter Beim Graben & Sabrina Gerth (2012). Geometric Representations for Minimalist Grammars. Journal of Logic, Language and Information 21 (4):393-432.
    We reformulate minimalist grammars as partial functions on term algebras for strings and trees. Using filler/role bindings and tensor product representations, we construct homomorphisms for these data structures into geometric vector spaces. We prove that the structure-building functions as well as simple processors for minimalist languages can be realized by piecewise linear operators in representation space. We also propose harmony, i.e. the distance of an intermediate processing step from the final well-formed state in representation space, as a measure of (...)
  6. Audrey Truschke (2012). Defining the Other: An Intellectual History of Sanskrit Lexicons and Grammars of Persian. [REVIEW] Journal of Indian Philosophy 40 (6):635-668.
    From the fourteenth to the eighteenth centuries, Indian intellectuals produced numerous Sanskrit–Persian bilingual lexicons and Sanskrit grammatical accounts of Persian. However, these language analyses have been largely unexplored in modern scholarship. Select works have occasionally been noticed, but the majority of such texts languish unpublished. Furthermore, these works remain untheorized as a sustained, in-depth response on the part of India’s traditional elite to tremendous political and cultural changes. These bilingual grammars and lexicons are one of the few direct, written (...)
  7. Efrat Jaeger, Nissim Francez & Shuly Wintner (2005). Unification Grammars and Off-Line Parsability. Journal of Logic, Language and Information 14 (2):199-234.
    Unification grammars are known to be Turing-equivalent; given a grammar G and a word w, it is undecidable whether w ∈ L(G). In order to ensure decidability, several constraints on grammars, commonly known as off-line parsability (OLP), were suggested, such that the recognition problem is decidable for grammars which satisfy OLP. An open question is whether it is decidable if a given grammar satisfies OLP. In this paper we investigate various definitions of OLP and discuss their interrelations, proving (...)
  8. Aleksandra Kiślak-Malinowska (2012). Extended Pregroup Grammars Applied to Natural Languages. Logic and Logical Philosophy 21 (3):229-252.
    Pregroups and pregroup grammars were introduced by Lambek in 1999 [14] as an algebraic tool for the syntactic analysis of natural languages. The main focus of this paper is on certain extended pregroup grammars, such as pregroups with modalities, product pregroup grammars and tupled pregroup grammars. Their applications to different syntactic structures of natural languages, mainly Polish, are explored here.
  9. Sylvain Salvati (2010). On the Membership Problem for Non-Linear Abstract Categorial Grammars. Journal of Logic, Language and Information 19 (2):163-183.
    In this paper we show that the membership problem for second order non-linear Abstract Categorial Grammars is decidable. A consequence of that result is that Montague-like semantics yield a decidable text generation problem. Furthermore, the proof we propose is based on a new tool, Higher Order Intersection Signatures, which statically grasps dynamic properties of λ-terms and is of interest in its own right.
  10. Anne Preller (2007). Toward Discourse Representation Via Pregroup Grammars. Journal of Logic, Language and Information 16 (2):173-194.
    Every pregroup grammar is shown to be strongly equivalent to one which uses basic types and left and right adjoints of basic types only. Therefore, a semantical interpretation is independent of the order of the associated logic. Lexical entries are read as expressions in a two-sorted predicate logic with ∈ and functional symbols. The parsing of a sentence defines a substitution that combines the expressions associated to the individual words. The resulting variable-free formula is the translation of the (...)
  11. András Kornai (2011). Probabilistic Grammars and Languages. Journal of Logic, Language and Information 20 (3):317-328.
    Using an asymptotic characterization of probabilistic finite state languages over a one-letter alphabet we construct a probabilistic language with regular support that cannot be generated by probabilistic CFGs. Since all probability values used in the example are rational, our work is immune to the criticism leveled by Suppes (Synthese 22:95–116, 1970 ) against the work of Ellis ( 1969 ) who first constructed probabilistic FSLs that admit no probabilistic FSGs. Some implications for probabilistic language modeling by HMMs are discussed.
  12. John Case & Sanjay Jain (2011). Rice and Rice-Shapiro Theorems for Transfinite Correction Grammars. Mathematical Logic Quarterly 57 (5):504-516.
  13. Reinhard Muskens (2010). New Directions in Type-Theoretic Grammars. Journal of Logic, Language and Information 19 (2):129-136.
    This paper argues for the idea that in describing language we should follow Haskell Curry in distinguishing between the structure of an expression and its appearance or manifestation. It is explained how making this distinction obviates the need for directed types in type-theoretic grammars, and a simple grammatical formalism is sketched in which representations at all levels are lambda terms. The lambda term representing the abstract structure of an expression is homomorphically translated to a lambda term representing its (...)
  14. Stephan Kepser & Jim Rogers (2011). The Equivalence of Tree Adjoining Grammars and Monadic Linear Context-Free Tree Grammars. Journal of Logic, Language and Information 20 (3):361-384.
    The equivalence of leaf languages of tree adjoining grammars and monadic linear context-free tree grammars was shown about a decade ago. This paper presents a proof of the strong equivalence of these grammar formalisms. Non-strict tree adjoining grammars and monadic linear context-free tree grammars define the same class of tree languages. We also present a logical characterisation of this tree language class showing that a tree language is a member of this class iff it is the two-dimensional yield (...)
  15. Julia Uddén, Martin Ingvar, Peter Hagoort & Karl M. Petersson (2012). Implicit Acquisition of Grammars With Crossed and Nested Non-Adjacent Dependencies: Investigating the Push-Down Stack Model. Cognitive Science 36 (6):1078-1101.
    A recent hypothesis in empirical brain research on language is that the fundamental difference between animal and human communication systems is captured by the distinction between finite-state and more complex phrase-structure grammars, such as context-free and context-sensitive grammars. However, the relevance of this distinction for the study of language as a neurobiological system has been questioned and it has been suggested that a more relevant and partly analogous distinction is that between non-adjacent and adjacent dependencies. Online memory resources (...)
  16. Makoto Kanazawa (2010). Second-Order Abstract Categorial Grammars as Hyperedge Replacement Grammars. Journal of Logic, Language and Information 19 (2):137-161.
    Second-order abstract categorial grammars (de Groote in Association for computational linguistics, 39th annual meeting and 10th conference of the European chapter, proceedings of the conference, pp. 148–155, 2001) and hyperedge replacement grammars (Bauderon and Courcelle in Math Syst Theory 20:83–127, 1987; Habel and Kreowski in STACS 87: 4th Annual symposium on theoretical aspects of computer science. Lecture notes in computer science, vol 247, Springer, Berlin, pp 207–219, 1987) are two natural ways of generalizing “context-free” grammar formalisms for string (...)
  17. Makoto Kanazawa (1996). Identification in the Limit of Categorial Grammars. Journal of Logic, Language and Information 5 (2):115-155.
    It is proved that for any k, the class of classical categorial grammars that assign at most k types to each symbol in the alphabet is learnable, in the Gold (1967) sense of identification in the limit from positive data. The proof crucially relies on the fact that the concept known as finite elasticity in the inductive inference literature is preserved under the inverse image of a finite-valued relation. The learning algorithm presented here incorporates Buszkowski and Penn's (1990) algorithm (...)
  18. Christian Retoré & Sylvain Salvati (2010). A Faithful Representation of Non-Associative Lambek Grammars in Abstract Categorial Grammars. Journal of Logic, Language and Information 19 (2):185-200.
    This paper solves a natural but still open question: can abstract categorial grammars (ACGs) represent usual categorial grammars? Despite their name and their claim to be a unifying framework, up to now there was no faithful representation of usual categorial grammars in ACGs. This paper shows that Non-Associative Lambek grammars as well as their derivations can be defined using ACGs of order two. To conclude, the outcomes of such a representation are discussed.
  19. Gabriel Citron (2012). Simple Objects of Comparison for Complex Grammars: An Alternative Strand in Wittgenstein's Later Remarks on Religion. Philosophical Investigations 35 (1):18-42.
    The predominant interpretation of Wittgenstein's later remarks on religion takes him to hold that all religious utterances are non-scientific, and to hold that the way to show that religious utterances are non-scientific is to identify and characterise the grammatical rules governing their use. This paper claims that though this does capture one strand of Wittgenstein's later thought on religion, there is an alternative strand of that thought which is quite different and more nuanced. In this alternative strand Wittgenstein stresses that (...)
  20. Wojciech Buszkowski & Gerald Penn (1990). Categorial Grammars Determined From Linguistic Data by Unification. Studia Logica 49 (4):431-454.
    We provide an algorithm for determining a categorial grammar from linguistic data that essentially uses unification of type-schemes assigned to atoms. The algorithm presented here extends an earlier one restricted to rigid categorial grammars, introduced in [4] and [5], by admitting non-rigid outputs. The key innovation is the notion of an optimal unifier, a natural generalization of that of a most general unifier.
  21. Claudia Casadio & Joachim Lambek (2002). A Tale of Four Grammars. Studia Logica 71 (3):315-329.
    In this paper we consider the relations existing between four deductive systems that have been called categorial grammars and have relevant connections with linguistic investigations: the syntactic calculus, bilinear logic, compact bilinear logic and Curry's semantic calculus.
  22. Krasimir Angelov, Björn Bringert & Aarne Ranta (2010). PGF: A Portable Run-Time Format for Type-Theoretical Grammars. [REVIEW] Journal of Logic, Language and Information 19 (2):201-228.
    Portable Grammar Format (PGF) is a core language for type-theoretical grammars. It is the target language to which grammars written in the high-level formalism Grammatical Framework (GF) are compiled. Low-level and simple, PGF is easy to reason about, so that its language-theoretic properties can be established. It is also easy to write interpreters that perform parsing and generation with PGF grammars, and compilers converting PGF to other formats. This paper gives a concise description of PGF, covering syntax, (...)
  23. Nissim Francez & Michael Kaminski (2007). Commutation-Augmented Pregroup Grammars and Mildly Context-Sensitive Languages. Studia Logica 87 (2-3):295-321.
    The paper presents a generalization of pregroups, by which a freely-generated pregroup is augmented with a finite set of commuting inequations, allowing limited commutativity and cancelability. It is shown that grammars based on the commutation-augmented pregroups generate mildly context-sensitive languages. A version of Lambek's switching lemma is established for these pregroups. Polynomial parsability and semilinearity are shown for languages generated by these grammars.
  24. Kenneth R. Paap & Derek Partridge (forthcoming). Recursion Isn't Necessary for Human Language Processing: NEAR (Non-Iterative Explicit Alternatives Rule) Grammars Are Superior. Minds and Machines:1-26.
    Language sciences have long maintained a close and supposedly necessary coupling between the infinite productivity of the human language faculty and recursive grammars. Because of the formal equivalence between recursion and non-recursive iteration, recursion, in the technical sense, is never a necessary component of a generative grammar. Contrary to some assertions, this equivalence extends to both center-embedded relative clauses and hierarchical parse trees. Inspection of language usage suggests that recursive rule components in fact contribute very little, and likely nothing (...)
  25. Wojciech Buszkowski (1988). Gaifman's Theorem on Categorial Grammars Revisited. Studia Logica 47 (1):23-33.
    The equivalence of (classical) categorial grammars and context-free grammars, proved by Gaifman [4], is a very basic result of the theory of formal grammars (an essentially equivalent result is known as the Greibach normal form theorem [1], [14]). We analyse the contents of Gaifman's theorem within the framework of structure and type transformations. We give a new proof of this theorem which relies on the algebra of phrase structures and exhibit a possibility to justify the key construction (...)
  26. Eero Hyvönen (1986). Applying a Logical Interpretation of Semantic Nets and Graph Grammars to Natural Language Parsing and Understanding. Synthese 66 (1):177-190.
    In this paper a logical interpretation of semantic nets and graph grammars is proposed for modelling natural language understanding and creating language understanding computer systems. An example of parsing a Finnish question by graph grammars and inferring the answer to it by a semantic net representation is provided.
  27. Christopher Manning, An O(n³) Agenda-Based Chart Parser for Arbitrary Probabilistic Context-Free Grammars.
    Most PCFG parsing work has used the bottom-up CKY algorithm (Kasami, 1965; Younger, 1967) with Chomsky Normal Form Grammars (Baker, 1979; Je…). …"fundamental rule" in an order-independent manner, such that the same basic algorithm supports top-down and bottom-up parsing, and the parser deals correctly with the difficult cases of left-recursive rules, empty elements, and unary rules, in a natural way.
  28. Mati Pentus (1997). Product-Free Lambek Calculus and Context-Free Grammars. Journal of Symbolic Logic 62 (2):648-660.
    In this paper we prove the Chomsky Conjecture (all languages recognized by the Lambek calculus are context-free) for both the full Lambek calculus and its product-free fragment. For the latter case we present a construction of context-free grammars involving only product-free types.
  29. Yael Sygal & Shuly Wintner (2009). Associative Grammar Combination Operators for Tree-Based Grammars. Journal of Logic, Language and Information 18 (3):293-316.
    Polarized unification grammar (PUG) is a linguistic formalism which uses polarities to better control the way grammar fragments interact. The grammar combination operation of PUG was conjectured to be associative. We show that PUG grammar combination is not associative, and even attaching polarities to objects does not make it order-independent. Moreover, we prove that no non-trivial polarity system exists for which grammar combination is associative. We then redefine the grammar combination operator, moving to the powerset domain, in a way that (...)
  30. Theo Janssen, Gerard Kok & Lambert Meertens (1977). On Restrictions on Transformational Grammars Reducing the Generative Power. Linguistics and Philosophy 1 (1):111-118.
    Various restrictions on transformational grammars have been investigated in order to reduce their generative power from recursively enumerable languages to recursive languages. It will be shown that any restriction on transformational grammars defining a recursively enumerable subset of the set of all transformational grammars is either too weak (in the sense that there does not exist a general decision procedure for all languages generated under such a restriction) or too strong (in the sense that there exists a recursive (...)
  31. Neil Law Malcolm (1999). Grammars Rule O.K. Behavioral and Brain Sciences 22 (4):723-724.
    Colours are not the sorts of thing that are amenable to traditional forms of scientific explanation. To think otherwise is to mistake their ontology and ignore their normativity. The acquisition and use of colour categories is constrained by the logic of colour grammars.
  32. Wojciech Zielonka (1978). A Direct Proof of the Equivalence of Free Categorial Grammars and Simple Phrase Structure Grammars. Studia Logica 37 (1):41-57.
    In [2], Bar-Hillel, Gaifman, and Shamir prove that the simple phrase structure grammars (SPGs) defined by Chomsky are equivalent in a certain sense to Bar-Hillel's bidirectional categorial grammars (BCGs). On the other hand, Cohen [3] proves the equivalence of the latter ones to what he calls free categorial grammars (FCGs). They are closely related to Lambek's syntactic calculus which, in turn, is based on the idea due to Ajdukiewicz [1]. For the reasons which will be discussed in (...)
  33. Denis Béchet, Annie Foret & Isabelle Tellier (2007). Learnability of Pregroup Grammars. Studia Logica 87 (2-3):225-252.
    This paper investigates the learnability by positive examples in the sense of Gold of Pregroup Grammars. In a first part, Pregroup Grammars are presented and a new parsing strategy is proposed. Then, theoretical learnability and non-learnability results for subclasses of Pregroup Grammars are proved. In the last two parts, we focus on learning Pregroup Grammars from a special kind of input called feature-tagged examples. A learning algorithm based on the parsing strategy presented in the first part (...)
  34. Denis Béchet (2007). Parsing Pregroup Grammars and Lambek Calculus Using Partial Composition. Studia Logica 87 (2-3):199-224.
    The paper presents a way to transform pregroup grammars into context-free grammars using functional composition. The same technique can also be used for the proof-nets of multiplicative cyclic linear logic and for Lambek calculus allowing empty premises.
  35. Dan Klein & Christopher D. Manning, An O(n³) Agenda-Based Chart Parser for Arbitrary Probabilistic Context-Free Grammars.
    While O(n³) methods for parsing probabilistic context-free grammars (PCFGs) are well known, a tabular parsing framework for arbitrary PCFGs which allows for bottom-up, top-down, and other parsing strategies, has not yet been provided. This paper presents such an algorithm, and shows its correctness and advantages over prior work. The paper finishes by bringing out the connections between the algorithm and work on hypergraphs, which permits us to extend the presented Viterbi (best parse) algorithm to an inside (total (...)
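Entries 27 and 35 concern O(n³) chart parsing of PCFGs; the baseline they contrast with, bottom-up CKY with Viterbi scores over a Chomsky Normal Form grammar, can be sketched in a few lines. The grammar below is a made-up toy and `cky_viterbi` is my own naming; this is the textbook dynamic program, not the agenda-based algorithm the papers propose.

```python
from collections import defaultdict

# Hypothetical toy CNF PCFG, for illustration only.
# Binary rules A -> B C and lexical rules A -> word, each with a probability.
binary = {
    ("S",  ("NP", "VP")): 1.0,
    ("VP", ("V",  "NP")): 1.0,
}
lexical = {
    ("NP", "she"):  0.5,
    ("NP", "fish"): 0.5,
    ("V",  "eats"): 1.0,
}

def cky_viterbi(words):
    """Probability of the best parse of `words` rooted in S (0.0 if none)."""
    n = len(words)
    best = defaultdict(float)                 # (i, j, A) -> best probability
    for i, w in enumerate(words):             # fill in length-1 spans
        for (A, word), p in lexical.items():
            if word == w:
                best[(i, i + 1, A)] = max(best[(i, i + 1, A)], p)
    for span in range(2, n + 1):              # longer spans, bottom-up
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):         # split point: O(n^3) overall
                for (A, (B, C)), p in binary.items():
                    q = p * best[(i, k, B)] * best[(k, j, C)]
                    if q > best[(i, j, A)]:
                        best[(i, j, A)] = q
    return best[(0, n, "S")]

print(cky_viterbi(["she", "eats", "fish"]))   # 0.25
```

Replacing the maximization over split points and rules with a summation gives the inside (total) probability mentioned in entry 35's abstract.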
  36. José M. Castaño (2004). Global Index Grammars and Descriptive Power. Journal of Logic, Language and Information 13 (4):403-419.
    We review the properties of Global Index Grammars (GIGs), a grammar formalism that uses a stack of indices associated with productions and has restricted context-sensitive power. We show how the control of the derivation is performed and how this impacts the descriptive power of this formalism, both in the string languages and the structural descriptions that GIGs can generate.
  37. Axel Cleeremans, Rules Vs. Statistics in Implicit Learning of Biconditional Grammars.
    A significant part of everyday learning occurs incidentally — a process typically described as implicit learning. A central issue in this domain and others, such as language acquisition, is the extent to which performance depends on the acquisition and deployment of abstract rules. Shanks and colleagues [22], [11] have suggested (1) that discrimination between grammatical and ungrammatical instances of a biconditional grammar requires the acquisition and use of abstract rules, and (2) that training conditions — in particular whether instructions orient (...)
  38. Barbara Dziemidowicz-Gryz (2007). On Learnability of Restricted Classes of Categorial Grammars. Studia Logica 85 (2):153-169.
    In this paper we present learning algorithms for classes of categorial grammars restricted by negative constraints. We modify learning functions of Kanazawa [10] and apply them to these classes of grammars. We also prove the learnability of intersection of the class of minimal grammars with the class of k-valued grammars.
  39. Joyce Friedman & David S. Warren (1978). A Parsing Method for Montague Grammars. Linguistics and Philosophy 2 (3):347-372.
    The main result in this paper is a method for obtaining derivation trees from sentences of certain formal grammars. No parsing algorithm was previously known to exist for these grammars. Applied to Montague's PTQ, the method produces all parses that could correspond to different meanings. The technique directly addresses scope and reference and provides a framework for examining these phenomena. The solution for PTQ is implemented in an efficient and useful computer program.
  40. Gabriel Infante-Lopez & Maarten De Rijke (2006). A Note on the Expressive Power of Probabilistic Context Free Grammars. Journal of Logic, Language and Information 15 (3):219-231.
    We examine the expressive power of probabilistic context free grammars (PCFGs), with a special focus on the use of probabilities as a mechanism for reducing ambiguity by filtering out unwanted parses. Probabilities in PCFGs induce an ordering relation among the set of trees that yield a given input sentence. PCFG parsers return the trees bearing the maximum probability for a given sentence, discarding all other possible trees. This mechanism is naturally viewed as a way of defining a new class (...)
  41. Ruth Kempson (2004). Grammars with Parsing Dynamics: A New Perspective on Alignment. Behavioral and Brain Sciences 27 (2):202-203.
    This commentary argues that dialogue alignment can be explained if parsing-directed grammar formalisms are adopted. With syntax defined as monotonic growth of semantic representations as each word is parsed, alignment between interlocutors is shown to be expected. Hence, grammars can be evaluated according to relative success in characterizing dialogue phenomena.
  42. Bert Timmermans, Rules Vs. Statistics in Implicit Learning of Biconditional Grammars.
    A significant part of everyday learning occurs incidentally — a process typically described as implicit learning. A central issue in this domain and others, such as language acquisition, is the extent to which performance depends on the acquisition and deployment of abstract rules. Shanks and colleagues [22], [11] have suggested (1) that discrimination between grammatical and ungrammatical instances of a biconditional grammar requires the acquisition and use of abstract rules, and (2) that training conditions — in particular whether instructions orient (...)
  43. Jan van Eijck, Sequentially Indexed Grammars.
    This paper defines the grammar class of sequentially indexed grammars. Sequentially indexed grammars are the result of a change in the index stack handling mechanism of indexed grammars [Aho68, Aho69]. Sequentially indexed grammars are different from linear indexed grammars [Gaz88]. Like indexed languages, sequentially indexed languages are a fully abstract language class. Unlike indexed languages, sequentially indexed languages allow polynomial parsing algorithms. We give a polynomial algorithm for parsing with sequentially indexed grammars that is an (...)
  44. Lorenzo Carlucci, John Case & Sanjay Jain (2009). Learning Correction Grammars. Journal of Symbolic Logic 74 (2):489-516.score: 18.0
    We investigate a new paradigm in the context of learning in the limit, namely, learning correction grammars for classes of computably enumerable (c.e.) languages. Knowing a language may feature a representation of it in terms of two grammars: the second grammar is used to make corrections to the first. Such a pair of grammars can be seen as a single description of (or grammar for) the language. We call such grammars correction grammars. Correction (...) capture the observable fact that people do correct their linguistic utterances during their usual linguistic activities. We show that learning correction grammars for classes of c.e. languages in the TxtEx-model (i.e., converging to a single correct correction grammar in the limit) is sometimes more powerful than learning ordinary grammars even in the TxtBc-model (where the learner is allowed to converge to infinitely many syntactically distinct but correct conjectures in the limit). For each n ≥ 0, there is a similar learning advantage, again in learning correction grammars for classes of c.e. languages, but where we compare learning correction grammars that make n + 1 corrections to those that make n corrections. The concept of a correction grammar can be extended into the constructive transfinite, using the idea of counting down from notations for transfinite constructive ordinals. This transfinite extension can also be conceptualized as being about learning Ershov-descriptions for c.e. languages. For u a notation in Kleene's general system (O, <_o) of ordinal notations for constructive ordinals, we introduce the concept of a u-correction grammar, where u is used to bound the number of corrections that the grammar is allowed to make. We prove a general hierarchy result: if u and v are notations for constructive ordinals such that u <_o v, then there are classes of c.e. languages that can be TxtEx-learned by conjecturing v-correction grammars but not by conjecturing u-correction grammars. Surprisingly, we show that, above "ω-many" corrections, it is not possible to strengthen the hierarchy: TxtEx-learning u-correction grammars of classes of c.e. languages, where u is a notation in O for any ordinal, can be simulated by TxtBc-learning w-correction grammars, where w is any notation for the smallest infinite ordinal ω.
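    The core idea of a correction grammar, a first grammar that may over-generate together with a second grammar that enumerates corrections (removals), can be sketched in a few lines. This is an illustrative toy only, not the paper's formalism: c.e. sets are modeled here as decidable predicates over a bounded range, and the name `correction_language` is invented for the example.

    ```python
    # Toy sketch of a 1-correction grammar: a pair of "grammars",
    # where the language described is the set difference between
    # what the first grammar generates and what the second removes.

    def correction_language(first, second, bound):
        """Language described by the pair (first, second), each modeled
        as a predicate over natural numbers below `bound`."""
        return {n for n in range(bound) if first(n) and not second(n)}

    # Example: the first grammar generates all even numbers;
    # the correction grammar removes the multiples of 6.
    evens = lambda n: n % 2 == 0
    corrections = lambda n: n % 6 == 0
    lang = correction_language(evens, corrections, 20)
    # lang == {2, 4, 8, 10, 14, 16}
    ```

    The n-corrections hierarchy in the abstract corresponds to nesting such differences, in the spirit of the Ershov hierarchy.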
  45. Susan Edwards & David Lightfoot (2000). Intact Grammars but Intermittent Access. Behavioral and Brain Sciences 23 (1):31-32.
    Grodzinsky examines Broca's aphasia in terms of some specific grammatical deficits. However, his grammatical models offer no way to characterize the distinctions he observes. Rather than grammatical deficits, his patients seem to have intact grammars but defective modules of parsing and production.
  46. Jennifer B. Wagner, Sharon E. Fox, Helen Tager-Flusberg & Charles A. Nelson (2011). Neural Processing of Repetition and Non-Repetition Grammars in 7- and 9-Month-Old Infants. Frontiers in Psychology 2.
    An essential aspect of infant language development involves the extraction of meaningful information from a continuous stream of auditory input. Studies have identified early abilities to differentiate auditory input along various dimensions, including the presence or absence of structural regularities. In newborn infants, frontal and temporal regions were found to respond differentially to these regularities (Gervain et al., 2008), and in order to examine the development of this abstract rule-learning we presented 7- and 9-month-old infants with syllables containing an ABB (...)
  47. Michael Devitt (2013). The “Linguistic Conception” of Grammars. Filozofia Nauki 2.
    The received Chomskian view is that a grammar is about the language faculty. In contrast to this “psychological conception” of linguistics I have argued in Ignorance of Language for a “linguistic conception”. This paper aims to strengthen the case for this conception. It argues that there is a linguistic reality external to the mind and that it is theoretically interesting to study it. If there is this reality, we have good reason to think that grammars are more or less (...)
  48. Roberto P. Franzosi (2010). Sociology, Narrative, and the Quality Versus Quantity Debate (Goethe Versus Newton): Can Computer-Assisted Story Grammars Help Us Understand the Rise of Italian Fascism (1919–1922)? [REVIEW] Theory and Society 39 (6):593-629.
  49. Gideon Borensztajn, Willem Zuidema & Rens Bod (2009). Children's Grammars Grow More Abstract with Age—Evidence From an Automatic Procedure for Identifying the Productive Units of Language. Topics in Cognitive Science 1 (1):175-188.
  50. Jean‐Rémy Hochmann, Mahan Azadpour & Jacques Mehler (2008). Do Humans Really Learn AnBn Artificial Grammars From Exemplars? Cognitive Science 32 (6):1021-1036.