Search results for 'grammars'

385 found
  1. P. Stanley Peters & R. W. Ritchie (1983). Formational Grammars. In Alex Orenstein & Rafael Stern (eds.), Developments in Semantics. Haven, 2-304.
  2. Philippe de Groote & Sylvain Pogodalla (2004). On the Expressive Power of Abstract Categorial Grammars: Representing Context-Free Formalisms. [REVIEW] Journal of Logic, Language and Information 13 (4):421-438.
    We show how to encode context-free string grammars, linear context-free tree grammars, and linear context-free rewriting systems as Abstract Categorial Grammars. These three encodings share the same constructs, the only difference being the interpretation of the composition of the production rules. It is interpreted as a first-order operation in the case of context-free string grammars, as a second-order operation in the case of linear context-free tree grammars, and as a third-order operation in the case of (...)
    5 citations
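    The abstract above rests on the standard ACG device of treating object-level strings as λ-terms of type o → o, so that concatenation becomes function composition and the composition of production rules can be read as a first-order operation for string grammars. Below is a minimal Python sketch of that strings-as-functions encoding; the helper names and the toy rule are my own illustration, not taken from the paper.

      # Strings as functions (type o -> o): /w/ maps a suffix s to w + s,
      # and concatenation of encoded strings is just function composition.
      def lit(w):
          return lambda s: w + s

      def concat(f, g):
          return lambda s: f(g(s))

      # Toy first-order interpretation of a CFG rule S -> NP VP: the images of
      # NP and VP are concatenated (with a separating space) to give the image of S.
      def rule_S(np, vp):
          return concat(np, concat(lit(" "), vp))

      john, sleeps = lit("John"), lit("sleeps")
      print(rule_S(john, sleeps)(""))   # prints "John sleeps"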
  3. Daniel Feinstein & Shuly Wintner (2008). Highly Constrained Unification Grammars. Journal of Logic, Language and Information 17 (3):345-381.
    Unification grammars are widely accepted as an expressive means for describing the structure of natural languages. In general, the recognition problem is undecidable for unification grammars. Even with restricted variants of the formalism, off-line parsable grammars, the problem is computationally hard. We present two natural constraints on unification grammars which limit their expressivity and allow for efficient processing. We first show that non-reentrant unification grammars generate exactly the class of context-free languages. (...)
    1 citation
  4. Efrat Jaeger, Nissim Francez & Shuly Wintner (2005). Unification Grammars and Off-Line Parsability. Journal of Logic, Language and Information 14 (2):199-234.
    Unification grammars are known to be Turing-equivalent; given a grammar G and a word w, it is undecidable whether w ∈ L(G). In order to ensure decidability, several constraints on grammars, commonly known as off-line parsability (OLP), were suggested, such that the recognition problem is decidable for grammars which satisfy OLP. An open question is whether it is decidable if a given grammar satisfies OLP. In this paper we investigate various definitions of OLP and discuss their interrelations, (...)
    2 citations
  5. Peter Beim Graben & Sabrina Gerth (2012). Geometric Representations for Minimalist Grammars. Journal of Logic, Language and Information 21 (4):393-432.
    We reformulate minimalist grammars as partial functions on term algebras for strings and trees. Using filler/role bindings and tensor product representations, we construct homomorphisms for these data structures into geometric vector spaces. We prove that the structure-building functions as well as simple processors for minimalist languages can be realized by piecewise linear operators in representation space. We also propose harmony, i.e. the distance of an intermediate processing step from the final well-formed state in representation space, as a measure of (...)
    1 citation
  6. Sylvain Salvati (2010). On the Membership Problem for Non-Linear Abstract Categorial Grammars. Journal of Logic, Language and Information 19 (2):163-183.
    In this paper we show that the membership problem for second order non-linear Abstract Categorial Grammars is decidable. A consequence of that result is that Montague-like semantics yield a decidable text generation problem. Furthermore, the proof we propose is based on a new tool, Higher Order Intersection Signatures, which statically grasps dynamic properties of λ-terms and is of interest in its own right.
  7. Anne Preller & Mehrnoosh Sadrzadeh (2011). Semantic Vector Models and Functional Models for Pregroup Grammars. Journal of Logic, Language and Information 20 (4):419-443.
    We show that vector space semantics and functional semantics in two-sorted first order logic are equivalent for pregroup grammars. We present an algorithm that translates functional expressions to vector expressions and vice-versa. The semantics is compositional, variable free and invariant under change of order or multiplicity. It includes the semantic vector models of Information Retrieval Systems and has an interior logic admitting a comprehension schema. A sentence is true in the interior logic if and only if the ‘usual’ first (...)
  8. Audrey Truschke (2012). Defining the Other: An Intellectual History of Sanskrit Lexicons and Grammars of Persian. [REVIEW] Journal of Indian Philosophy 40 (6):635-668.
    From the fourteenth to the eighteenth centuries, Indian intellectuals produced numerous Sanskrit–Persian bilingual lexicons and Sanskrit grammatical accounts of Persian. However, these language analyses have been largely unexplored in modern scholarship. Select works have occasionally been noticed, but the majority of such texts languish unpublished. Furthermore, these works remain untheorized as a sustained, in-depth response on the part of India’s traditional elite to tremendous political and cultural changes. These bilingual grammars and lexicons are one of the few direct, written (...)
  9. J. Lambek (2008). Pregroup Grammars and Chomsky's Earliest Examples. Journal of Logic, Language and Information 17 (2):141-160.
    Pregroups are partially ordered monoids in which each element has two “adjoints”. Pregroup grammars provide a computational approach to natural languages by assigning to each word in the mental dictionary a type, namely an element of the pregroup freely generated by a partially ordered set of basic types. In this expository article, the attempt is made to introduce linguists to a pregroup grammar of English by looking at Chomsky’s earliest examples.
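    To make the reduction mechanism described in the entry above concrete: a simple type can be represented as a pair (basic type, adjoint degree), with degree -1 for a left adjoint and +1 for a right adjoint, and the contraction a^(k) a^(k+1) → 1 drives recognition. The Python sketch below checks whether a type assignment reduces to the sentence type s using contractions only; it ignores any partial order on basic types, and the toy lexicon is merely in the spirit of Lambek's examples, not his exact lexicon.

      from functools import lru_cache

      def contracts(x, y):
          # x immediately followed by y cancels iff they share a basic type and
          # their adjoint degrees are k and k+1 (e.g. a a^r, or a^l a).
          return x[0] == y[0] and y[1] == x[1] + 1

      def reduces_to_unit(types):
          ts = tuple(types)
          @lru_cache(maxsize=None)
          def null(i, j):                      # does ts[i:j] contract to 1?
              if i == j:
                  return True
              return any(contracts(ts[i], ts[k]) and null(i + 1, k) and null(k + 1, j)
                         for k in range(i + 1, j))
          return null(0, len(ts))

      def is_sentence(word_types, target="s"):
          ts = [t for types in word_types for t in types]
          return any(t == (target, 0)
                     and reduces_to_unit(ts[:m]) and reduces_to_unit(ts[m + 1:])
                     for m, t in enumerate(ts))

      # Illustrative type assignments: he -> pi3, likes -> pi3^r s o^l, her -> o.
      lexicon = {"he": [("pi3", 0)],
                 "likes": [("pi3", 1), ("s", 0), ("o", -1)],
                 "her": [("o", 0)]}
      print(is_sentence([lexicon[w] for w in "he likes her".split()]))   # True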
  10. Aleksandra Kiślak-Malinowska (2012). Extended Pregroup Grammars Applied to Natural Languages. Logic and Logical Philosophy 21 (3):229-252.
    Pregroups and pregroup grammars were introduced by Lambek in 1999 [14] as an algebraic tool for the syntactic analysis of natural languages. The main focus of this paper is on certain extended pregroup grammars, such as pregroups with modalities, product pregroup grammars and tupled pregroup grammars. Their applications to different syntactic structures of natural languages, mainly Polish, are explored here.
  11. Anne Preller (2007). Toward Discourse Representation Via Pregroup Grammars. Journal of Logic, Language and Information 16 (2):173-194.
    Every pregroup grammar is shown to be strongly equivalent to one which uses basic types and left and right adjoints of basic types only. Therefore, a semantical interpretation is independent of the order of the associated logic. Lexical entries are read as expressions in a two sorted predicate logic with ∈ and functional symbols. The parsing of a sentence defines a substitution that combines the expressions associated to the individual words. The resulting variable free formula is the translation of the (...)
    4 citations
  12. John Corcoran (1971). Discourse Grammars and the Structure of Mathematical Reasoning III: Two Theories of Proof. Journal of Structural Learning 3 (3):1-24.
    ABSTRACT This part of the series has a dual purpose. In the first place we will discuss two kinds of theories of proof. The first kind will be called a theory of linear proof. The second has been called a theory of suppositional proof. The term "natural deduction" has often and correctly been used to refer to the second kind of theory, but I shall not do so here because many of the theories so-called are not of the second kind--they (...)
  13. András Kornai (2011). Probabilistic Grammars and Languages. Journal of Logic, Language and Information 20 (3):317-328.
    Using an asymptotic characterization of probabilistic finite state languages over a one-letter alphabet we construct a probabilistic language with regular support that cannot be generated by probabilistic CFGs. Since all probability values used in the example are rational, our work is immune to the criticism leveled by Suppes (Synthese 22:95–116, 1970 ) against the work of Ellis ( 1969 ) who first constructed probabilistic FSLs that admit no probabilistic FSGs. Some implications for probabilistic language modeling by HMMs are discussed.
    1 citation
  14. John Case & Sanjay Jain (2011). Rice and Rice-Shapiro Theorems for Transfinite Correction Grammars. Mathematical Logic Quarterly 57 (5):504-516.
    Hay and, then, Johnson extended the classic Rice and Rice-Shapiro Theorems for computably enumerable sets, to analogs for all the higher levels in the finite Ershov Hierarchy. The present paper extends their work to analogs in the transfinite Ershov Hierarchy. Some of the transfinite cases are done for all transfinite notations in Kleene's important system of notations, O. Other cases are done for all transfinite notations in a very natural, proper subsystem of O (...)
  15. John Corcoran (1971). Discourse Grammars and the Structure of Mathematical Reasoning II: The Nature of a Correct Theory of Proof and Its Value. Journal of Structural Learning 3 (2):1-16.
    1971. Discourse Grammars and the Structure of Mathematical Reasoning II: The Nature of a Correct Theory of Proof and Its Value, Journal of Structural Learning 3, #2, 1-16. REPRINTED 1976. Structural Learning II: Issues and Approaches, ed. J. Scandura, Gordon & Breach Science Publishers, New York, MR56#15263. This is the second of a series of three articles dealing with application of linguistics and logic to the study of mathematical reasoning, especially in the setting of a concern for improvement (...)
  16. Denis Béchet, Annie Foret & Isabelle Tellier (2007). Learnability of Pregroup Grammars. Studia Logica 87 (2-3):225-252.
    This paper investigates the learnability by positive examples in the sense of Gold of Pregroup Grammars. In a first part, Pregroup Grammars are presented and a new parsing strategy is proposed. Then, theoretical learnability and non-learnability results for subclasses of Pregroup Grammars are proved. In the last two parts, we focus on learning Pregroup Grammars from a special kind of input called feature-tagged examples. A learning algorithm based on the parsing strategy presented in the first part (...)
    3 citations
  17. Nissim Francez & Michael Kaminski (2007). Commutation-Augmented Pregroup Grammars and Mildly Context-Sensitive Languages. Studia Logica 87 (2-3):295-321.
    The paper presents a generalization of pregroup, by which a freely-generated pregroup is augmented with a finite set of commuting inequations, allowing limited commutativity and cancelability. It is shown that grammars based on the commutation-augmented pregroups generate mildly context-sensitive languages. A version of Lambek’s switching lemma is established for these pregroups. Polynomial parsability and semilinearity are shown for languages generated by these grammars.
    4 citations
  18. Denis Béchet (2007). Parsing Pregroup Grammars and Lambek Calculus Using Partial Composition. Studia Logica 87 (2-3):199-224.
    The paper presents a way to transform pregroup grammars into context-free grammars using functional composition. The same technique can also be used for the proof-nets of multiplicative cyclic linear logic and for Lambek calculus allowing empty premises.
    4 citations
  19. Wojciech Buszkowski & Gerald Penn (1990). Categorial Grammars Determined From Linguistic Data by Unification. Studia Logica 49 (4):431-454.
    We provide an algorithm for determining a categorial grammar from linguistic data that essentially uses unification of type-schemes assigned to atoms. The algorithm presented here extends an earlier one restricted to rigid categorial grammars, introduced in [4] and [5], by admitting non-rigid outputs. The key innovation is the notion of an optimal unifier, a natural generalization of that of a most general unifier.
    8 citations
  20. Gabriel Citron (2012). Simple Objects of Comparison for Complex Grammars: An Alternative Strand in Wittgenstein's Later Remarks on Religion. Philosophical Investigations 35 (1):18-42.
    The predominant interpretation of Wittgenstein's later remarks on religion takes him to hold that all religious utterances are non-scientific, and to hold that the way to show that religious utterances are non-scientific is to identify and characterise the grammatical rules governing their use. This paper claims that though this does capture one strand of Wittgenstein's later thought on religion, there is an alternative strand of that thought which is quite different and more nuanced. In this alternative strand Wittgenstein stresses that (...)
    1 citation
  21. Julia Uddén, Martin Ingvar, Peter Hagoort & Karl M. Petersson (2012). Implicit Acquisition of Grammars With Crossed and Nested Non-Adjacent Dependencies: Investigating the Push-Down Stack Model. Cognitive Science 36 (6):1078-1101.
    A recent hypothesis in empirical brain research on language is that the fundamental difference between animal and human communication systems is captured by the distinction between finite-state and more complex phrase-structure grammars, such as context-free and context-sensitive grammars. However, the relevance of this distinction for the study of language as a neurobiological system has been questioned and it has been suggested that a more relevant and partly analogous distinction is that between non-adjacent and adjacent dependencies. Online memory resources (...)
    1 citation
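    The contrast the entry above turns on can be made concrete with schematic strings: under nested dependencies the b-elements answer the a-elements in mirror order, which a push-down stack handles directly, while under crossed dependencies they repeat the original order. The small Python illustration below uses symbols of my own choosing, not the study's stimuli.

      # Nested: a1 a2 ... ak bk ... b2 b1   (mirror order, context-free)
      # Crossed: a1 a2 ... ak b1 b2 ... bk  (same order, beyond context-free in general)
      def nested(pairs):
          a, b = zip(*pairs)
          return list(a) + list(reversed(b))

      def crossed(pairs):
          a, b = zip(*pairs)
          return list(a) + list(b)

      # A push-down stack accepts exactly the nested pattern: push each a_i and
      # require the b's to arrive in last-in, first-out order.
      def stack_accepts(seq, pairing):
          stack = []
          for sym in seq:
              if sym in pairing:              # an "a" symbol: remember its partner
                  stack.append(pairing[sym])
              elif not stack or stack.pop() != sym:
                  return False
          return not stack

      pairs = [("a1", "b1"), ("a2", "b2"), ("a3", "b3")]
      pairing = {a: b for a, b in pairs}
      print(stack_accepts(nested(pairs), pairing))    # True
      print(stack_accepts(crossed(pairs), pairing))   # False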
  22. Maciej Kandulski (1995). On Commutative and Nonassociative Syntactic Calculi and Categorial Grammars. Mathematical Logic Quarterly 41 (2):217-235.
    Two axiomatizations of the nonassociative and commutative Lambek syntactic calculus are given and their equivalence is proved. The first axiomatization employs Permutation as the only structural rule, the second one, with no Permutation rule, employs only unidirectional types. It is also shown that in the case of the Ajdukiewicz calculus an analogous equivalence is valid only in the case of a restricted set of formulas. Unidirectional axiomatizations are employed in order to establish the generative power of categorial grammars based (...)
  23. Claudia Casadio & Joachim Lambek (2002). A Tale of Four Grammars. Studia Logica 71 (3):315-329.
    In this paper we consider the relations existing between four deductive systems that have been called categorial grammars and have relevant connections with linguistic investigations: the syntactic calculus, bilinear logic, compact bilinear logic and Curry's semantic calculus.
    4 citations
  24. Michael Devitt (2013). The “Linguistic Conception” of Grammars. Filozofia Nauki 2.
    The received Chomskian view is that a grammar is about the language faculty. In contrast to this “psychological conception” of linguistics I have argued in Ignorance of Language for a “linguistic conception”. This paper aims to strengthen the case for this conception. It argues that there is a linguistic reality external to the mind and that it is theoretically interesting to study it. If there is this reality, we have good reason to think that grammars are more or less (...)
     
  25. José M. Castaño (2004). Global Index Grammars and Descriptive Power. Journal of Logic, Language and Information 13 (4):403-419.
    We review the properties of Global Index Grammars (GIGs), a grammar formalism that uses a stack of indices associated with productions and has restricted context-sensitive power. We show how the control of the derivation is performed and how this impacts the descriptive power of this formalism, both in the string languages and in the structural descriptions that GIGs can generate.
    2 citations
  26. Makoto Kanazawa (1996). Identification in the Limit of Categorial Grammars. Journal of Logic, Language and Information 5 (2):115-155.
    It is proved that for any k, the class of classical categorial grammars that assign at most k types to each symbol in the alphabet is learnable, in the Gold (1967) sense of identification in the limit from positive data. The proof crucially relies on the fact that the concept known as finite elasticity in the inductive inference literature is preserved under the inverse image of a finite-valued relation. The learning algorithm presented here incorporates Buszkowski and Penn's (...)
    3 citations
  27. Wojciech Zielonka (1978). A Direct Proof of the Equivalence of Free Categorial Grammars and Simple Phrase Structure Grammars. Studia Logica 37 (1):41-57.
    In [2], Bar-Hillel, Gaifman, and Shamir prove that the simple phrase structure grammars (SPGs) defined by Chomsky are equivalent in a certain sense to Bar-Hillel's bidirectional categorial grammars (BCGs). On the other hand, Cohen [3] proves the equivalence of the latter ones to what he calls free categorial grammars (FCGs). They are closely related to Lambek's syntactic calculus which, in turn, is based on the idea due to Ajdukiewicz [1]. For the reasons which will be discussed in (...)
    5 citations
  28. Mati Pentus (1997). Product-Free Lambek Calculus and Context-Free Grammars. Journal of Symbolic Logic 62 (2):648-660.
    In this paper we prove the Chomsky Conjecture (all languages recognized by the Lambek calculus are context-free) for both the full Lambek calculus and its product-free fragment. For the latter case we present a construction of context-free grammars involving only product-free types.
    2 citations
  29. Wojciech Zielonka (1985). JM Cohen's Claim on Categorial Grammars Remains Unproved. Bulletin of the Section of Logic 14 (4):130-133.
    Joel M. Cohen (pp. 475-484) claims that Lambek's categorial grammars are equivalent in a certain natural sense to those of Bar-Hillel, Gaifman, and Shamir. Unfortunately, it turns out that Cohen's proof is based on a false lemma. Thus the equivalence of both kinds of grammars is still an open problem, although there is much evidence in its favor. This paper yields a counterexample to Cohen's lemma.
  30. Lorenzo Carlucci, John Case & Sanjay Jain (2009). Learning Correction Grammars. Journal of Symbolic Logic 74 (2):489-516.
    We investigate a new paradigm in the context of learning in the limit, namely, learning correction grammars for classes of computably enumerable (c.e.) languages. Knowing a language may feature a representation of it in terms of two grammars. The second grammar is used to make corrections to the first grammar. Such a pair of grammars can be seen as a single description of (or grammar for) the language. We call such grammars correction grammars. Correction grammars capture the observable fact that people do correct their linguistic utterances during their usual linguistic activities. We show that learning correction grammars for classes of c.e. languages in the TxtEx-model (i.e., converging to a single correct correction grammar in the limit) is sometimes more powerful than learning ordinary grammars even in the TxtBc-model (where the learner is allowed to converge to infinitely many syntactically distinct but correct conjectures in the limit). For each n ≥ 0, there is a similar learning advantage, again in learning correction grammars for classes of c.e. languages, but where we compare learning correction grammars that make n + 1 corrections to those that make n corrections. The concept of a correction grammar can be extended into the constructive transfinite, using the idea of counting down from notations for transfinite constructive ordinals. This transfinite extension can also be conceptualized as being about learning Ershov-descriptions for c.e. languages. For u a notation in Kleene's general system (O, <_o) of ordinal notations for constructive ordinals, we introduce the concept of a u-correction grammar, where u is used to bound the number of corrections that the grammar is allowed to make. We prove a general hierarchy result: if u and v are notations for constructive ordinals such that u <_o v, then there are classes of c.e. languages that can be TxtEx-learned by conjecturing v-correction grammars but not by conjecturing u-correction grammars. Surprisingly, we show that, above "ω-many" corrections, it is not possible to strengthen the hierarchy: TxtEx-learning u-correction grammars of classes of c.e. languages, where u is a notation in O for any ordinal, can be simulated by TxtBc-learning ω-correction grammars, where ω is any notation for the smallest infinite ordinal ω.
    1 citation
  31. Joyce Friedman & David S. Warren (1978). A Parsing Method for Montague Grammars. Linguistics and Philosophy 2 (3):347-372.
    The main result in this paper is a method for obtaining derivation trees from sentences of certain formal grammars. No parsing algorithm was previously known to exist for these grammars. Applied to Montague's PTQ, the method produces all parses that could correspond to different meanings. The technique directly addresses scope and reference and provides a framework for examining these phenomena. The solution for PTQ is implemented in an efficient and useful computer program.
    3 citations
  32. Wojciech Zielonka (1976). On the Equivalence of Ajdukiewicz-Lambek Calculus and Simple Phrase Structure Grammars. Bulletin of the Section of Logic 5 (2):1-4.
    In [2], Bar-Hillel, Gaifman, and Shamir prove that the simple phrase structure grammars defined by Chomsky are equivalent in a certain sense to Bar-Hillel's bidirectional categorial grammars. On the other hand, Cohen [3] proves the equivalence of the latter ones to what he calls free categorial grammars. They are closely related to Lambek's syntactic calculus which is, in turn, based on the idea due to Ajdukiewicz [1]. For some reasons, Cohen's proof seems to be (...)
  33. Reinhard Muskens (2010). New Directions in Type-Theoretic Grammars. Journal of Logic, Language and Information 19 (2):129-136.
    This paper argues for the idea that in describing language we should follow Haskell Curry in distinguishing between the structure of an expression and its appearance or manifestation. It is explained how making this distinction obviates the need for directed types in type-theoretic grammars, and a simple grammatical formalism is sketched in which representations at all levels are lambda terms. The lambda term representing the abstract structure of an expression is homomorphically translated to a lambda term representing its (...)
  34. Wojciech Buszkowski (1986). Strong Generative Capacity of Classical Categorial Grammars. Bulletin of the Section of Logic 15 (2):60-63.
    Classical categorial grammars are the grammars introduced by Ajdukiewicz [1] and formalized by Bar-Hillel [2], Bar-Hillel et al. [3]. In [3] there is proved the weak equivalence of CCG’s and context-free grammars [6]. In this note we characterize the strong generative capacity of finite and rigid CCG’s, i.e. their capacity of structure generation. These results are more completely discussed in [4], [5].
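    For readers who have not seen one in action, a classical (Ajdukiewicz/Bar-Hillel) categorial grammar uses only two rules: forward application A/B, B ⇒ A and backward application B, B\A ⇒ A. The CYK-style recognizer below is a minimal Python sketch with a toy lexicon of my own; it illustrates structure generation by such a grammar and is not the construction used in the note above.

      # Categories are atoms ("s", "np", ...) or ("/", A, B) for A/B and ("\\", B, A) for B\A.
      def combine(x, y):
          """Results of combining adjacent categories x y by application."""
          out = []
          if isinstance(x, tuple) and x[0] == "/" and x[2] == y:    # A/B  B  => A
              out.append(x[1])
          if isinstance(y, tuple) and y[0] == "\\" and y[1] == x:   # B  B\A => A
              out.append(y[2])
          return out

      def recognizes(cats, goal="s"):
          n = len(cats)
          chart = [[set() for _ in range(n + 1)] for _ in range(n + 1)]
          for i, c in enumerate(cats):
              chart[i][i + 1].add(c)
          for span in range(2, n + 1):
              for i in range(n - span + 1):
                  j = i + span
                  for k in range(i + 1, j):
                      for x in chart[i][k]:
                          for y in chart[k][j]:
                              chart[i][j].update(combine(x, y))
          return goal in chart[0][n]

      lexicon = {"John": "np", "Mary": "np",
                 "loves": ("/", ("\\", "np", "s"), "np")}   # (np\s)/np
      print(recognizes([lexicon[w] for w in "John loves Mary".split()]))   # True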
  35. Makoto Kanazawa (2010). Second-Order Abstract Categorial Grammars as Hyperedge Replacement Grammars. Journal of Logic, Language and Information 19 (2):137-161.
    Second-order abstract categorial grammars (de Groote in Association for computational linguistics, 39th annual meeting and 10th conference of the European chapter, proceedings of the conference, pp. 148–155, 2001) and hyperedge replacement grammars (Bauderon and Courcelle in Math Syst Theory 20:83–127, 1987; Habel and Kreowski in STACS 87: 4th Annual symposium on theoretical aspects of computer science. Lecture notes in computer science, vol 247, Springer, Berlin, pp 207–219, 1987) are two natural ways of generalizing “context-free” grammar formalisms for string (...)
  36. Axel Cleeremans, Rules Vs. Statistics in Implicit Learning of Biconditional Grammars.
    A significant part of everyday learning occurs incidentally — a process typically described as implicit learning. A central issue in this domain and others, such as language acquisition, is the extent to which performance depends on the acquisition and deployment of abstract rules. Shanks and colleagues [22], [11] have suggested (1) that discrimination between grammatical and ungrammatical instances of a biconditional grammar requires the acquisition and use of abstract rules, and (2) that training conditions — in particular whether instructions orient (...)
  37. Wojciech Buszkowski (1988). Gaifman's Theorem on Categorial Grammars Revisited. Studia Logica 47 (1):23-33.
    The equivalence of (classical) categorial grammars and context-free grammars, proved by Gaifman [4], is a very basic result of the theory of formal grammars (an essentially equivalent result is known as the Greibach normal form theorem [1], [14]). We analyse the contents of Gaifman's theorem within the framework of structure and type transformations. We give a new proof of this theorem which relies on the algebra of phrase structures and exhibit a possibility to justify the key construction (...)
    1 citation
  38. Kenneth R. Paap & Derek Partridge (2014). Recursion Isn't Necessary for Human Language Processing: NEAR Grammars Are Superior. Minds and Machines 24 (4):389-414.
    Language sciences have long maintained a close and supposedly necessary coupling between the infinite productivity of the human language faculty and recursive grammars. Because of the formal equivalence between recursion and non-recursive iteration, recursion in the technical sense is never a necessary component of a generative grammar. Contrary to some assertions, this equivalence extends to both center-embedded relative clauses and hierarchical parse trees. Inspection of language usage suggests that recursive rule components in fact contribute very little, and likely nothing (...)
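    The formal equivalence invoked in the entry above is easy to exhibit on the schematic center-embedding pattern a^n b^n: a literally recursive rule and a recursion-free loop generate exactly the same strings. The short example below is my own, not the authors'.

      def center_embed_recursive(n):
          """S -> a S b | empty, implemented with literal recursion."""
          if n == 0:
              return []
          return ["a"] + center_embed_recursive(n - 1) + ["b"]

      def center_embed_iterative(n):
          """The same strings without recursion: a counter does the bookkeeping."""
          return ["a"] * n + ["b"] * n

      assert center_embed_recursive(3) == center_embed_iterative(3) \
             == ["a", "a", "a", "b", "b", "b"]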
  39. Christian Retoré & Sylvain Salvati (2010). A Faithful Representation of Non-Associative Lambek Grammars in Abstract Categorial Grammars. Journal of Logic, Language and Information 19 (2):185-200.
    This paper solves a natural but still open question: can abstract categorial grammars (ACGs) represent usual categorial grammars? Despite their name and their claim to be a unifying framework, up to now there was no faithful representation of usual categorial grammars in ACGs. This paper shows that Non-Associative Lambek grammars as well as their derivations can be defined using ACGs of order two. To conclude, the outcome of such a representation is discussed.
  40. Jan van Eijck, Sequentially Indexed Grammars.
    This paper defines the grammar class of sequentially indexed grammars. Sequentially indexed grammars are the result of a change in the index stack handling mechanism of indexed grammars [Aho68, Aho69]. Sequentially indexed grammars are different from linear indexed grammars [Gaz88]. Like indexed languages, sequentially indexed languages are a fully abstract language class. Unlike indexed languages, sequentially indexed languages allow polynomial parsing algorithms. We give a polynomial algorithm for parsing with sequentially indexed grammars that is an (...)
  41. Eero Hyvönen (1986). Applying a Logical Interpretation of Semantic Nets and Graph Grammars to Natural Language Parsing and Understanding. Synthese 66 (1):177-190.
    In this paper a logical interpretation of semantic nets and graph grammars is proposed for modelling natural language understanding and creating language understanding computer systems. An example of parsing a Finnish question by graph grammars and inferring the answer to it by a semantic net representation is provided.
  42. Stephan Kepser & Jim Rogers (2011). The Equivalence of Tree Adjoining Grammars and Monadic Linear Context-Free Tree Grammars. Journal of Logic, Language and Information 20 (3):361-384.
    The equivalence of leaf languages of tree adjoining grammars and monadic linear context-free tree grammars was shown about a decade ago. This paper presents a proof of the strong equivalence of these grammar formalisms. Non-strict tree adjoining grammars and monadic linear context-free tree grammars define the same class of tree languages. We also present a logical characterisation of this tree language class showing that a tree language is a member of this class iff it is the two-dimensional yield (...)
  43. Christopher Manning, An O(n³) Agenda-Based Chart Parser for Arbitrary Probabilistic Context-Free Grammars.
    Most PCFG parsing work has used the bottom-up CKY algorithm (Kasami, 1965; Younger, 1967) with Chomsky Normal Form grammars (Baker, 1979; ...). The parser presented here applies the "fundamental rule" in an order-independent manner, such that the same basic algorithm supports top-down and bottom-up parsing, and the parser deals correctly with the difficult cases of left-recursive rules, empty elements, and unary rules, in a natural way.
  44. Dan Klein & Christopher D. Manning, An O(n³) Agenda-Based Chart Parser for Arbitrary Probabilistic Context-Free Grammars.
    While O(n³) methods for parsing probabilistic context-free grammars (PCFGs) are well known, a tabular parsing framework for arbitrary PCFGs which allows for bottom-up, top-down, and other parsing strategies has not yet been provided. This paper presents such an algorithm, and shows its correctness and advantages over prior work. The paper finishes by bringing out the connections between the algorithm and work on hypergraphs, which permits us to extend the presented Viterbi (best parse) algorithm to an inside (total (...)
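    For contrast with the agenda-based parser described above, here is the bottom-up baseline the abstract mentions: a Viterbi-CKY recognizer for a PCFG in Chomsky Normal Form. It is a generic textbook sketch with a toy grammar and made-up probabilities, not the algorithm of the paper.

      from collections import defaultdict

      binary = {                      # A -> B C : probability
          ("S", ("NP", "VP")): 1.0,
          ("VP", ("V", "NP")): 1.0,
          ("NP", ("Det", "N")): 0.6,
      }
      lexical = {                     # A -> word : probability
          ("NP", "John"): 0.3, ("NP", "Mary"): 0.1,
          ("V", "saw"): 1.0,
          ("Det", "the"): 1.0, ("N", "dog"): 1.0,
      }

      def viterbi_cky(words, start="S"):
          n = len(words)
          best = defaultdict(float)                      # (i, j, A) -> best probability
          for i, w in enumerate(words):
              for (A, word), p in lexical.items():
                  if word == w:
                      best[i, i + 1, A] = max(best[i, i + 1, A], p)
          for span in range(2, n + 1):
              for i in range(n - span + 1):
                  j = i + span
                  for k in range(i + 1, j):
                      for (A, (B, C)), p in binary.items():
                          q = p * best[i, k, B] * best[k, j, C]
                          if q > best[i, j, A]:
                              best[i, j, A] = q
          return best[0, n, start]

      print(viterbi_cky("John saw the dog".split()))     # 0.18 (= 1.0 * 0.3 * 0.6), up to float rounding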
  45. Bert Timmermans, Rules Vs. Statistics in Implicit Learning of Biconditional Grammars.
    A significant part of everyday learning occurs incidentally — a process typically described as implicit learning. A central issue in this domain and others, such as language acquisition, is the extent to which performance depends on the acquisition and deployment of abstract rules. Shanks and colleagues [22], [11] have suggested (1) that discrimination between grammatical and ungrammatical instances of a biconditional grammar requires the acquisition and use of abstract rules, and (2) that training conditions — in particular whether instructions orient (...)
  46. Barbara Dziemidowicz-Gryz (2007). On Learnability of Restricted Classes of Categorial Grammars. Studia Logica 85 (2):153-169.
    In this paper we present learning algorithms for classes of categorial grammars restricted by negative constraints. We modify learning functions of Kanazawa [10] and apply them to these classes of grammars. We also prove the learnability of intersection of the class of minimal grammars with the class of k-valued grammars.
  47. Krasimir Angelov, Björn Bringert & Aarne Ranta (2010). PGF: A Portable Run-Time Format for Type-Theoretical Grammars. [REVIEW] Journal of Logic, Language and Information 19 (2):201-228.
    Portable Grammar Format (PGF) is a core language for type-theoretical grammars. It is the target language to which grammars written in the high-level formalism Grammatical Framework (GF) are compiled. Low-level and simple, PGF is easy to reason about, so that its language-theoretic properties can be established. It is also easy to write interpreters that perform parsing and generation with PGF grammars, and compilers converting PGF to other formats. This paper gives a concise description of PGF, covering syntax, (...)
  48. Gabriel Infante-Lopez & Maarten De Rijke (2006). A Note on the Expressive Power of Probabilistic Context Free Grammars. Journal of Logic, Language and Information 15 (3):219-231.
    We examine the expressive power of probabilistic context free grammars (PCFGs), with a special focus on the use of probabilities as a mechanism for reducing ambiguity by filtering out unwanted parses. Probabilities in PCFGs induce an ordering relation among the set of trees that yield a given input sentence. PCFG parsers return the trees bearing the maximum probability for a given sentence, discarding all other possible trees. This mechanism is naturally viewed as a way of defining a new class (...)
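    The ordering the abstract above refers to is simply the ordering of competing parses by tree probability, i.e. the product of the probabilities of the rules each parse uses; the parser keeps the maximal tree and thereby filters out the rest. A toy Python illustration with made-up rule probabilities and two competing attachments (not an example from the paper):

      from math import prod

      rule_prob = {
          "VP -> V NP":     0.5,
          "VP -> V NP PP":  0.2,
          "NP -> NP PP":    0.3,
          "NP -> D N":      0.7,
      }

      # Two competing analyses of "... saw the man with the telescope", listed by rules used.
      tree_high_attach = ["VP -> V NP PP", "NP -> D N"]              # PP modifies the VP
      tree_low_attach  = ["VP -> V NP", "NP -> NP PP", "NP -> D N"]  # PP modifies the NP

      def tree_probability(rules):
          return prod(rule_prob[r] for r in rules)

      ranked = sorted([tree_high_attach, tree_low_attach],
                      key=tree_probability, reverse=True)
      print([tree_probability(t) for t in ranked])   # roughly [0.14, 0.105], up to float rounding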
  49. Theo Janssen, Gerard Kok & Lambert Meertens (1977). On Restrictions on Transformational Grammars Reducing the Generative Power. Linguistics and Philosophy 1 (1):111-118.
    Various restrictions on transformational grammars have been investigated in order to reduce their generative power from recursively enumerable languages to recursive languages. It will be shown that any restriction on transformational grammars defining a recursively enumerable subset of the set of all transformational grammars is either too weak (in the sense that there does not exist a general decision procedure for all languages generated under such a restriction) or too strong (in the sense that there exists a recursive (...)
  50. Neil Law Malcolm (1999). Grammars Rule O.K. Behavioral and Brain Sciences 22 (4):723-724.
    Colours are not the sorts of thing that are amenable to traditional forms of scientific explanation. To think otherwise is to mistake their ontology and ignore their normativity. The acquisition and use of colour categories is constrained by the logic of colour grammars.
Results 1-50 of 385