Results for 'Graph grammars'

1000+ found
  1. Graph Grammar Formalism with Multigranularity for Spatial Graphs. Yufeng Liu, Fan Yang & Jian Liu - 2023 - Journal of Logic, Language and Information 32 (5):809-827.
     Traditional spatially enabled grammars lack flexibility in specifying the spatial semantics of graphs. This paper describes a new graph grammar formalism called the multigranularity Coordinate Graph Grammar (mgCGG) for spatial graphs. Based on the Coordinate Graph Grammar (CGG), the mgCGG divides coordinates into two categories, physical coordinates and grammatical coordinates, where physical coordinates are the common coordinates in the real world, and grammatical coordinates describe the restrictions on the spatial semantics. In the derivation and reduction of (...)
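     To make the general notion of a graph-grammar production concrete, here is a minimal sketch (in Python) of node-replacement graph rewriting. It is only an illustration of the broad idea behind graph grammars: the adjacency-dict encoding, the embedding rule, and all names below are assumptions made for this sketch, not the multigranularity Coordinate Graph Grammar defined in the paper.
```python
# Generic, minimal sketch of node-replacement graph rewriting (illustration only;
# not the mgCGG machinery of the paper above). The embedding rule and all names
# here are assumptions for the sketch.

from itertools import count

_fresh = count()

def apply_production(graph, node, production):
    """graph: adjacency dict {v: set of neighbours}.
    production: (k, edges) creating k new nodes 0..k-1 joined by `edges`."""
    k, edges = production
    new = {i: f"n{next(_fresh)}" for i in range(k)}
    neighbours = graph.pop(node)                  # remove the rewritten node
    for v in neighbours:
        graph[v].discard(node)
    for v in new.values():
        graph.setdefault(v, set())
    for i, j in edges:                            # build the replacing subgraph
        graph[new[i]].add(new[j])
        graph[new[j]].add(new[i])
    for v in neighbours:                          # embedding rule (assumed here):
        graph[v].add(new[0])                      # reattach old neighbours to node 0
        graph[new[0]].add(v)
    return graph

# derivation step: rewrite the nonterminal node "S" into a three-node path
g = {"S": {"a"}, "a": {"S"}}
print(apply_production(g, "S", (3, [(0, 1), (1, 2)])))
```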
  2. Applying a logical interpretation of semantic nets and graph grammars to natural language parsing and understanding. Eero Hyvönen - 1986 - Synthese 66 (1):177-190.
    In this paper a logical interpretation of semantic nets and graph grammars is proposed for modelling natural language understanding and creating language understanding computer systems. An example of parsing a Finnish question by graph grammars and inferring the answer to it by a semantic net representation is provided.
  3. Types as graphs: Continuations in type logical grammar. [REVIEW] Chris Barker & Chung-Chieh Shan - 2006 - Journal of Logic, Language and Information 15 (4):331-370.
    Using the programming-language concept of continuations, we propose a new, multimodal analysis of quantification in Type Logical Grammar. Our approach provides a geometric view of in-situ quantification in terms of graphs, and motivates the limited use of empty antecedents in derivations. Just as continuations are the tool of choice for reasoning about evaluation order and side effects in programming languages, our system provides a principled, type-logical way to model evaluation order and side effects in natural language. We illustrate with an (...)
  4. Learning a Generative Probabilistic Grammar of Experience: A Process-Level Model of Language Acquisition. Oren Kolodny, Arnon Lotem & Shimon Edelman - 2015 - Cognitive Science 39 (2):227-267.
    We introduce a set of biologically and computationally motivated design choices for modeling the learning of language, or of other types of sequential, hierarchically structured experience and behavior, and describe an implemented system that conforms to these choices and is capable of unsupervised learning from raw natural‐language corpora. Given a stream of linguistic input, our model incrementally learns a grammar that captures its statistical patterns, which can then be used to parse or generate new data. The grammar constructed in this (...)
  5. IDL-PMCFG, a Grammar Formalism for Describing Free Word Order Languages. François Hublet - 2022 - Journal of Logic, Language and Information 31 (3):327-388.
     We introduce Interleave-Disjunction-Lock parallel multiple context-free grammars (IDL-PMCFG), a novel grammar formalism designed to describe the syntax of free word order languages that allow for extensive interleaving of grammatical constituents. Though interleaved constituents, and especially the so-called hyperbaton, are common in several ancient (Classical Latin and Greek, Sanskrit...) and modern (Hungarian, Finnish...) languages, these syntactic structures are often difficult to express in existing formalisms. The IDL-PMCFG formalism combines Seki et al.'s parallel multiple context-free grammars (PMCFG) with Nederhof and Satta's (...)
  6. Learning a Generative Probabilistic Grammar of Experience: A Process-Level Model of Language Acquisition. Oren Kolodny, Arnon Lotem & Shimon Edelman - 2014 - Cognitive Science 38 (4):227-267.
    We introduce a set of biologically and computationally motivated design choices for modeling the learning of language, or of other types of sequential, hierarchically structured experience and behavior, and describe an implemented system that conforms to these choices and is capable of unsupervised learning from raw natural-language corpora. Given a stream of linguistic input, our model incrementally learns a grammar that captures its statistical patterns, which can then be used to parse or generate new data. The grammar constructed in this (...)
  7. Second-order abstract categorial grammars as hyperedge replacement grammars. Makoto Kanazawa - 2010 - Journal of Logic, Language and Information 19 (2):137-161.
    Second-order abstract categorial grammars (de Groote in Association for computational linguistics, 39th annual meeting and 10th conference of the European chapter, proceedings of the conference, pp. 148–155, 2001) and hyperedge replacement grammars (Bauderon and Courcelle in Math Syst Theory 20:83–127, 1987; Habel and Kreowski in STACS 87: 4th Annual symposium on theoretical aspects of computer science. Lecture notes in computer science, vol 247, Springer, Berlin, pp 207–219, 1987) are two natural ways of generalizing “context-free” grammar formalisms for string (...)
  8. Lexicalised Locality: Local Domains and Non-Local Dependencies in a Lexicalised Tree Adjoining Grammar. Diego Gabriel Krivochen & Andrea Padovan - 2021 - Philosophies 6 (3):70.
     Contemporary generative grammar assumes that syntactic structure is best described in terms of sets, and that locality conditions, as well as cross-linguistic variation, are determined at the level of designated functional heads. Syntactic operations (merge, MERGE, etc.) build a structure by deriving sets from lexical atoms and recursively (and monotonically) yielding sets of sets. Additional restrictions over the format of structural descriptions limit the number of elements involved in each operation to two at each derivational step, a head and a (...)
  9. Icons, Interrogations, and Graphs: On Peirce's Integrated Notion of Abduction. Francesco Bellucci & Ahti-Veikko Pietarinen - 2020 - Transactions of the Charles S. Peirce Society 56 (1):43.
     The Syllabus for Certain Topics of Logic is a long treatise that Peirce wrote in October and November to complement the material of his 1903 Lowell Lectures. The last of the eight lectures was on abduction, first entitled “How to Theorize” and then “Abduction.” Of abduction, the Syllabus states that its “conclusion is drawn in the interrogative mood”. This is not the first time that Peirce associates abduction to interrogations, but the statement is significant because it is the first (...)
  10. Logic of the Future: Writings on Existential Graphs. Volume 1: History and Applications ed. by Ahti Pietarinen. Frederik Stjernfelt - 2021 - Transactions of the Charles S. Peirce Society 57 (1):114-127.
     To Peirce scholars and other aficionados of logic, semiotics, and pragmatism, 2017 brought the great news of Bellucci’s Speculative Grammar book, providing the eye-opening first detailed chronological overview of Peirce’s career-long development of his semiotics. Now, the first volume of Ahti Pietarinen’s long-awaited three-volume publication of the totality of Peirce’s writings on his mature logic representation system known as Existential Graphs not only gives us a plethora of hitherto unpublished Peirce papers but also a new and in many ways surprising (...)
  11. Towards a Model of Argument Strength for Bipolar Argumentation Graphs. Erich Rast - 2018 - Studies in Logic, Grammar and Rhetoric 55 (1):31-62.
    Bipolar argument graphs represent the structure of complex pro and contra arguments for one or more standpoints. In this article, ampliative and exclusionary principles of evaluating argument strength in bipolar acyclic argumentation graphs are laid out and compared to each other. Argument chains, linked arguments, link attackers and supporters, and convergent arguments are discussed. The strength of conductive arguments is also addressed but it is argued that more work on this type of argument is needed to properly distinguish argument strength (...)
  12. On spectra of sentences of monadic second order logic with counting. E. Fischer & J. A. Makowsky - 2004 - Journal of Symbolic Logic 69 (3):617-640.
    We show that the spectrum of a sentence ϕ in Counting Monadic Second Order Logic (CMSOL) using one binary relation symbol and finitely many unary relation symbols, is ultimately periodic, provided all the models of ϕ are of clique width at most k, for some fixed k. We prove a similar statement for arbitrary finite relational vocabularies τ and a variant of clique width for τ-structures. This includes the cases where the models of ϕ are of tree width at most (...)
  13. Symbolic Logic: A Return to the Origins. Article IV: Graphs of Functions and Relations [Символічна логіка: повернення до витоків. Стаття ІV. Графіки функцій та відношень]. Yaroslav Kokhan - 2023 - Multiversum. Philosophical Almanac 2 (2):129-143.
     This paper is Part IV of a larger study dedicated both to revising the system of basic logical categories and to generalizing modern predicate logic to functional logic. Its topic is the treatment of graphs of functions and relations as a derivative and definable category of ultra-Fregean logistics. There are two types of function specification: an operational specification, in which a function is first applied to arguments and then the value of the function is entered as (...)
  14. Structures and Categories for the Representation of Meaning. Timothy C. Potts - 1994 - Cambridge University Press.
    This 1994 book develops a way of representing the meanings of linguistic expressions which is independent of any particular language, allowing the expressions to be manipulated in accordance with rules related to their meanings which could be implemented on a computer. It begins with a survey of the contributions of linguistics, logic and computer science to the problem of representation, linking each with a particular type of formal grammar. A system of graphs is then presented, organized by scope relations in (...)
  15. A Mathematical Analysis of Pāṇini's Śivasūtras. Wiebke Petersen - 2004 - Journal of Logic, Language and Information 13 (4):471-489.
     In Pāṇini's grammar of Sanskrit one finds the Śivasūtras, a table which defines the natural classes of phonological segments in Sanskrit by intervals. We present a formal argument which shows that, using his representation method, Pāṇini's way of ordering the phonological segments to represent the natural classes is optimal. The argument is based on a strictly set-theoretical point of view depending only on the set of natural classes and does not explicitly take into account the phonological features of the segments, (...)
  16. Probabilistic models of cognition: Conceptual foundations. Nick Chater & Alan Yuille - 2006 - Trends in Cognitive Sciences 10 (7):287-291.
    Remarkable progress in the mathematics and computer science of probability has led to a revolution in the scope of probabilistic models. In particular, ‘sophisticated’ probabilistic methods apply to structured relational systems such as graphs and grammars, of immediate relevance to the cognitive sciences. This Special Issue outlines progress in this rapidly developing field, which provides a potentially unifying perspective across a wide range of domains and levels of explanation. Here, we introduce the historical and conceptual foundations of the approach, (...)
  17. Raising to object. Diego Gabriel Krivochen - 2023 - Evolutionary Linguistic Theory 5 (2):128-161.
    In this paper we provide an introduction to a set of tools for syntactic analysis based on graph theory, and apply them to the study of some properties of English accusativus cum infinitivo constructions, more commonly known as raising to object or exceptional case marking structures. We focus on puzzling extraction asymmetries between base-generated objects and ‘raised’ objects and on the interaction between raising to object and Right Wrap. We argue that a lexicalised derivational grammar with grammatical functions as (...)
  18. A Modal Logic for Supervised Learning. Alexandru Baltag, Dazhu Li & Mina Young Pedersen - 2022 - Journal of Logic, Language and Information 31 (2):213-234.
    Formal learning theory formalizes the process of inferring a general result from examples, as in the case of inferring grammars from sentences when learning a language. In this work, we develop a general framework—the supervised learning game—to investigate the interaction between Teacher and Learner. In particular, our proposal highlights several interesting features of the agents: on the one hand, Learner may make mistakes in the learning process, and she may also ignore the potential relation between different hypotheses; on the (...)
  19. Induction of Augmented Transition Networks. John R. Anderson - 1977 - Cognitive Science 1 (2):125-157.
     LAS is a program that acquires augmented transition network (ATN) grammars. It requires as data sentences of the language and semantic network representations of their meaning. In acquiring the ATN grammars, it induces the word classes of the language, the rules of formation for sentences, and the rules mapping sentences onto meaning. The induced ATN grammar can be used both for sentence generation and sentence comprehension. Critical to the performance of the program are assumptions that it makes about (...)
  20. Investigative Poetics: In (night)-Light of Akilah Oliver. Feliz Molina - 2011 - Continent 1 (2):70-75.
    continent. 1.2 (2011): 70-75. cartography of ghosts . . . And as a way to talk . . . of temporality the topography of imagination, this body whose dirty entry into the articulation of history as rapturous becoming & unbecoming, greeted with violence, i take permission to extend this grace —Akilah Oliver from “An Arriving Guard of Angels Thusly Coming To Greet” Our disappearance is already here. —Jacques Derrida, 117 I wrestled with death as a threshold, an aporia, a bandit, (...)
     
  21. On the parameterized complexity of short computation and factorization. Liming Cai, Jianer Chen, Rodney G. Downey & Michael R. Fellows - 1997 - Archive for Mathematical Logic 36 (4-5):321-337.
     A completeness theory for parameterized computational complexity has been studied in a series of recent papers, and has been shown to have many applications in diverse problem domains including familiar graph-theoretic problems, VLSI layout, games, computational biology, cryptography, and computational learning [ADF, BDHW, BFH, DEF, DF1-7, FHW, FK]. We here study the parameterized complexity of two kinds of problems: (1) problems concerning parameterized computations of Turing machines, such as determining whether a nondeterministic machine can reach an accept state in k steps (...)
  22. Characterizing language identification in terms of computable numberings. Sanjay Jain & Arun Sharma - 1997 - Annals of Pure and Applied Logic 84 (1):51-72.
     Identification of programs for computable functions from their graphs and identification of grammars for recursively enumerable languages from positive data are two extensively studied problems in the recursion-theoretic framework of inductive inference. In the context of function identification, Freivalds et al. have shown that only those collections of computable functions are identifiable in the limit for which there exists a 1-1 computable numbering ψ and a discrimination function d such that (1) for each function in the collection, the number of indices i such (...)
  23. Concepts and Categories: A Data Science Approach to Semiotics. André Włodarczyk - 2022 - Studies in Logic, Grammar and Rhetoric 67 (1):169-200.
    Compared to existing classical approaches to semiotics which are dyadic (signifier/signified, F. de Saussure) and triadic (symbol/concept/object, Ch. S. Peirce), this theory can be characterized as tetradic ([sign/semion]//[object/noema]) and is the result of either doubling the dyadic approach along the semiotic/ordinary dimension or splitting the ‘concept’ of the triadic one into two (semiotic/ordinary). Other important features of this approach are (a) the distinction made between concepts (only functional pairs of extent and intent) and categories (as representations of expressions) and (b) (...)
  24. James D. McCawley. Transformational Grammar - forthcoming - Foundations of Language.
     
  25. Front Matter (pp. i-iii). Creative Grammar, Art Education Creative Grammar & Art Education - 2011 - Journal of Aesthetic Education 45 (3).
     
  26. Nicolas Ruwet. in Generative Grammar - 1981 - In W. Klein & W. Levelt (eds.), Crossing the Boundaries in Linguistics. Reidel. pp. 23.
  27. P. Stanley Peters and R. W. Ritchie. Formational Grammars - 1983 - In Alex Orenstein & Rafael Stern (eds.), Developments in Semantics. Haven. pp. 2-304.
     
  28. Primary works. Rational Grammar - 2005 - In Siobhan Chapman & Christopher Routledge (eds.), Key thinkers in linguistics and the philosophy of language. Edinburgh: Edinburgh University Press. pp. 10.
  29. Rosane Rocher. Indian Grammar - 1969 - Foundations of Language 5:73.
  30. Sep 2972-10 am.Transformational Grammar - 1972 - Foundations of Language 8:310.
  31. Timothy C. Potts. Fregean Categorial Grammar - 1973 - In Radu J. Bogdan & Ilkka Niiniluoto (eds.), Logic, language, and probability. Dordrecht: D. Reidel Pub. Co. pp. 245.
  32. Definite clause grammars for language analysis—A survey of the formalism and a comparison with augmented transition networks. Fernando C. N. Pereira & David H. D. Warren - 1980 - Artificial Intelligence 13 (3):231-278.
  33. Modal Logic. Patrick Blackburn, Maarten de Rijke & Yde Venema - 2001 - New York: Cambridge University Press. Edited by Maarten de Rijke & Yde Venema.
    This modern, advanced textbook reviews modal logic, a field which caught the attention of computer scientists in the late 1970's.
  34. Implicit learning of artificial grammars. Arthur S. Reber - 1967 - Journal of Verbal Learning and Verbal Behavior 6:855-863.
  35. The Expressivity of Autosegmental Grammars. Adam Jardine - 2019 - Journal of Logic, Language and Information 28 (1):9-54.
     This paper extends a notion of local grammars in formal language theory to autosegmental representations, in order to develop a sufficiently expressive yet computationally restrictive theory of well-formedness in natural language tone patterns. More specifically, it shows how to define a class ASL of stringsets using local grammars over autosegmental representations and a mapping g from strings to autosegmental structures. It then defines a particular class ASL using autosegmental representations specific to tone and compares its expressivity to established (...)
  36. The Grammars of Mystical Experience in Christian Theological Dialogue. Marc Jean-Bernard - 2018 - Philosophy Study 8 (4).
  37. A tale of four grammars. Claudia Casadio & Joachim Lambek - 2002 - Studia Logica 71 (3):315-329.
     In this paper we consider the relations existing between four deductive systems that have been called categorial grammars and have relevant connections with linguistic investigations: the syntactic calculus, bilinear logic, compact bilinear logic and Curry's semantic calculus.
  38. Human action in narrative grammars. Thomas Pavel - 2017 - Semiotica 2017 (214):219-229.
  39. Learnability of Pregroup Grammars. Denis Béchet, Annie Foret & Isabelle Tellier - 2007 - Studia Logica 87 (2-3):225-252.
    This paper investigates the learnability by positive examples in the sense of Gold of Pregroup Grammars. In a first part, Pregroup Grammars are presented and a new parsing strategy is proposed. Then, theoretical learnability and non-learnability results for subclasses of Pregroup Grammars are proved. In the last two parts, we focus on learning Pregroup Grammars from a special kind of input called feature-tagged examples. A learning algorithm based on the parsing strategy presented in the first part (...)
  40. Parsing Pregroup Grammars and Lambek Calculus Using Partial Composition. Denis Béchet - 2007 - Studia Logica 87 (2-3):199-224.
     The paper presents a way to transform pregroup grammars into context-free grammars using functional composition. The same technique can also be used for the proof-nets of multiplicative cyclic linear logic and for Lambek calculus allowing empty premises.
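     As background for the pregroup entries above, the following sketch illustrates the general idea of pregroup recognition by contractions over simple types. It is only an illustration under assumed conventions (the integer encoding of adjoints and the toy lexicon are invented here); it is not the partial-composition transformation presented in the paper.
```python
# Pregroup recognition by contractions (illustrative sketch, assumed conventions).
# A simple type is (base, k): k = 0 for the base type, -1 for a left adjoint,
# +1 for a right adjoint, and so on. Adjacent types (b, k)(b, k+1) contract away.

from functools import lru_cache

def reduces_to(types, target):
    """True iff the sequence of simple types contracts to the single type `target`."""
    types = tuple(types)

    @lru_cache(maxsize=None)
    def empty(i, j):                         # does types[i:j] contract to nothing?
        if i == j:
            return True
        for k in range(i + 1, j, 2):         # candidate partners for position i
            b1, e1 = types[i]
            b2, e2 = types[k]
            if b1 == b2 and e2 == e1 + 1 and empty(i + 1, k) and empty(k + 1, j):
                return True
        return False

    n = len(types)
    return any(types[p] == target and empty(0, p) and empty(p + 1, n)
               for p in range(n))

# Hypothetical toy lexicon: "Alice" : n ; "sees" : n^r s n^l ; "Bob" : n
lexicon = {
    "Alice": [("n", 0)],
    "sees":  [("n", 1), ("s", 0), ("n", -1)],
    "Bob":   [("n", 0)],
}

sentence = [t for w in "Alice sees Bob".split() for t in lexicon[w]]
print(reduces_to(sentence, ("s", 0)))        # True: n (n^r s n^l) n reduces to s
```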
  41. Sequentially indexed grammars. Jan van Eijck - unknown
     This paper defines the grammar class of sequentially indexed grammars. Sequentially indexed grammars are the result of a change in the index stack handling mechanism of indexed grammars [Aho68, Aho69]. Sequentially indexed grammars are different from linear indexed grammars [Gaz88]. Like indexed languages, sequentially indexed languages are a fully abstract language class. Unlike indexed languages, sequentially indexed languages allow polynomial parsing algorithms. We give a polynomial algorithm for parsing with sequentially indexed grammars that is an (...)
     
  42. Frozen Sandhi, Flowing Sound: Permanent Euphonic Ligatures and the Idea of Text in Classical Pali Grammars. Aleix Ruiz-Falqués - 2022 - Journal of Indian Philosophy 50 (4):689-704.
    Pali classical grammars reflect a specific idea of what Pali Buddhist texts are. According to this traditional idea, texts are mainly conceived as sound and therefore the initial portions of every grammar deal with sound and sound ligature or sandhi. Sandhi in Pali does not work as systematically as it does in Sanskrit and therefore Pali grammarians have struggled with the optionality of many of their rules on sound ligature. Unlike modern linguists, however, they identify certain patterns of fixed (...)
  43. Diversity of Grammars and Their Diverging Evolutionary and Processing Paths: Evidence From Functional MRI Study of Serbian. Ljiljana Progovac, Natalia Rakhlin, William Angell, Ryan Liddane, Lingfei Tang & Noa Ofen - 2018 - Frontiers in Psychology 9.
  44. Preference Logic Grammars: Fixed point semantics and application to data standardization. Baoqiu Cui & Terrance Swift - 2002 - Artificial Intelligence 138 (1-2):117-147.
  45. On Evaluating Story Grammars. David E. Rumelhart - 1980 - Cognitive Science 4 (3):313-316.
  46. Topos taxi: Michel Foucault and Virginia Woolf on two modern grammars of love. Anna-Klara Bojö - 2016 - Feminist Theory 17 (1):21-34.
    The concept of love has emerged as a central topic for philosophical and theoretical discussion over the past few years. Whereas the dominant ideology of love in contemporary culture insists that love be understood as a discovery of my long lost second half with whom I merge and finally recreate a whole, contemporary philosophers and theorists have stressed the need to reconsider the concept of love within an ethical framework that can sustain the idea of the other as forever different (...)
  47. Should Pregroup Grammars be Adorned with Additional Operations? Joachim Lambek - 2007 - Studia Logica 87 (2-3):343-358.
  48. The Inheritance and Innateness of Grammars. Myrna Gopnik (ed.) - 1997 - Oxford University Press USA.
    Is language somehow innate in the structure of the human brain, or is it completely learned? This debate is still at the heart of linguistics, especially as it intersects with psychology and cognitive science. In collecting papers which discuss the evidence and arguments regarding this difficult question, The Inheritance and Innateness of Grammars considers cases ranging from infants who are just beginning to learn the properties of a native language to language-impaired adults who will never learn one. These studies (...)
  49. Global index grammars and descriptive power. José M. Castaño - 2004 - Journal of Logic, Language and Information 13 (4):403-419.
     We review the properties of Global Index Grammars (GIGs), a grammar formalism that uses a stack of indices associated with productions and has restricted context-sensitive power. We show how the control of the derivation is performed and how this impacts the descriptive power of this formalism both in the string languages and the structural descriptions that GIGs can generate.
  50. Learning correction grammars. Lorenzo Carlucci, John Case & Sanjay Jain - 2009 - Journal of Symbolic Logic 74 (2):489-516.
     We investigate a new paradigm in the context of learning in the limit, namely, learning correction grammars for classes of computably enumerable (c.e.) languages. Knowing a language may feature a representation of it in terms of two grammars. The second grammar is used to make corrections to the first grammar. Such a pair of grammars can be seen as a single description of (or grammar for) the language. We call such grammars correction grammars. Correction grammars capture the observable fact that people do correct their linguistic utterances during their usual linguistic activities. We show that learning correction grammars for classes of c.e. languages in the TxtEx-model (i.e., converging to a single correct correction grammar in the limit) is sometimes more powerful than learning ordinary grammars even in the TxtBc-model (where the learner is allowed to converge to infinitely many syntactically distinct but correct conjectures in the limit). For each n ≥ 0, there is a similar learning advantage, again in learning correction grammars for classes of c.e. languages, but where we compare learning correction grammars that make n + 1 corrections to those that make n corrections. The concept of a correction grammar can be extended into the constructive transfinite, using the idea of counting down from notations for transfinite constructive ordinals. This transfinite extension can also be conceptualized as being about learning Ershov-descriptions for c.e. languages. For u a notation in Kleene's general system (O, <_o) of ordinal notations for constructive ordinals, we introduce the concept of a u-correction grammar, where u is used to bound the number of corrections that the grammar is allowed to make. We prove a general hierarchy result: if u and v are notations for constructive ordinals such that u <_o v, then there are classes of c.e. languages that can be TxtEx-learned by conjecturing v-correction grammars but not by conjecturing u-correction grammars. Surprisingly, we show that, above "ω-many" corrections, it is not possible to strengthen the hierarchy: TxtEx-learning u-correction grammars of classes of c.e. languages, where u is a notation in O for any ordinal, can be simulated by TxtBc-learning ω-correction grammars, where ω is any notation for the smallest infinite ordinal ω.
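     The abstract's informal picture of a pair of grammars, with the second correcting the first, can be illustrated with a small sketch. The reading below (the described language is what the first grammar enumerates minus what the second retracts), and all function names, are assumptions made for illustration, not the paper's formal definitions.
```python
# Illustrative reading (assumed here) of a 1-correction grammar: the first grammar
# enumerates candidate members, and the second later withdraws ("corrects") some of
# them; the described language is the difference of the two enumerations.

def corrected_language(enum_g1, enum_g2, steps):
    """Finite-step approximation of the language described by the pair (G1, G2)."""
    claimed   = set(enum_g1(steps))    # what G1 has generated so far
    retracted = set(enum_g2(steps))    # what G2 has corrected so far
    return claimed - retracted

# Toy example: G1 claims every string a^n, G2 retracts the odd-length ones,
# so in the limit the corrected language is the even-length strings of a's.
g1 = lambda t: ("a" * n for n in range(t))
g2 = lambda t: ("a" * n for n in range(t) if n % 2 == 1)

print(sorted(corrected_language(g1, g2, 6), key=len))   # ['', 'aa', 'aaaa']
```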
1 — 50 / 1000