This paper presents a novel approach to the unsupervised learning of syntactic analyses of natural language text. Most previous work has focused on maximizing likelihood according to generative PCFG models. In contrast, we employ a simpler probabilistic model over trees based directly on constituent identity and linear context, and use an EM-like iterative procedure to induce structure. This method produces much higher-quality analyses, giving the best published results on the ATIS dataset.
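The span representation at the heart of the model can be sketched concretely. The following minimal Python example (an illustrative sketch, not the authors' code; the function name and boundary symbol are assumptions) enumerates every span of a POS-tagged sentence and pairs its yield (the constituent candidate) with its linear context (the tags immediately adjacent to the span):

```python
def span_features(tags):
    """For each span (i, j) over a tag sequence, return its yield
    (the tags inside the span) and its linear context (the tags
    immediately before and after the span, with '#' marking a
    sentence boundary). Illustrative sketch of the constituent-
    context representation described in the abstract."""
    n = len(tags)
    # Pad with boundary symbols so edge spans have a well-defined context.
    padded = ['#'] + list(tags) + ['#']
    features = {}
    for i in range(n):
        for j in range(i + 1, n + 1):
            span_yield = tuple(tags[i:j])
            # Left context is the tag before position i; right context
            # is the tag after position j - 1 (padded index j + 1).
            context = (padded[i], padded[j + 1])
            features[(i, j)] = (span_yield, context)
    return features

feats = span_features(['DT', 'NN', 'VBD'])
```

In an EM-like induction loop, such (yield, context) pairs would be scored against the current model, spans would be (soft-)assigned constituent status per sentence, and the yield and context distributions re-estimated from those assignments; the sketch above shows only the representation, not the full procedure.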
Similar books and articles
Natural Language Grammar Induction Using a Constituent-Context Model. Christopher Manning - manuscript.
A Generative Constituent-Context Model for Improved Grammar Induction. Dan Klein & Christopher D. Manning - unknown.
Characterizing Motherese: On the Computational Structure of Child-Directed Language. Shimon Edelman - unknown.
Fast Exact Inference with a Factored Model for Natural Language Parsing. Dan Klein & Christopher D. Manning - unknown.
A Grammar Systems Approach to Natural Language Grammar. M. Dolores Jiménez López - 2006 - Linguistics and Philosophy 29 (4):419-454.
Fast Exact Inference with a Factored Model for Natural Language Parsing. Christopher Manning - manuscript.
Talking About Trees and Truth-Conditions. Reinhard Muskens - 2001 - Journal of Logic, Language and Information 10 (4):417-455.
Added to index: 2010-12-22