This paper presents a novel approach to the unsupervised learning of syntactic analyses of natural language text. Most previous work has focused on maximizing likelihood according to generative PCFG models. In contrast, we employ a simpler probabilistic model over trees based directly on constituent identity and linear context, and use an EM-like iterative procedure to induce structure. This method produces much higher quality analyses, giving the best published results on the ATIS dataset.
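The abstract's key idea is to describe each candidate span by its constituent identity (the tag sequence it covers) and its linear context (the tags immediately to its left and right). A minimal sketch of that featurization, with an invented toy corpus of POS tags (this is an illustration of the representation, not the authors' implementation or their EM procedure):

```python
from collections import Counter

def span_features(sentence):
    """Yield (span_yield, linear_context) pairs for every contiguous span.

    The yield is the covered tag sequence; the linear context is the pair
    of tags immediately left and right of the span, with sentence
    boundaries marked <s> and </s>.
    """
    n = len(sentence)
    for i in range(n):
        for j in range(i + 1, n + 1):
            span_yield = tuple(sentence[i:j])
            context = (sentence[i - 1] if i > 0 else "<s>",
                       sentence[j] if j < n else "</s>")
            yield span_yield, context

# Hypothetical toy corpus of POS-tagged sentences (not from the paper).
corpus = [["DT", "NN", "VBD", "DT", "NN"],
          ["DT", "JJ", "NN", "VBD"]]

# Count yield/context co-occurrences; an EM-like procedure would
# alternate between scoring spans under these multinomials and
# re-estimating the counts from soft constituency assignments.
yields, contexts = Counter(), Counter()
for sent in corpus:
    for y, c in span_features(sent):
        yields[y] += 1
        contexts[c] += 1
```

Under this featurization, recurring yields such as `("DT", "NN")` and recurring contexts such as `("<s>", "VBD")` accumulate mass, which is the signal the model exploits to separate likely constituents from distituents.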
Similar books and articles
Dan Klein & Christopher D. Manning, Fast Exact Inference with a Factored Model for Natural Language Parsing.
M. Dolores Jiménez López (2006). A Grammar Systems Approach to Natural Language Grammar. Linguistics and Philosophy 29 (4):419–454.
Shimon Edelman, Characterizing Motherese: On the Computational Structure of Child-Directed Language.
Dan Klein & Christopher D. Manning, A Generative Constituent-Context Model for Improved Grammar Induction.
Added to index: 2009-01-28