This paper presents a novel approach to the unsupervised learning of syntactic analyses of natural language text. Most previous work has focused on maximizing likelihood according to generative PCFG models. In contrast, we employ a simpler probabilistic model over trees based directly on constituent identity and linear context, and use an EM-like iterative procedure to induce structure. This method produces much higher quality analyses, giving the best published results on the ATIS dataset.
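The constituent-context idea described above can be sketched in code. The toy tag sequences, the greedy non-crossing selection, and the hard-EM simplification below are all illustrative assumptions, not the paper's actual soft EM procedure or data: each candidate span is scored by counts of its yield (constituent identity) and its linear context, and those counts are re-estimated from the spans currently selected.

```python
from collections import defaultdict

# Toy POS-tag sequences standing in for tagged sentences (hypothetical data).
SENTS = [("DT", "NN", "VBD"),
         ("DT", "NN", "VBD", "DT", "NN"),
         ("DT", "JJ", "NN", "VBD")]

def spans(sent):
    """All contiguous spans of length >= 2 (candidate constituents)."""
    n = len(sent)
    for i in range(n):
        for j in range(i + 2, n + 1):
            yield i, j

def features(sent, i, j):
    """Constituent identity (the yield) and linear context (adjacent tags)."""
    left = sent[i - 1] if i > 0 else "<s>"
    right = sent[j] if j < len(sent) else "</s>"
    return sent[i:j], (left, right)

def induce(sents, iters=5):
    """Hard-EM sketch: score each span by the counts of its yield and context
    among currently selected spans, then re-select the best non-crossing
    spans per sentence and re-estimate the counts from them."""
    # Initialize counts uniformly: every span contributes once.
    yc, cc = defaultdict(float), defaultdict(float)
    for s in sents:
        for i, j in spans(s):
            y, c = features(s, i, j)
            yc[y] += 1.0
            cc[c] += 1.0
    for _ in range(iters):
        new_yc, new_cc = defaultdict(float), defaultdict(float)
        for s in sents:
            scored = sorted(
                ((yc[features(s, i, j)[0]] * cc[features(s, i, j)[1]], i, j)
                 for i, j in spans(s)),
                reverse=True)
            chosen = []
            for _, i, j in scored:
                # Keep a span only if it is disjoint from or nested with
                # every higher-scoring span already chosen (a bracketing).
                if all(j <= a or i >= b or (a <= i and j <= b) or (i <= a and b <= j)
                       for a, b in chosen):
                    chosen.append((i, j))
            for i, j in chosen:
                y, c = features(s, i, j)
                new_yc[y] += 1.0
                new_cc[c] += 1.0
        yc, cc = new_yc, new_cc
    return yc, cc
```

On this toy data the recurring yield ("DT", "NN") keeps a high count across iterations, since it appears in the same sentence-initial context in several sentences; the paper's actual model replaces the greedy hard assignment with expectations over all bracketings.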
Similar books and articles
Dan Klein & Christopher D. Manning, A Generative Constituent-Context Model for Improved Grammar Induction.
Shimon Edelman, Characterizing Motherese: On the Computational Structure of Child-Directed Language.
Dan Klein & Christopher D. Manning, Fast Exact Inference with a Factored Model for Natural Language Parsing.
M. Dolores Jiménez López (2006). A Grammar Systems Approach to Natural Language Grammar. Linguistics and Philosophy 29 (4):419-454.
Reinhard Muskens (2001). Talking About Trees and Truth-Conditions. Journal of Logic, Language and Information 10 (4):417-455.
Added to index: 2010-12-22