Learning Random Walk Models for Inducing Word Dependency Distributions

Abstract

Many NLP tasks rely on accurately estimating word dependency probabilities P(w1|w2), where the words w1 and w2 have a particular relationship (such as verb-object). Because of the sparseness of counts of such dependencies, smoothing and the ability to use multiple sources of knowledge are important challenges. For example, if the probability P(N|V) of noun N being the subject of verb V is high, and V takes similar objects to V′, and V′ is synonymous to V′′, then we want to conclude that P(N|V′′) should also be reasonably high—even when those words did not co-occur in the training data.
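
To make the smoothing idea concrete, here is a minimal sketch, not the model proposed in the paper: it smooths a sparse verb–subject count table by taking a few steps of a random walk over links between verbs (standing in for "takes similar objects to" and "is synonymous with" links) and mixing the subject distributions of the verbs the walk reaches. Every verb, noun, count, and link weight below is invented purely for illustration.

```python
# Toy illustration of random-walk smoothing of P(noun | verb).
# All data is made up; the link weights stand in for learned similarity/synonymy links.
from collections import defaultdict

# Observed subject counts: counts[verb][noun]. "devour" has no observed subjects.
counts = {
    "eat":    {"dog": 4, "child": 6},
    "gobble": {"dog": 3, "child": 2},
    "devour": {},
}

# Verb-to-verb transition probabilities (rows sum to 1): a Markov chain over verbs.
verb_walk = {
    "eat":    {"eat": 0.8, "gobble": 0.2},
    "gobble": {"gobble": 0.7, "eat": 0.2, "devour": 0.1},
    "devour": {"devour": 0.5, "gobble": 0.5},
}

def direct_estimate(verb):
    """Maximum-likelihood P(noun | verb) from raw counts (empty if unseen)."""
    total = sum(counts[verb].values())
    if total == 0:
        return {}
    return {noun: c / total for noun, c in counts[verb].items()}

def walk_smoothed_estimate(verb, steps=2):
    """P(noun | verb) after `steps` steps of the verb-to-verb random walk:
    mass that lands on a neighbouring verb is spent on that verb's own
    subject distribution, then the mixture is renormalised."""
    # Distribution over verbs reached by the walk, starting at `verb`.
    dist = {verb: 1.0}
    for _ in range(steps):
        nxt = defaultdict(float)
        for v, p in dist.items():
            for v2, w in verb_walk[v].items():
                nxt[v2] += p * w
        dist = nxt
    # Mix the subject distributions of the reached verbs.
    smoothed = defaultdict(float)
    for v, p in dist.items():
        for noun, q in direct_estimate(v).items():
            smoothed[noun] += p * q
    total = sum(smoothed.values())
    return {n: p / total for n, p in smoothed.items()} if total else {}

print(walk_smoothed_estimate("devour"))
```

With these toy numbers, the unseen verb "devour" borrows most of its subject distribution from "gobble" (and, through it, from "eat"), so P(dog|devour) and P(child|devour) come out nonzero even though those pairs never co-occurred in the "training data".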

Similar books and articles

The Present Situation in Biology and Genetics: Report of a Discussion. [author unknown] - 1965 - Russian Studies in Philosophy 4 (2): 25-42.
Studies of Intensional Contexts in Mohist Writings. Desheng Zong - 2000 - Philosophy East and West 50 (2): 208-228.
Co-stationarity of the Ground Model. Natasha Dobrinen & Sy-David Friedman - 2006 - Journal of Symbolic Logic 71 (3): 1029-1043.
