The Hidden Markov Topic Model: A Probabilistic Model of Semantic Representation

Topics in Cognitive Science 2 (1):101-113 (2010)

Abstract

In this paper, we describe a model that learns semantic representations from the distributional statistics of language. This model, however, goes beyond the common bag-of-words paradigm and infers semantic representations by taking into account the inherent sequential nature of linguistic data. The model we describe, which we refer to as a Hidden Markov Topics model, is a natural extension of the current state of the art in Bayesian bag-of-words models, namely the Topics model of Griffiths, Steyvers, and Tenenbaum (2007), preserving its strengths while extending its scope to incorporate more fine-grained linguistic information.
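The generative idea sketched in the abstract, namely topic assignments that evolve sequentially rather than being drawn independently per word, can be illustrated with a toy simulation. The sketch below is not the paper's model or inference procedure; all dimensions, vocabulary, and probability values are invented for illustration. It draws one hidden topic per sentence from a Markov chain and then samples that sentence's words from the topic's word distribution, which is the structural contrast with a bag-of-words Topics model, where each word's topic is sampled independently.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup (not from the paper): 3 topics, 6 words.
n_topics = 3
vocab = ["cell", "gene", "bank", "loan", "ball", "goal"]

# Transition matrix between successive sentences' topics (rows sum to 1):
# topics tend to persist, giving the document local thematic coherence.
A = np.array([[0.8, 0.1, 0.1],
              [0.1, 0.8, 0.1],
              [0.1, 0.1, 0.8]])
pi = np.full(n_topics, 1 / 3)  # initial topic distribution

# Topic-word distributions (rows sum to 1): each topic favors two words.
B = np.array([[0.40, 0.40, 0.05, 0.05, 0.05, 0.05],
              [0.05, 0.05, 0.40, 0.40, 0.05, 0.05],
              [0.05, 0.05, 0.05, 0.05, 0.40, 0.40]])

def generate(n_sentences=4, words_per_sentence=5):
    """Sample a toy document: one hidden topic per sentence,
    topics evolving by a Markov chain, and words drawn i.i.d.
    from the current topic's word distribution."""
    doc = []
    z = rng.choice(n_topics, p=pi)          # initial hidden topic
    for _ in range(n_sentences):
        words = rng.choice(vocab, size=words_per_sentence, p=B[z])
        doc.append((int(z), list(words)))
        z = rng.choice(n_topics, p=A[z])    # Markov transition
    return doc

for topic, words in generate():
    print(topic, words)
```

Because the transition matrix concentrates mass on self-transitions, sampled documents show runs of sentences on the same topic; setting `A` to uniform rows would recover the exchangeable, bag-of-words behavior within this toy setting.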
