Online Causal Structure Learning

Abstract

Causal structure learning algorithms have focused on learning in "batch-mode": i.e., when a full dataset is presented. In many domains, however, it is important to learn in an online fashion from sequential or ordered data, whether because of memory storage constraints or because of potential changes in the underlying causal structure over the course of learning. In this paper, we present TDSL, a novel causal structure learning algorithm that processes data sequentially. This algorithm can track changes in the generating causal structure or parameters, and requires significantly less memory in realistic settings. We show by simulation that the algorithm performs comparably to batch-mode learning when the causal structure is stationary, and significantly better in non-stationary environments.
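To illustrate the general idea of sequential learning with bounded memory, the sketch below maintains exponentially weighted running statistics so that estimates can track a non-stationary generating process. This is a minimal illustration of the online-statistics ingredient, not an implementation of TDSL itself; the forgetting factor `rho` and the class name are assumptions for the example.

```python
import numpy as np

class OnlineCovariance:
    """Exponentially weighted running mean and covariance.

    A hedged sketch of the online-statistics idea behind sequential
    structure learning (NOT the TDSL algorithm). The forgetting
    factor `rho` (assumed here) discounts older observations, so the
    estimates can follow changes in the generating process while
    memory stays O(n_vars^2) regardless of stream length.
    """

    def __init__(self, n_vars, rho=0.99):
        self.rho = rho
        self.mean = np.zeros(n_vars)
        self.cov = np.eye(n_vars)

    def update(self, x):
        """Incorporate one observation x (a length-n_vars vector)."""
        x = np.asarray(x, dtype=float)
        delta = x - self.mean
        self.mean = self.mean + (1.0 - self.rho) * delta
        # Rank-one exponentially weighted covariance update.
        self.cov = (self.rho * self.cov
                    + (1.0 - self.rho) * np.outer(delta, x - self.mean))
        return self.cov
```

Because old data are discounted rather than stored, a structure learner built on such statistics never needs the full dataset in memory, matching the memory claim in the abstract; after a shift in the generating process, the estimates converge toward the new regime.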


Analytics

Added to PP
2010-12-22


Author's Profile

David Danks
University of California, San Diego
