By bootstrapping the output of the PC algorithm (Spirtes et al., 2000; Meek, 1995) with larger conditioning sets informed by the current graph state, it is possible to define a novel algorithm, JPC, that improves the accuracy of search for i.i.d. data drawn from linear, Gaussian, sparse to moderately dense models. The motivation for constructing sepsets using information in the current graph state is to highlight the differences between d-separation information in the graph and conditional independence information extracted from the sample. The same idea can be pursued for any algorithm for which conditioning sets informed by the current graph state can be constructed and from which an orientation procedure capable of orienting undirected graphs can be extracted. Another plausible candidate for such retrofitting is the CPC algorithm (Ramsey et al., 2006), yielding an algorithm, JCPC, which, when the true graph is sparse, is somewhat more accurate than JPC. The method is not feasible for discovery in models of categorical variables, i.e., traditional Bayes nets; with alternative tests of conditional independence it may extend to non-linear or non-Gaussian models, or both.
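For i.i.d. linear-Gaussian data, the conditional independence tests behind PC-style searches are standardly carried out as Fisher-z tests of partial correlation. A minimal sketch of such a test (function and variable names here are illustrative, not taken from the paper):

```python
import math
import numpy as np

def partial_corr(data, i, j, S):
    """Sample partial correlation of columns i and j of `data`
    given the columns in S, via the inverse correlation matrix."""
    idx = [i, j] + list(S)
    prec = np.linalg.inv(np.corrcoef(data[:, idx], rowvar=False))
    return -prec[0, 1] / math.sqrt(prec[0, 0] * prec[1, 1])

def fisher_z_indep(data, i, j, S, alpha=0.05):
    """Return True if columns i and j test as conditionally independent
    given S at level alpha (Fisher z transform of the partial correlation)."""
    n = data.shape[0]
    r = max(min(partial_corr(data, i, j, S), 0.999999), -0.999999)
    z = 0.5 * math.log((1 + r) / (1 - r)) * math.sqrt(n - len(S) - 3)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return p_value > alpha
```

On data simulated from a linear-Gaussian chain X → Y → Z, such a test would typically reject unconditional independence of X and Z but accept independence given {Y}; a graph-informed search supplies candidate conditioning sets S from the adjacencies of the current graph state rather than from all variable subsets.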
Similar books and articles
Peter Spirtes (2005). Graphical Models, Causal Inference, and Econometric Models. Journal of Economic Methodology 12 (1):3-34.
J. C. E. Dekker (1981). Twilight Graphs. Journal of Symbolic Logic 46 (3):539-571.
Peter Spirtes, Thomas Richardson, Christopher Meek, Richard Scheines & Clark Glymour, Using D-Separation to Calculate Zero Partial Correlations in Linear Models with Correlated Errors.
Brendan Kitts (1999). Representation Operators and Computation. Minds and Machines 9 (2):223-240.
Richard Scheines, Clark Glymour & Peter Spirtes, Learning the Structure of Linear Latent Variable Models.
Peter Spirtes, A Polynomial Time Algorithm for Determining DAG Equivalence in the Presence of Latent Variables and Selection Bias.
Added to index: 2010-09-14