Statistical learning theory as a framework for the philosophy of induction

Abstract

Statistical Learning Theory (e.g., Hastie et al., 2001; Vapnik, 1998, 2000, 2006) is the basic theory behind contemporary machine learning and data-mining. We suggest that the theory provides an excellent framework for philosophical thinking about inductive inference.
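For readers unfamiliar with the formal apparatus, the following is a brief background sketch of the two central quantities of statistical learning theory and the kind of generalization guarantee that links them. It is standard material in Vapnik's formulation, not text quoted from the paper, and the exact constants in the bound vary across presentations.

The learner observes i.i.d. pairs $(x_1, y_1), \dots, (x_n, y_n)$ drawn from an unknown distribution $P$ and selects a rule $f$ from a class $\mathcal{F}$. Two risks are compared:

$$R(f) = \int L\bigl(y, f(x)\bigr)\, dP(x, y), \qquad R_{\mathrm{emp}}(f) = \frac{1}{n} \sum_{i=1}^{n} L\bigl(y_i, f(x_i)\bigr).$$

For 0-1 loss and a class $\mathcal{F}$ of VC dimension $h$, with probability at least $1 - \eta$ the following holds simultaneously for all $f \in \mathcal{F}$:

$$R(f) \;\le\; R_{\mathrm{emp}}(f) \;+\; \sqrt{\frac{h\bigl(\ln(2n/h) + 1\bigr) - \ln(\eta/4)}{n}}.$$

The bound is uniform over the class and assumes nothing about $P$ beyond i.i.d. sampling, which is one reason the framework is of interest for philosophical accounts of inductive inference.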



Author's Profile

Gilbert Harman
Princeton University
