Mind changes and testability: How formal and statistical learning theory converge in the new riddle of induction

Abstract

This essay demonstrates a previously unnoticed connection between formal and statistical learning theory with regard to Nelson Goodman's new riddle of induction. Discussions of Goodman's riddle in formal learning theory explain how conjecturing "all green" before "all grue" can enhance efficient convergence to the truth, where efficiency is understood in terms of minimizing the maximum number of retractions, or "mind changes." Vapnik-Chervonenkis (VC) dimension is a central concept in statistical learning theory and is similar to Popper's notion of degrees of testability. I show that for a class of inductive problems of which Goodman's riddle is one example, a reliable inductive method minimizes the maximum number of mind changes exactly if it always conjectures the hypothesis, from the set consistent with the data, with the lowest VC dimension. I also discuss the relevance of these results to language invariance and curve fitting.
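The contrast between "all green" and the grue-like alternatives can be made concrete with a small sketch. The encoding below is my own illustration, not taken from the paper: examination times stand in for emeralds, label 1 for "observed green," and each grue hypothesis with threshold t predicts green before time t and blue thereafter. A brute-force shattering check then shows that the singleton class {all green} has VC dimension 0 while the grue family has VC dimension 1, so a method that always conjectures from the lowest-VC-dimension class consistent with the data would put "all green" first.

```python
from itertools import combinations

def vc_dimension(domain, hypotheses):
    """VC dimension of a finite hypothesis class over a finite domain.

    `hypotheses` is a list of functions mapping domain points to {0, 1}.
    A set S is shattered if every 0/1 labeling of S is realized by
    some hypothesis; the VC dimension is the size of the largest
    shattered set.
    """
    dim = 0
    for k in range(1, len(domain) + 1):
        shattered_some = False
        for S in combinations(domain, k):
            labelings = {tuple(h(x) for x in S) for h in hypotheses}
            if len(labelings) == 2 ** k:
                shattered_some = True
                break
        if shattered_some:
            dim = k
        else:
            break
    return dim

# Toy encoding (hypothetical): examination times 0..4; label 1 = "green".
T = list(range(5))
all_green = [lambda x: 1]                    # "all emeralds are green"
grue = [lambda x, t=t: 1 if x < t else 0     # "green if examined before t,
        for t in range(1, 6)]                #  blue thereafter"

print(vc_dimension(T, all_green))  # 0
print(vc_dimension(T, grue))       # 1
```

The grue family behaves like a class of threshold functions: any single late examination time can be labeled either way (green or blue) by choosing t, but no pair of times can receive the labeling "earlier blue, later green," so no two-point set is shattered.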



Analytics

Added to PP
2009-01-28




References found in this work

The Logic of Scientific Discovery. Karl Popper - 1959.
Studies in the Logic of Confirmation. Carl G. Hempel - 1983 - In Peter Achinstein (ed.), The Concept of Evidence. Oxford University Press. pp. 1-26.
