Abstract
New success criteria for inductive inference in computational learning theory are introduced that model the learning of total (not necessarily recursive) functions with (possibly everywhere) imprecise theories from (possibly always) inaccurate data. It is proved that for any level of error allowed by the new success criteria, there exists a class ϑ of recursive functions such that not every f ∈ ϑ is identifiable via the criterion at that level of error. Necessary and sufficient conditions on the error level are also given for when more classes of functions become identifiable.
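To fix intuitions, the following is a minimal, purely illustrative sketch of Gold-style identification in the limit from inaccurate data, the kind of success criterion the abstract generalizes. The function class, the majority-vote learner, and the noise model here are assumptions for illustration only, not the paper's constructions.

```python
from collections import Counter

def learner(data_points):
    """Conjecture a finite function table by majority vote per argument,
    so that occasional inaccurate data are outvoted by accurate repeats."""
    votes = {}
    for x, y in data_points:
        votes.setdefault(x, Counter())[y] += 1
    return {x: c.most_common(1)[0][0] for x, c in votes.items()}

# Target function (illustrative): f(x) = 2*x on a finite sample.
# The data stream repeats each accurate datum three times and injects
# one inaccurate datum (x=1 reported as 5 instead of 2).
f = lambda x: 2 * x
stream = [(x, f(x)) for x in range(4)] * 3 + [(1, 5)]

hypothesis = learner(stream)
# The single inaccuracy is outvoted, so the conjecture agrees with f
# on every sampled argument.
assert all(hypothesis[x] == f(x) for x in range(4))
```

The paper's criteria are stricter and more general than this sketch suggests: they bound both the imprecision of the final theory and the inaccuracy of the data stream, and the negative result shows that no such bound suffices for all classes of recursive functions.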
Changizi, M. Function identification from noisy data with recursive error bounds. Erkenntnis 45, 91–102 (1996). https://doi.org/10.1007/BF00226372