Philosophy of Science 65 (3):472-501 (1998)
Information theory offers a measure of "mutual information" which provides an appropriate measure of tracking efficiency for the naturalistic epistemologist. The statistical entropy on which it is based is arguably the best way of characterizing the uncertainty associated with the behavior of a system, and it is ontologically neutral. Though not appropriate for the naturalization of meaning, mutual information can serve as a measure of epistemic success independent of semantic maps and payoff structures. While not containing payoffs as terms, mutual information places both upper and lower bounds on payoffs. This constitutes a non-trivial relationship to utility.
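The quantities the abstract relies on can be sketched numerically. Mutual information is standardly defined as I(X;Y) = H(X) + H(Y) − H(X,Y), where H is Shannon entropy. The sketch below is illustrative only; the weather example, variable names, and the framing of "world state" versus "indicator state" are assumptions for demonstration, not taken from the paper.

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum p * log2(p), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), computed from a joint
    distribution given as a dict {(x, y): probability}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return entropy(px.values()) + entropy(py.values()) - entropy(joint.values())

# Hypothetical example: a system "tracking" the weather.
# A perfectly tracking indicator: its state always matches the world.
perfect = {("rain", "rain"): 0.5, ("sun", "sun"): 0.5}
# An uninformative indicator: its state is independent of the world.
useless = {("rain", "rain"): 0.25, ("rain", "sun"): 0.25,
           ("sun", "rain"): 0.25, ("sun", "sun"): 0.25}

print(mutual_information(perfect))  # 1.0 bit
print(mutual_information(useless))  # 0.0 bits
```

On this picture, tracking efficiency is measured without reference to what the indicator states *mean* or what acting on them pays: only the statistical dependence between world and indicator enters the calculation.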
Citations of this work
Andrew Robinson & Christopher Southgate (2010). A General Definition of Interpretation and its Application to Origin of Life Research. Biology and Philosophy 25 (2):163-181.
Jonas Clausen Mork (2013). Uncertainty, Credal Sets and Second Order Probability. Synthese 190 (3):353-378.
Similar books and articles
Don Fallis (2004). Epistemic Value Theory and Information Ethics. Minds and Machines 14 (1):101-117.
Joseph F. Hanna (1978). On Transmitted Information as a Measure of Explanatory Power. Philosophy of Science 45 (4):531-562.
Orlin Vakarelov (2010). Pre-Cognitive Semantic Information. Knowledge, Technology & Policy 23 (2):193-226.
Orlin Vakarelov (2012). The Information Medium. Philosophy and Technology 25 (1):47-65.
Jeffrey S. Wicken (1987). Entropy and Information: Suggestions for Common Language. Philosophy of Science 54 (2):176-193.
William Harms (1997). Reliability and Novelty: Information Gain in Multi-Level Selection Systems. Erkenntnis 46 (3):335-363.
Peter D. Grünwald & Paul M. B. Vitányi (2003). Kolmogorov Complexity and Information Theory. With an Interpretation in Terms of Questions and Answers. Journal of Logic, Language and Information 12 (4):497-529.
Simon D'Alfonso (2011). On Quantifying Semantic Information. Information 2 (1):61-101.