William Harms (1998). The Use of Information Theory in Epistemology. Philosophy of Science 65 (3):472-501.
Abstract
Information theory offers a measure of "mutual information" which provides an appropriate measure of tracking efficiency for the naturalistic epistemologist. The statistical entropy on which it is based is arguably the best way of characterizing the uncertainty associated with the behavior of a system, and it is ontologically neutral. Though not appropriate for the naturalization of meaning, mutual information can serve as a measure of epistemic success independent of semantic maps and payoff structures. While not containing payoffs as terms, mutual information places both upper and lower bounds on payoffs. This constitutes a non-trivial relationship to utility.
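As a rough illustration of the quantities the abstract invokes (a sketch, not drawn from the paper itself), the following computes the mutual information I(W; S) between a binary world state W and a noisy signal S for a made-up joint distribution; the distribution and all names are hypothetical.

from math import log2

# Hypothetical joint distribution p(world, signal): a detector tracking a
# binary world state with some noise. Keys are (world, signal) pairs.
joint = {
    (0, 0): 0.40, (0, 1): 0.10,
    (1, 0): 0.05, (1, 1): 0.45,
}

def entropy(dist):
    """Shannon entropy H = -sum p * log2(p), in bits."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# Marginal distributions p(world) and p(signal).
p_world, p_signal = {}, {}
for (w, s), p in joint.items():
    p_world[w] = p_world.get(w, 0.0) + p
    p_signal[s] = p_signal.get(s, 0.0) + p

# Mutual information I(W; S) = H(W) + H(S) - H(W, S): the reduction in
# uncertainty about the world state afforded by observing the signal.
mi = entropy(p_world) + entropy(p_signal) - entropy(joint)
print(f"I(W; S) = {mi:.3f} bits")  # ~0.397 bits for this toy distribution

On this picture, a perfectly tracking detector would recover the full H(W) = 1 bit, while a signal independent of the world would yield 0 bits; the paper's claim is that this quantity bounds achievable payoffs without itself containing payoff terms.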
Similar books and articles
Don Fallis (2004). Epistemic Value Theory and Information Ethics. Minds and Machines 14 (1):101-117.
Joseph F. Hanna (1978). On Transmitted Information as a Measure of Explanatory Power. Philosophy of Science 45 (4):531-562.
Orlin Vakarelov (2010). Pre-Cognitive Semantic Information. Knowledge, Technology & Policy 23 (2):193-226.
Orlin Vakarelov (2012). The Information Medium. Philosophy and Technology 25 (1):47-65.
Jeffrey S. Wicken (1987). Entropy and Information: Suggestions for Common Language. Philosophy of Science 54 (2):176-193.
William Harms (1997). Reliability and Novelty: Information Gain in Multi-Level Selection Systems. Erkenntnis 46 (3):335-363.
Peter D. Grünwald & Paul M. B. Vitányi (2003). Kolmogorov Complexity and Information Theory. With an Interpretation in Terms of Questions and Answers. Journal of Logic, Language and Information 12 (4):497-529.
Simon D'Alfonso (2011). On Quantifying Semantic Information. Information 2 (1):61-101.