Philosophy of Science 65 (3):472-501 (1998)

Author
William F. Harms
Seattle Central Community College
Abstract
Information theory offers a measure of "mutual information" that provides an appropriate measure of tracking efficiency for the naturalistic epistemologist. The statistical entropy on which it is based is arguably the best way of characterizing the uncertainty associated with the behavior of a system, and it is ontologically neutral. Though not appropriate for the naturalization of meaning, mutual information can serve as a measure of epistemic success independent of semantic maps and payoff structures. While not containing payoffs as terms, mutual information places both upper and lower bounds on payoffs. This constitutes a non-trivial relationship to utility.
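For readers unfamiliar with the formalism, the standard Shannon definitions behind these claims are sketched below (a minimal sketch; the labels X for the tracked system and Y for the tracker's state are illustrative, not the paper's own notation):

    H(X) = -\sum_{x} p(x) \log p(x)                    % statistical entropy: uncertainty about X
    H(X \mid Y) = -\sum_{x,y} p(x,y) \log p(x \mid y)  % residual uncertainty about X given Y
    I(X;Y) = H(X) - H(X \mid Y)                        % mutual information: uncertainty removed by tracking

Mutual information is non-negative, symmetric, and zero exactly when X and Y are statistically independent, which is what lets it quantify tracking success without reference to semantic content or payoff structures.
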
DOI 10.1086/392657

Citations of this work

Luciano Floridi, "Semantic Conceptions of Information," Stanford Encyclopedia of Philosophy (2008).
Marcin Miłkowski, "Thinking About Semantic Information," Avant: Trends in Interdisciplinary Studies 11 (2):1-10 (2020).
