The use of information theory in epistemology

Philosophy of Science 65 (3):472-501 (1998)
Information theory offers a measure of "mutual information" which provides an appropriate measure of tracking efficiency for the naturalistic epistemologist. The statistical entropy on which it is based is arguably the best way of characterizing the uncertainty associated with the behavior of a system, and it is ontologically neutral. Though not appropriate for the naturalization of meaning, mutual information can serve as a measure of epistemic success independent of semantic maps and payoff structures. While not containing payoffs as terms, mutual information places both upper and lower bounds on payoffs. This constitutes a non-trivial relationship to utility.
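The quantities the abstract appeals to are standard: Shannon entropy characterizes the uncertainty in a system's behavior, and mutual information I(X;Y) = H(X) + H(Y) - H(X,Y) measures how much knowing one variable reduces uncertainty about the other. As a minimal illustration (the joint distribution below is hypothetical, not drawn from the paper), mutual information between "world states" and "belief states" can be computed directly from a joint probability table:

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum p * log2(p), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint distribution P(world state, belief state):
# rows = world states, columns = belief states.
joint = [
    [0.4, 0.1],
    [0.1, 0.4],
]

p_x = [sum(row) for row in joint]        # marginal over world states
p_y = [sum(col) for col in zip(*joint)]  # marginal over belief states

# Mutual information I(X;Y) = H(X) + H(Y) - H(X,Y).
h_xy = entropy(p for row in joint for p in row)
mi = entropy(p_x) + entropy(p_y) - h_xy
print(round(mi, 4))  # → 0.2781
```

Here the beliefs track the world imperfectly, so I(X;Y) ≈ 0.28 bits: positive (tracking is better than chance) but well below the 1 bit of uncertainty in the world states. Note that nothing in the calculation refers to payoffs or semantic content, which is the sense in which the measure is independent of semantic maps and payoff structures.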
DOI 10.1086/392657