What Is Information? Three Concepts

Biological Theory 1 (3):230-242 (2006)

Abstract

The concept of information tempts us as a theoretical primitive, partly because of the respectability lent to it by highly successful applications of Shannon’s information theory, partly because of its broad range of applicability in various domains, and partly because of its neutrality with respect to what basic sorts of things there are. This versatility, however, is the very reason why information cannot be the theoretical primitive we might like it to be. “Information,” as it is variously used, is systematically ambiguous between whether it involves continuous or discrete quantities, causal or noncausal relationships, and intrinsic or relational properties. Many uses can be firmly grounded in existing theory, however. Continuous quantities of information involving probabilities can be related to information theory proper. Information defined relative to systems of rules or conventions can be understood relative to the theory of meaning. A number of causal notions may be located relative to standard notions in physics. Precise specification of these distinct properties involved in the common notion of information can allow us to map the relationships between them. Consequently, while information is not in itself the kind of single thing that can play a significant unifying role, analyzing its ambiguities may facilitate headway toward that goal.


Links

PhilArchive




Analytics

Added to PP
2010-08-24


Author's Profile

William F. Harms
Seattle Central Community College