In many applications of information theory, information measures the reduction of uncertainty that results from the knowledge that an event has occurred. However, an item of information learned need not be the occurrence of an event but, rather, the change in the probability distribution associated with an ensemble of events. This paper examines the basic account of information, which focuses on events, and reviews how it may be naturally generalized to probability distributions/measures. The resulting information measure is a special case of the Rényi information divergence (also known as the Rényi relative entropy). This information measure, herein dubbed the variational information, meaningfully assigns a numerical bit-value to arbitrary state transitions of physical systems. The information topology of these state transitions is characterized canonically by a right and left continuity spectrum defined in terms of the Kantorovich-Wasserstein metric. These continuity spectra provide a theoretical framework for characterizing the informational continuity of evolving systems and for rigorously assessing the degree to which such systems exhibit, or fail to exhibit, continuous change.
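The two quantities the abstract contrasts can be sketched concretely. The event-based account assigns -log2(p) bits to learning that an event of probability p occurred; the distribution-based generalization assigns a Rényi divergence D_alpha(P || Q) in bits to the transition from a prior distribution Q to a posterior P. The sketch below (function names are illustrative, not the paper's) shows that the event-based measure falls out of the Rényi divergence when the posterior concentrates on a single event, for any order alpha:

```python
import math

def surprisal_bits(p):
    """Event-based information: bits gained on learning that an
    event of probability p has occurred."""
    return -math.log2(p)

def renyi_divergence_bits(p, q, alpha):
    """Rényi divergence D_alpha(P || Q) in bits between two discrete
    distributions given as sequences of probabilities.

    For alpha != 1:  D_alpha = log2(sum_i p_i^alpha * q_i^(1-alpha)) / (alpha - 1).
    The alpha -> 1 limit is the Kullback-Leibler divergence.
    """
    if alpha == 1.0:
        return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    s = sum(pi**alpha * qi**(1 - alpha) for pi, qi in zip(p, q) if pi > 0)
    return math.log2(s) / (alpha - 1)

# Learning that an event of prior probability 1/4 occurred is the same as
# moving from the prior Q = (1/4, 3/4) to the degenerate posterior P = (1, 0):
# both assign 2 bits, independently of alpha.
print(surprisal_bits(0.25))                                    # 2.0
print(renyi_divergence_bits([1.0, 0.0], [0.25, 0.75], 2.0))    # 2.0
print(renyi_divergence_bits([1.0, 0.0], [0.25, 0.75], 0.5))    # 2.0
```

A posterior that merely reshuffles probability mass without concentrating on one event still receives a positive bit-value under the divergence, which is the sense in which the distributional account generalizes the event-based one.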
Similar books and articles
Sven Ove Hansson (2009). Measuring Uncertainty. Studia Logica 93 (1):21 - 40.
G. C. (2003). On a Supposed Conceptual Inadequacy of the Shannon Information in Quantum Mechanics. Studies in History and Philosophy of Science Part B 34 (3):441-468.
C. Ricotta & G. C. Avena (2002). On the Information-Theoretical Meaning of Hill's Parametric Evenness. Acta Biotheoretica 50 (1):63-71.
Orlin Vakarelov (2010). Pre-Cognitive Semantic Information. Knowledge, Technology & Policy 23 (2):193-226.
Jaakko Hintikka (1984). Some Varieties of Information. Information Processing and Management 20 (1-2):175-181.
Simon D'Alfonso (2011). On Quantifying Semantic Information. Information 2 (1):61-101.
Orlin Vakarelov (2012). The Information Medium. Philosophy and Technology 25 (1):47-65.
William F. Harms (1998). The Use of Information Theory in Epistemology. Philosophy of Science 65 (3):472-501.
Added to index: 2009-01-28