Chaos and randomness: An equivalence proof of a generalized version of the Shannon entropy and the Kolmogorov–Sinai entropy for Hamiltonian dynamical systems
|Abstract||Chaos is often explained in terms of random behaviour, and having positive Kolmogorov–Sinai entropy (KSE) is taken to be indicative of randomness. Although seemingly plausible, the association of positive KSE with random behaviour needs justification, since the definition of the KSE does not make reference to any notion connected to randomness. A common way of justifying this use of the KSE is to draw parallels between the KSE and Shannon's information-theoretic entropy. However, as it stands this is no more than a heuristic point, because no rigorous connection between the KSE and Shannon's entropy has been established yet. This paper fills this gap by proving that the KSE of a Hamiltonian dynamical system is equivalent to a generalized version of Shannon's information-theoretic entropy under certain plausible assumptions. © 2005 Elsevier Ltd. All rights reserved.|
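The abstract's central objects can be illustrated concretely. The sketch below is not from the paper; it uses the textbook doubling map example (an assumption for illustration) to show how the Shannon entropy of successively refined partitions grows, with the KSE appearing as the growth rate:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p_i * log p_i) of a probability distribution."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def kse_estimate(n):
    """Hypothetical illustration: for the doubling map x -> 2x mod 1 with the
    partition {[0,1/2), [1/2,1)}, the n-fold refined partition has 2**n cells
    of equal Lebesgue measure.  The KSE is the growth rate H_n / n of the
    refined partitions' Shannon entropy, which here equals log 2."""
    cells = 2 ** n
    h_n = shannon_entropy([1 / cells] * cells)
    return h_n / n

print(abs(kse_estimate(10) - math.log(2)) < 1e-12)  # growth rate matches log 2
```

Here the KSE of the map coincides with the per-step Shannon entropy of the refining partitions, which is the kind of information-theoretic reading of the KSE that the paper aims to make rigorous.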
Similar books and articles
Jeffrey S. Wicken (1987). Entropy and Information: Suggestions for Common Language. Philosophy of Science 54 (2):176-193.
Peter D. Grünwald & Paul M. B. Vitányi (2003). Kolmogorov Complexity and Information Theory. With an Interpretation in Terms of Questions and Answers. Journal of Logic, Language and Information 12 (4):497-529.
Janneke Van Lith (1999). Reconsidering the Concept of Equilibrium in Classical Statistical Mechanics. Philosophy of Science 66:S107 - S118.
James Ladyman, Stuart Presnell & Anthony J. Short (2008). The Use of the Information-Theoretic Entropy in Thermodynamics. Studies in History and Philosophy of Science Part B 39 (2):315-324.
Joseph Berkovitz, Roman Frigg & Fred Kronz (2006). The Ergodic Hierarchy, Randomness and Hamiltonian Chaos. Studies in History and Philosophy of Science Part B 37 (4):661-691.
Roman Frigg (2004). In What Sense is the Kolmogorov-Sinai Entropy a Measure for Chaotic Behaviour?—Bridging the Gap Between Dynamical Systems Theory and Communication Theory. British Journal for the Philosophy of Science 55 (3):411 - 434.
Added to index: 2009-01-28