Chaos and randomness: An equivalence proof of a generalized version of the Shannon entropy and the Kolmogorov–Sinai entropy for Hamiltonian dynamical systems
Chaos is often explained in terms of random behaviour, and having positive Kolmogorov–Sinai entropy (KSE) is taken to be indicative of randomness. Although seemingly plausible, the association of positive KSE with random behaviour needs justification, since the definition of the KSE does not make reference to any notion that is connected to randomness. A common way of justifying this use of the KSE is to draw parallels between the KSE and Shannon's information-theoretic entropy. However, as it stands this is no more than a heuristic point, because no rigorous connection between the KSE and Shannon's entropy has been established yet. This paper fills this gap by proving that the KSE of a Hamiltonian dynamical system is equivalent to a generalized version of Shannon's information-theoretic entropy under certain plausible assumptions. © 2005 Elsevier Ltd. All rights reserved.
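To make the abstract's central quantities concrete, here is a minimal sketch of Shannon's information-theoretic entropy for a discrete distribution; the function name and the example distribution are illustrative choices, not taken from the paper. The KSE can then be understood, roughly, as the supremum over finite partitions of the per-step rate of this quantity along the system's trajectories.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(p) = -sum_i p_i * log2(p_i) of a discrete
    probability distribution, in bits.

    Zero-probability cells contribute nothing, matching the convention
    0 * log 0 = 0 used in information theory.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A uniform distribution over 4 partition cells carries 2 bits:
# each observation of which cell the trajectory lands in is maximally
# informative, the situation associated with random behaviour.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # → 2.0
```

On this picture, positive KSE means that, even in the long run, coarse-grained observations of the dynamics keep supplying new information at a positive rate, which is why it is taken as a signature of randomness.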
Similar books and articles
Jeffrey S. Wicken (1987). Entropy and Information: Suggestions for Common Language. Philosophy of Science 54 (2):176-193.
Peter D. Grünwald & Paul M. B. Vitányi (2003). Kolmogorov Complexity and Information Theory. With an Interpretation in Terms of Questions and Answers. Journal of Logic, Language and Information 12 (4):497-529.
Janneke Van Lith (1999). Reconsidering the Concept of Equilibrium in Classical Statistical Mechanics. Philosophy of Science 66:S107 - S118.
Roman Frigg & Charlotte Werndl (2011). Entropy-A Guide for the Perplexed. In Claus Beisbart & Stephan Hartmann (eds.), Probabilities in Physics. Oxford University Press.
James Ladyman, Stuart Presnell & Anthony J. Short (2008). The Use of the Information-Theoretic Entropy in Thermodynamics. Studies in History and Philosophy of Science Part B 39 (2):315-324.
Joseph Berkovitz, Roman Frigg & Fred Kronz (2006). The Ergodic Hierarchy, Randomness and Hamiltonian Chaos. Studies in History and Philosophy of Science Part B 37 (4):661-691.
Roman Frigg (2004). In What Sense is the Kolmogorov-Sinai Entropy a Measure for Chaotic Behaviour?—Bridging the Gap Between Dynamical Systems Theory and Communication Theory. British Journal for the Philosophy of Science 55 (3):411 - 434.
Added to index: 2009-01-28