Chaos and randomness: An equivalence proof of a generalized version of the Shannon entropy and the Kolmogorov–Sinai entropy for Hamiltonian dynamical systems

Abstract

Chaos is often explained in terms of random behaviour; and having positive Kolmogorov–Sinai entropy (KSE) is taken to be indicative of randomness. Although seemingly plausible, the association of positive KSE with random behaviour needs justification, since the definition of the KSE does not make reference to any notion that is connected to randomness. A common way of justifying this use of the KSE is to draw parallels between the KSE and Shannon's information-theoretic entropy. However, as it stands this is no more than a heuristic point, because no rigorous connection between the KSE and Shannon's entropy has been established yet. This paper fills this gap by proving that the KSE of a Hamiltonian dynamical system is equivalent to a generalized version of Shannon's information-theoretic entropy under certain plausible assumptions. © 2005 Elsevier Ltd. All rights reserved.

Links

PhilArchive

Similar books and articles

Bridging Conceptual Gaps: The Kolmogorov-Sinai Entropy.Massimiliano Badino - forthcoming - Isonomía. Revista de Teoría y Filosofía Del Derecho.
The ergodic hierarchy, randomness and Hamiltonian chaos.Joseph Berkovitz, Roman Frigg & Fred Kronz - 2006 - Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 37 (4):661-691.
The use of the information-theoretic entropy in thermodynamics.James Ladyman, Stuart Presnell & Anthony J. Short - 2008 - Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 39 (2):315-324.
Information, entropy and inductive logic.S. Pakswer - 1954 - Philosophy of Science 21 (3):254-259.

Analytics

Added to PP
2009-01-28

Author's Profile

Roman Frigg
London School of Economics

Citations of this work

Entropy - A Guide for the Perplexed.Roman Frigg & Charlotte Werndl - 2011 - In Claus Beisbart & Stephan Hartmann (eds.), Probabilities in Physics. Oxford, GB: Oxford University Press. pp. 115-142.
