Author
Orly Shenker
Hebrew University of Jerusalem
Abstract
Information, entropy, probability: these three terms are closely interconnected in the prevalent understanding of statistical mechanics, both when this field is taught to students at an introductory level and in advanced research into the field's foundations. This paper examines the interconnection between these three notions in light of recent research in the foundations of statistical mechanics. It disentangles these concepts and highlights their differences, while explaining why they came to be so closely linked in the literature. The term 'information' is often linked to entropy and probability in discussions of Maxwell's Demon and its attempted exorcism by the Landauer-Bennett thesis, and in analyses of the spin echo experiments. The present paper takes a different direction: it discusses the statistical mechanical underpinning of the notions of probability and entropy, and this constructive approach shows that information plays no fundamental role in these concepts, although it can be conveniently used in a sense that we shall specify.
Reprint years 2019, 2020
DOI 10.1007/s13194-019-0274-4

Citations of this work

Calling for explanation: the case of the thermodynamic past state.Dan Baras & Orly Shenker - 2020 - European Journal for Philosophy of Science 10 (3):1-20.

Similar books and articles

Choosing a Definition of Entropy That Works.Robert H. Swendsen - 2012 - Foundations of Physics 42 (4):582-593.
Is −kTr(ρ ln ρ) the Entropy in Quantum Mechanics? Orly Shenker - 1999 - British Journal for the Philosophy of Science 50 (1):33-48.
Entropy in Operational Statistics and Quantum Logic.Carl A. Hein - 1979 - Foundations of Physics 9 (9-10):751-786.
Demons in Physics. [REVIEW]Amit Hagar - 2014 - Metascience 23 (2):1-10.
Entropy - A Guide for the Perplexed.Roman Frigg & Charlotte Werndl - 2011 - In Claus Beisbart & Stephan Hartmann (eds.), Probabilities in Physics. Oxford University Press. pp. 115-142.
Maxwell's Demon and the Entropy Cost of Information.Paul N. Fahn - 1996 - Foundations of Physics 26 (1):71-93.
Bridging Conceptual Gaps: The Kolmogorov-Sinai Entropy.Massimiliano Badino - forthcoming - Isonomía. Revista de Teoría y Filosofía Del Derecho.
