David Ellerman
University of Ljubljana
The logical basis for information theory is the newly developed logic of partitions, which is dual to the usual Boolean logic of subsets. The key concept is a "distinction" of a partition: an ordered pair of elements in distinct blocks of the partition. The logical concept of entropy based on partition logic is the normalized counting measure of the set of distinctions of a partition on a finite set, just as the usual logical notion of probability based on the Boolean logic of subsets is the normalized counting measure of the subsets (events). Thus logical entropy is a measure on the set of ordered pairs, and all the compound notions of entropy (joint entropy, conditional entropy, and mutual information) arise in the usual way from that measure (e.g., via the inclusion-exclusion principle), just like the corresponding notions of probability. The usual Shannon entropy of a partition is obtained by replacing the normalized count of distinctions (dits) with the average number of binary partitions (bits) needed to make all the distinctions of the partition.
Keywords: Shannon entropy, logical entropy, partition logic
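The abstract's central definition can be made concrete in a few lines. The sketch below (illustrative code, not from the paper; the representation of a partition as a list of blocks and the function names are assumptions) counts the distinctions of a partition on a finite set: the ordered pairs in distinct blocks number n² minus the sum of the squared block sizes, so the logical entropy is that count normalized by n². Shannon entropy of the same partition is computed for comparison.

```python
from math import log2

def logical_entropy(blocks):
    """Logical entropy h(pi) = |dit(pi)| / n^2, where dit(pi) is the set of
    ordered pairs of elements lying in distinct blocks of the partition.
    `blocks` is a partition given as a list of disjoint lists (illustrative
    representation, assumed here)."""
    n = sum(len(b) for b in blocks)
    # Ordered pairs in distinct blocks: all n^2 pairs minus same-block pairs.
    dits = n * n - sum(len(b) ** 2 for b in blocks)
    return dits / (n * n)  # equivalently 1 - sum((|B|/n)^2)

def shannon_entropy(blocks):
    """Shannon entropy H(pi) = -sum p_B log2 p_B with p_B = |B|/n."""
    n = sum(len(b) for b in blocks)
    return -sum((len(b) / n) * log2(len(b) / n) for b in blocks)

# A binary partition of a 4-element set makes half of all ordered pairs
# into distinctions (h = 0.5), while one bit suffices for Shannon (H = 1).
halves = [[0, 1], [2, 3]]
print(logical_entropy(halves))   # 0.5
print(shannon_entropy(halves))   # 1.0

# The discrete partition into singletons distinguishes every pair of
# distinct elements: h = 1 - 1/n = 0.75, H = log2(4) = 2.
singletons = [[0], [1], [2], [3]]
print(logical_entropy(singletons))  # 0.75
print(shannon_entropy(singletons))  # 2.0
```

Note that logical entropy is bounded by 1 - 1/n, whereas Shannon entropy grows as log2(n); the two agree in spirit (both vanish exactly on the indiscrete partition) but normalize distinctions differently, as the abstract explains.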
References

Claude E. Shannon (1948). A Mathematical Theory of Communication. Bell System Technical Journal 27: 379–423.
Similar books and articles

David Ellerman (2014). An Introduction to Partition Logic. Logic Journal of the IGPL 22 (1): 94–125.
Satoshi Iriyama & Masanori Ohya (2011). Quantum Mutual Entropy Defined by Liftings. Foundations of Physics 41 (3): 406–413.
Carl A. Hein (1979). Entropy in Operational Statistics and Quantum Logic. Foundations of Physics 9 (9–10): 751–786.
W. T. Grandy (2004). Time Evolution in Macroscopic Systems. II. The Entropy. Foundations of Physics 34 (1): 21–57.
Robert H. Swendsen (2012). Choosing a Definition of Entropy That Works. Foundations of Physics 42 (4): 582–593.
Vera Koponen (2009). Entropy of Formulas. Archive for Mathematical Logic 48 (6): 515–522.
Jacob D. Bekenstein (2005). How Does the Entropy/Information Bound Work? Foundations of Physics 35 (11): 1805–1823.
