In a previous paper a theory of program size formally identical to information theory was developed. The entropy of an individual finite object was defined to be the size in bits of the smallest program for calculating it. It was shown that this is − log2 of the probability that the object is obtained by means of a program whose successive bits are chosen by flipping an unbiased coin. Here a theory of the entropy of recursively enumerable sets of objects is proposed which includes the previous theory as the special case of sets having a single element. The primary concept in the generalized theory is the probability that a computing machine enumerates a given set when its program is manufactured by coin flipping. The entropy of a set is defined to be − log2 of this probability.
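The abstract's central quantity is the algorithmic probability of an object: the chance that a machine produces it when its program's successive bits are chosen by flipping an unbiased coin, with entropy defined as −log2 of that probability. The idea can be illustrated with a deliberately toy "computer" — `toy_machine` below is a hypothetical prefix-free interpreter invented for this sketch, not the universal machine of the theory, and in the real theory many distinct programs contribute to each object's probability.

```python
import math
from fractions import Fraction
from itertools import product

def toy_machine(program):
    # Hypothetical toy computer: a run of k ones terminated by a zero
    # outputs the string 'a' * k; every other bit string diverges.
    # The set of halting programs {'0', '10', '110', ...} is prefix-free,
    # so the coin-flipping probabilities of halting programs sum to 1.
    if program.endswith('0') and set(program[:-1]) <= {'1'}:
        return 'a' * (len(program) - 1)
    return None

def algorithmic_probability(target, max_len=12):
    # P(target) = sum of 2^(-len(p)) over halting programs p that
    # output target, i.e. the probability that coin-flipped program
    # bits happen to spell out such a program.
    p = Fraction(0)
    for n in range(1, max_len + 1):
        for bits in product('01', repeat=n):
            if toy_machine(''.join(bits)) == target:
                p += Fraction(1, 2 ** n)
    return p

prob = algorithmic_probability('aaa')   # only program '1110' works: 1/16
entropy = -math.log2(prob)              # H = -log2 P = 4 bits
```

Because this toy machine has exactly one halting program per output, the entropy coincides with the length of the shortest (here, only) program — 4 bits for `'aaa'` — mirroring the paper's identification of entropy with program size.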
Similar books and articles
John P. Burgess (1988). Sets and Point-Sets: Five Grades of Set-Theoretic Involvement in Geometry. PSA: Proceedings of the Biennial Meeting of the Philosophy of Science Association 1988:456-463.
Elliott Sober & Mike Steel (2011). Entropy Increase and Information Loss in Markov Models of Evolution. Biology and Philosophy 26 (2):223-250.
Mujdat Pakkan & Varol Akman (1995). Hypersolver: A Graphical Tool for Commonsense Set Theory. Philosophical Explorations.
Ayhan Sol (2007). Entropy, Disorder, and Traces. The Proceedings of the Twenty-First World Congress of Philosophy 12:149-153.
Yeram Sarkis Touloukian (1956). The Concept of Entropy in Communication. Lafayette, Ind.: Purdue University.
Roman Frigg & Charlotte Werndl (2011). Entropy: A Guide for the Perplexed. In Claus Beisbart & Stephan Hartmann (eds.), Probabilities in Physics. Oxford University Press.
Jeffrey S. Wicken (1987). Entropy and Information: Suggestions for Common Language. Philosophy of Science 54 (2):176-193.
W. J. (2003). Algorithmic Randomness in Empirical Data. Studies in History and Philosophy of Science Part A 34 (3):633-646.
David Ellerman, A Short Note on the Logico-Conceptual Foundations of Information Theory in Partition Logic.
Jonas Clausen Mork (2013). Uncertainty, Credal Sets and Second Order Probability. Synthese 190 (3):353-378.