In a previous paper a theory of program size formally identical to information theory was developed. The entropy of an individual finite object was defined to be the size in bits of the smallest program for calculating it. It was shown that this is − log2 of the probability that the object is obtained by means of a program whose successive bits are chosen by flipping an unbiased coin. Here a theory of the entropy of recursively enumerable sets of objects is proposed which includes the previous theory as the special case of sets having a single element. The primary concept in the generalized theory is the probability that a computing machine enumerates a given set when its program is manufactured by coin flipping. The entropy of a set is defined to be − log2 of this probability.
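The definitions stated in the abstract can be summarized in symbols. This is a sketch only: the notation $U$ for the universal computer and $p$ for a binary program is assumed here (following Chaitin's usual conventions) and does not appear in the abstract itself.

```latex
% Program-size entropy of an individual finite object x (the previous theory):
% H(x) is the size in bits of a smallest program that calculates x, and equals
% -log2 of the probability that a coin-flipped program yields x.
H(x) \;=\; \min\{\, |p| \;:\; U(p) = x \,\}
      \;=\; -\log_2 P(x),
\qquad
P(x) \;=\; \sum_{p \,:\, U(p) = x} 2^{-|p|}.

% Generalization to a recursively enumerable set S of objects:
% P(S) is the probability that a program manufactured by coin flipping
% makes U enumerate exactly the set S.
H(S) \;=\; -\log_2 P(S),
\qquad
P(S) \;=\; \sum_{p \,:\, U \text{ enumerates } S \text{ on } p} 2^{-|p|}.
```

For a singleton set $S = \{x\}$, the second definition reduces to the first, which is the sense in which the generalized theory includes the previous one as a special case.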
Similar books and articles
John P. Burgess (1988). Sets and Point-Sets: Five Grades of Set-Theoretic Involvement in Geometry. PSA: Proceedings of the Biennial Meeting of the Philosophy of Science Association 1988:456-463.
David Ellerman, A Short Note on the Logico-Conceptual Foundations of Information Theory in Partition Logic.
W. J. (2003). Algorithmic Randomness in Empirical Data. Studies in History and Philosophy of Science Part A 34 (3):633-646.
Jeffrey S. Wicken (1987). Entropy and Information: Suggestions for Common Language. Philosophy of Science 54 (2):176-193.
Roman Frigg & Charlotte Werndl (2011). Entropy-A Guide for the Perplexed. In Claus Beisbart & Stephan Hartmann (eds.), Probabilities in Physics. Oxford University Press.
Yeram Sarkis Touloukian (1956). The Concept of Entropy in Communication. Lafayette, Ind.: Purdue University.
Ayhan Sol (2007). Entropy, Disorder, and Traces. The Proceedings of the Twenty-First World Congress of Philosophy 12:149-153.
Elliott Sober & Mike Steel (2011). Entropy Increase and Information Loss in Markov Models of Evolution. Biology and Philosophy 26 (2):223-250.
Jonas Clausen Mork (2013). Uncertainty, Credal Sets and Second Order Probability. Synthese 190 (3):353-378.