Empirical data sets are algorithmically compressible: Reply to McAllister

James McAllister’s 2003 article, “Algorithmic randomness in empirical data”, claims that empirical data sets are algorithmically random and hence incompressible. We show that this claim is mistaken. We present theoretical arguments and empirical evidence for compressibility, and we discuss the matter within the framework of Minimum Message Length (MML) inference, which shows that the theory that best compresses the data is the one with the highest posterior probability, and hence the best explanation of the data.
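The compressibility claim is easy to probe informally. The sketch below is our own illustration, not an experiment from the paper: the signal model, noise level, and sample count are all assumptions chosen for the demo. It gzip-compresses a noisy periodic byte series (a stand-in for digitized empirical data) and a uniformly random byte series of the same length; the structured series typically shrinks substantially, while the random one does not.

```python
import gzip
import math
import random

random.seed(0)
N = 10_000

# Hypothetical "empirical" series: a periodic signal with small measurement
# noise, quantized to one byte per sample, as digitized instrument data is.
signal = bytes(
    max(0, min(255, round(128 + 100 * math.sin(i / 10) + random.gauss(0, 0.5))))
    for i in range(N)
)

# Algorithmically random series of the same length: uniform random bytes.
noise = bytes(random.randrange(256) for _ in range(N))

for label, data in (("empirical-style", signal), ("random", noise)):
    compressed = gzip.compress(data, compresslevel=9)
    # The structured series compresses to a fraction of its size; the
    # random series stays at (or slightly above) its original length.
    print(f"{label}: {len(data)} -> {len(compressed)} bytes "
          f"(ratio {len(compressed) / len(data):.2f})")
```

The link to explanation runs through the two-part message: minimizing the total message length -log2 P(H) - log2 P(D|H) is equivalent to maximizing P(H)P(D|H), which by Bayes’ theorem is proportional to the posterior P(H|D), so the best-compressing theory is also the most probable one.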
DOI 10.1016/j.shpsa.2005.04.004

References found in this work
Philip Kitcher (1989). Explanatory Unification and the Causal Structure of the World. In Philip Kitcher & Wesley Salmon (eds.), Scientific Explanation. Minneapolis: University of Minnesota Press, 410-505.
J. W. McAllister (2003). Algorithmic Randomness in Empirical Data. Studies in History and Philosophy of Science Part A 34 (3):633-646.
