4 found
See also:
Profile: Steve Gardner (Monash University)
  1.
    David L. Dowe, Steve Gardner & Graham Oppy (2007). Bayes Not Bust! Why Simplicity Is No Problem for Bayesians. British Journal for the Philosophy of Science 58 (4):709-754.
    The advent of formal definitions of the simplicity of a theory has important implications for model selection. But what is the best way to define simplicity? Forster and Sober (1994) advocate the use of Akaike's Information Criterion (AIC), a non-Bayesian formalisation of the notion of simplicity. This forms an important part of their wider attack on Bayesianism in the philosophy of science. We defend a Bayesian alternative: the simplicity of a theory is to be characterised in terms of Wallace's Minimum Message Length (MML). We show that AIC is inadequate for many statistical problems where MML performs well. Whereas MML is always defined, AIC can be undefined. Whereas MML is not known ever to be statistically inconsistent, AIC can be. Even when defined and consistent, AIC performs worse than MML on small sample sizes. MML is statistically invariant under 1-to-1 re-parametrisation, thus avoiding a common criticism of Bayesian approaches. We also show that MML provides answers to many of Forster's objections to Bayesianism. Hence an important part of the attack on Bayesianism fails.
  2.
    Charles Twardy, Steve Gardner & David Dowe (2005). Empirical Data Sets Are Algorithmically Compressible: Reply to McAllister. Studies in History and Philosophy of Science Part A 36 (2):391-402.
    James McAllister's 2003 article, "Algorithmic randomness in empirical data", claims that empirical data sets are algorithmically random, and hence incompressible. We show that this claim is mistaken. We present theoretical arguments and empirical evidence for compressibility, and discuss the matter in the framework of Minimum Message Length (MML) inference, which shows that the theory which best compresses the data is the one with the highest posterior probability, and the best explanation of the data.
  3. Graham Robert Oppy, Nick Trakakis, Lynda Burns, Steven Gardner & Fiona Leigh (eds.) (2014). A Companion to Philosophy in Australia & New Zealand. Monash University Publishing.
  4. Graham Oppy, Nick Trakakis, Steve Gardner, Fiona Leigh & Lynda Burns (eds.) (forthcoming). Companion to Philosophy in Australasia. Monash E-Press.