Authors
Jos Uffink
University of Minnesota
Abstract
The principle of maximum entropy is a method for assigning values to probability distributions on the basis of partial information. In the usual formulations of this and related methods of inference, one assumes that this partial information takes the form of a constraint on the allowed probability distributions. In practical applications, however, the information consists of empirical data. A constraint rule is then employed to construct constraints on probability distributions out of these data. Usually one adopts the rule that equates the expectation values of certain functions with their empirical averages. There are, however, various other ways of constructing constraints from empirical data, and these can lead the maximum entropy principle to very different probability assignments. This paper shows that an argument by Jaynes to justify the usual constraint rule is unsatisfactory, and investigates several alternative choices. The choice of a constraint rule is also shown to be of crucial importance to the debate over whether there is a conflict between the methods of inference based on maximum entropy and Bayesian conditionalization.
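To make the usual constraint rule concrete, here is a minimal sketch (an illustration under stated assumptions, not code from the paper) in Python, using the standard dice example: given only that the empirical average of many die throws is 4.5, maximum entropy selects the exponential-form distribution p_i proportional to exp(-lambda * f(x_i)), with lambda fixed so that the expectation of f equals the empirical average. The function name maxent and the numerical bracketing interval are assumptions of this sketch.

import numpy as np
from scipy.optimize import brentq

def maxent(f_values, m):
    """Sketch: MaxEnt distribution on a finite outcome space given E[f] = m.

    m must lie strictly between min(f_values) and max(f_values).
    """
    f = np.asarray(f_values, dtype=float)

    def expectation(lam):
        # Expectation of f under p_i proportional to exp(-lam * f_i)
        w = np.exp(-lam * f)
        return np.dot(w, f) / w.sum()

    # expectation(lam) decreases monotonically in lam, so a bracket with
    # a sign change suffices to solve expectation(lam) = m by root finding.
    lam = brentq(lambda l: expectation(l) - m, -50.0, 50.0)
    p = np.exp(-lam * f)
    return p / p.sum()

# Dice example: faces 1..6 with empirical average 4.5
p = maxent([1, 2, 3, 4, 5, 6], 4.5)
print(np.round(p, 4), float(np.dot(p, np.arange(1, 7))))

The alternative constraint rules the paper investigates would change which constraint is fed into this maximization, not the entropy-maximizing step itself; different rules applied to the same data therefore yield different distributions p.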
DOI 10.1016/1355-2198(95)00022-4

References found in this work

The Logic of Chance. John Venn - 1888 - Dover Publications.
A Problem for Relative Information Minimizers in Probability Kinematics. Bas C. van Fraassen - 1981 - British Journal for the Philosophy of Science 32 (4):375-379.
Bayesian Conditionalisation and the Principle of Minimum Information. P. M. Williams - 1980 - British Journal for the Philosophy of Science 31 (2):131-144.


Citations of this work

An Empirical Approach to Symmetry and Probability. Jill North - 2010 - Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 41 (1):27-40.
Entropy - A Guide for the Perplexed. Roman Frigg & Charlotte Werndl - 2011 - In Claus Beisbart & Stephan Hartmann (eds.), Probabilities in Physics. Oxford University Press. pp. 115-142.


Similar books and articles

Analysis of the Maximum Entropy Principle "Debate". John F. Cyranski - 1978 - Foundations of Physics 8 (5-6):493-506.
Entropy and Modelling [Entropia a Modelovanie]. Ján Paulov - 2002 - Organon F: Medzinárodný Časopis Pre Analytickú Filozofiu 9 (2):157-175.
Maximum Entropy Inference with Quantified Knowledge. Owen Barnett & Jeff Paris - 2008 - Logic Journal of the IGPL 16 (1):85-98.
Can the Maximum Entropy Principle Be Explained as a Consistency Requirement? Jos Uffink - 1995 - Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 26 (3):223-261.
Application of the Maximum Entropy Principle to Nonlinear Systems Far From Equilibrium. H. Haken - 1993 - In Walter T. Grandy & Peter W. Milonni (eds.), Physics and Probability: Essays in Honor of Edwin T. Jaynes. Cambridge University Press. pp. 239.
Common Sense and Maximum Entropy. Jeff Paris - 1998 - Synthese 117 (1):75-93.
