Citations of:
The principle of maximum entropy is a method for assigning values to probability distributions on the basis of partial information. In the usual formulations of this and related methods of inference, one assumes that this partial information takes the form of a constraint on the allowed probability distributions. In practical applications, however, the information consists of empirical data. A constraint rule is then employed to construct constraints on probability distributions out of these data. Usually one adopts the rule that equates the expectation (...)
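To make the constraint rule concrete, here is a minimal numerical sketch (not drawn from the cited paper) of Jaynes's familiar dice example: given only an empirical average, one equates it with the expectation of the outcome and selects the distribution of maximum entropy satisfying that constraint. The function name and the bisection bracket below are illustrative choices.

import numpy as np

def maxent_die(mean_target, faces=np.arange(1, 7), tol=1e-10):
    # Maximum entropy distribution over die faces subject to a prescribed
    # expectation value (illustrative sketch; the names are hypothetical).
    # The solution has exponential form p_i proportional to exp(-lam * x_i);
    # the Lagrange multiplier lam is found by bisection, since the
    # expectation under these weights decreases monotonically in lam.
    def mean_for(lam):
        w = np.exp(-lam * faces)
        return (faces * w).sum() / w.sum()
    lo, hi = -50.0, 50.0  # bracket wide enough for any target in (1, 6)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean_for(mid) > mean_target:
            lo = mid  # expectation still too high: need a larger multiplier
        else:
            hi = mid
    w = np.exp(-0.5 * (lo + hi) * faces)
    return w / w.sum()

p = maxent_die(4.5)
print(p)                            # probabilities skewed toward high faces
print((np.arange(1, 7) * p).sum())  # reproduces the constrained mean, ~4.5

With the fair-die average of 3.5 the multiplier is zero and the result is uniform, recovering the principle of insufficient reason mentioned in the next abstract.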
The principle of maximum entropy is a general method for assigning values to probability distributions on the basis of partial information. This principle, introduced by Jaynes in 1957, extends the classical principle of insufficient reason. It has been further generalized, both in mathematical formulation and in intended scope, into the principle of maximum relative entropy, or of minimum information. It has been claimed that these principles are singled out as unique methods of statistical inference that agree with (...)
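For reference, a standard statement of the generalization mentioned here (not a formula taken from the paper itself): the principle of minimum information selects, among the distributions compatible with a constraint, the one minimizing the relative entropy to a prior $q$,

\[
\min_{p}\; D(p \,\|\, q) = \sum_i p_i \ln \frac{p_i}{q_i}
\quad \text{subject to} \quad \sum_i p_i f(x_i) = F, \qquad \sum_i p_i = 1,
\]

with solution $p_i^{*} \propto q_i \, e^{-\lambda f(x_i)}$, where $\lambda$ is fixed by the constraint. Taking $q$ uniform recovers the maximum entropy principle.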
This article distinguishes two senses of information-theoretic approaches to statistical mechanics that are often conflated in the literature: those concerning the thermodynamic cost of computational processes, and those offering an interpretation of statistical mechanics on which the probabilities are treated as epistemic. This distinction is then investigated through Earman and Norton’s ([1999]) ‘sound’ and ‘profound’ dilemma for information-theoretic exorcisms of Maxwell’s demon. It is argued that Earman and Norton fail to countenance a ‘sound’ information-theoretic interpretation, and this paper (...)