Authors
Abstract
Objective Bayesian epistemology invokes three norms: the strengths of our beliefs should be probabilities, they should be calibrated to our evidence of physical probabilities, and they should otherwise equivocate sufficiently between the basic propositions that we can express. The three norms are sometimes explicated by appealing to the maximum entropy principle, which says that a belief function should be a probability function, from among those calibrated to the evidence, that has maximum entropy. However, the three norms of objective Bayesianism are usually justified in different ways. In this paper we show that the three norms can all be subsumed under a single justification in terms of minimising worst-case expected loss. This, in turn, is equivalent to maximising a generalised notion of entropy. We suggest that requiring language invariance, in addition to minimising worst-case expected loss, motivates maximisation of standard entropy as opposed to maximisation of other instances of generalised entropy. Our argument also provides a qualified justification for updating degrees of belief by Bayesian conditionalisation. However, conditional probabilities play a less central part in the objective Bayesian account than they do under the subjective view of Bayesianism, leading to a reduced role for Bayes’ Theorem.
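As a minimal formal sketch of the principle the abstract refers to (the notation Ω, E and H below is ours, not taken verbatim from the paper): let Ω be the set of elementary outcomes expressible in the agent's language and let E be the set of probability functions on Ω compatible with the evidence of physical probabilities. The maximum entropy principle then recommends adopting a belief function

P_E \in \arg\max_{P \in E} H(P), \qquad H(P) = -\sum_{\omega \in \Omega} P(\omega) \log P(\omega).

The paper's unifying claim can be glossed as follows: the belief function that minimises worst-case expected loss over E coincides with the one that maximises a generalised entropy determined by the loss function, and under logarithmic loss this generalised entropy reduces to the standard Shannon entropy H above.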