Global Robustness with Respect to the Loss Function and the Prior

Theory and Decision 48 (4):359-381 (2000)
We propose a class [I,S] of loss functions for modeling the imprecise preferences of the decision maker in Bayesian Decision Theory. This class is built upon two extreme loss functions, I and S, which reflect the limited information about the loss function. We give an approximation of the set of Bayes actions for every loss function in [I,S] and every prior in a mixture class; if the decision space is a subset of R, we obtain the exact set of Bayes actions.
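To fix notation for readers, the following display sketches the standard global-robustness objects the abstract refers to; the symbols (L, \pi, \Gamma, a^{*}) and the epsilon-contamination form of the mixture class are illustrative choices of this summary, not necessarily the paper's own.

\[
[I,S] = \{\, L : I(\theta,a) \le L(\theta,a) \le S(\theta,a) \ \text{for all } (\theta,a) \,\},
\]
\[
a^{*}(L,\pi) = \arg\min_{a \in \mathcal{A}} \int_{\Theta} L(\theta,a)\,\pi(\theta \mid x)\,d\theta,
\]
\[
\mathcal{B} = \{\, a^{*}(L,\pi) : L \in [I,S],\ \pi \in \Gamma \,\},
\qquad
\Gamma = \{\, (1-\varepsilon)\pi_{0} + \varepsilon q : q \in \mathcal{Q} \,\}.
\]

Under this reading, the paper's results approximate the set \mathcal{B} in general and characterize it exactly when the decision space \mathcal{A} is a subset of \mathbb{R}.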
Keywords: Bayesian Decision Theory, Global robustness, Loss function, Mixture class
DOI: 10.1023/A:1005212125699

Similar books and articles
Bas C. Van Fraassen (2006). Vague Expectation Value Loss. Philosophical Studies 127 (3):483-491.
Ezio Di Nucci (2013). Embryo Loss and Double Effect. Journal of Medical Ethics 39 (8):537-540.
Brian Sayers (1987). Death as a Loss. Faith and Philosophy 4 (2):149-159.
Freya Mathews (2010). Planetary Collapse Disorder. Environmental Ethics 32 (4):353-367.
