Probabilistic models have much to offer to philosophy. We continually receive information from a variety of sources: from our senses, from witnesses, from scientific instruments. When considering whether we should believe this information, we assess whether the sources are independent, how reliable they are, and how plausible and coherent the information is. Bovens and Hartmann provide a systematic Bayesian account of these features of reasoning. Simple Bayesian Networks allow us to model alternative assumptions about the nature of the information sources. Measurement of the coherence of information is a controversial matter: arguably, the more coherent a set of information is, the more confident we may be that its content is true, other things being equal. The authors offer a new treatment of coherence which respects this claim and shows its relevance to scientific theory choice. Bovens and Hartmann apply this methodology to a wide range of much discussed issues regarding evidence, testimony, scientific theories, and voting. Bayesian Epistemology is an essential tool for anyone working on probabilistic methods in philosophy, and has broad implications for many other disciplines.
In their recently published book Nudge (2008) Richard H. Thaler and Cass R. Sunstein (T&S) defend a position labelled as ‘libertarian paternalism’. Their thinking appeals to both the right and the left of the political spectrum, as evidenced by the bedfellows they keep on either side of the Atlantic. In the US, they have advised Barack Obama, while, in the UK, they were welcomed with open arms by David Cameron's camp (Chakrabortty 2008). I will consider the following questions. What is Nudge? How is it different from social advertisement? Does Nudge induce genuine preference change? Does Nudge build moral character? Is there a moral difference between the use of Nudge as opposed to subliminal images to reach policy objectives? And what are the moral constraints on Nudge?
Hope obeys Aristotle's doctrine of the mean: one should neither hope too much, nor too little. But what determines what constitutes too much and what constitutes too little for a particular person at a particular time? The sceptic presents an argument to the effect that it is never rational to hope. An attempt to answer the sceptic leads us in different directions. Decision-theoretic and preference-theoretic arguments support the instrumental value of hope. An investigation into the nature of hope permits us to assess the intrinsic value of hope. However, it must be granted to the sceptic that there is a tension between hope and epistemic rationality. I conclude with some reflections about the relationship between hope and character features that are constitutive of inner strength.
This paper addresses a problem for theories of epistemic democracy. In a decision on a complex issue which can be decomposed into several parts, a collective can use different voting procedures: either its members vote on each sub-question and the answers that gain majority support are used as premises for the conclusion on the main issue (the premise-based procedure), or the vote is conducted on the main issue itself (the conclusion-based procedure). The two procedures can lead to different results. We investigate which of these procedures is better as a truth-tracker, assuming that there exists a true answer to be reached. On the basis of the Condorcet jury theorem, we show that the premise-based procedure is universally superior if the objective is to reach truth for the right reasons. If one instead is after truth for whatever reasons, right or wrong, there will be cases in which the conclusion-based procedure is more reliable, even though, for the most part, the premise-based procedure is still to be preferred.
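The contrast between the two procedures can be made concrete with a minimal Monte Carlo sketch. This is not the authors' analytical model; it assumes a conjunctive agenda (the conclusion is true iff both premises are true), voters who judge each premise correctly with an independent probability p, and it scores only whether the collective conclusion comes out true, i.e. the 'truth for whatever reasons' objective. All parameters are illustrative.

```python
import random

def simulate(n_voters=11, p=0.6, trials=20_000, seed=0):
    """Compare the premise-based (PBP) and conclusion-based (CBP) procedures
    on a conjunctive agenda: the conclusion is true iff both premises are true.
    Each voter judges each premise correctly with independent probability p.
    Returns the frequency with which each procedure reaches the true conclusion."""
    rng = random.Random(seed)
    pbp_correct = cbp_correct = 0
    for _ in range(trials):
        truth = (rng.random() < 0.5, rng.random() < 0.5)   # truth values of the two premises
        true_conclusion = truth[0] and truth[1]

        votes = []  # each voter's judgment on (premise 1, premise 2)
        for _ in range(n_voters):
            votes.append(tuple(t if rng.random() < p else not t for t in truth))

        # PBP: majority vote on each premise, then derive the conclusion from the premises.
        maj = [sum(v[i] for v in votes) > n_voters / 2 for i in (0, 1)]
        pbp_conclusion = maj[0] and maj[1]

        # CBP: each voter derives the conclusion from her own premise judgments,
        # and the group takes a majority vote on the conclusion itself.
        cbp_conclusion = sum(v[0] and v[1] for v in votes) > n_voters / 2

        pbp_correct += (pbp_conclusion == true_conclusion)
        cbp_correct += (cbp_conclusion == true_conclusion)
    return pbp_correct / trials, cbp_correct / trials

if __name__ == "__main__":
    pbp, cbp = simulate()
    print(f"PBP tracks the true conclusion in {pbp:.3f} of trials")
    print(f"CBP tracks the true conclusion in {cbp:.3f} of trials")
```

Varying p and the number of voters gives a feel for the parameter regions in which the two procedures come apart.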
John Locke proposed a straightforward relationship between qualitative and quantitative doxastic notions: belief corresponds to a sufficiently high degree of confidence. Richard Foley has further developed this Lockean thesis and applied it to an analysis of the preface and lottery paradoxes. Following Foley's lead, we exploit various versions of these paradoxes to chart a precise relationship between belief and probabilistic degrees of confidence. The resolutions of these paradoxes emphasize distinct but complementary features of coherent belief. These features suggest principles that tie together qualitative and quantitative doxastic notions. We show how these principles may be employed to construct a quantitative model - in terms of degrees of confidence - of an agent's qualitative doxastic state. This analysis fleshes out the Lockean thesis and provides the foundation for a logic of belief that is responsive to the logic of degrees of confidence.
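A quick numerical illustration of the lottery paradox that drives this analysis (toy numbers, not the authors' own treatment): with a Lockean threshold of 0.99 and a fair 100-ticket lottery with exactly one winner, each proposition 'ticket i loses' clears the threshold for belief, yet even a two-member conjunction of such propositions does not.

```python
n_tickets = 100
threshold = 0.99

# Each individual proposition "ticket i loses" has probability 0.99 and so
# clears the Lockean threshold for belief.
p_single = 1 - 1 / n_tickets
print(f"P(ticket i loses) = {p_single:.2f} (>= {threshold}, so believed)")

# But conjunctions of such propositions sink below the threshold quickly:
# with exactly one winning ticket, P(tickets 1..k all lose) = (n - k) / n.
for k in (1, 2, 50, 100):
    p_conj = (n_tickets - k) / n_tickets
    status = "believed" if p_conj >= threshold else "not believed"
    print(f"P(first {k:3d} tickets all lose) = {p_conj:.2f} -> {status}")
```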
A coherent story is a story that fits together well. This notion plays a central role in the coherence theory of justification and has been proposed as a criterion for scientific theory choice. Many attempts have been made to give a probabilistic account of this notion. A proper account of coherence must not start from some partial intuitions, but should pay attention to the role that this notion is supposed to play within a particular context. Coherence is a property of an information set that boosts our confidence that its content is true ceteris paribus when we receive information from independent and partially reliable sources. We construct a measure c_r that relies on hypothetical sources with certain idealized characteristics. A maximally coherent information set, i.e. a set with equivalent propositions, affords a maximal confidence boost. c_r is the ratio of the actual confidence boost over the confidence boost that we would have received, had the information been presented in the form of maximally coherent information, ceteris paribus. This measure is functionally dependent on the degree of reliability r of the sources. We use c_r to construct a coherence quasi-ordering over information sets S and S': S is no less coherent than S' just in case c_r(S) is not smaller than c_r(S') for any value of the reliability parameter r. We show that, on our account, the coherence of the story about the world gives us a reason to believe that the story is true and that the coherence of a scientific theory, construed as a set of models, is a proper criterion for theory choice.
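The sketch below illustrates the idea behind such a measure rather than the authors' c_r itself, whose exact parametrization is not given in the abstract. The witness model is an assumption: two independent, partially reliable sources each report one proposition, a report being made with probability 0.8 if its content is true and 0.2 if it is false; the joint priors are toy numbers chosen so that the conjunction has the same prior probability in both cases. Holding that prior fixed, the more coherent set yields the larger confidence boost, and the ratio of the two posteriors plays the role that c_r plays in the abstract.

```python
from itertools import product

# Assumed witness model: a report is made with probability 0.8 if its content
# is true and 0.2 if it is false; reports are conditionally independent given the facts.
P_REP_TRUE, P_REP_FALSE = 0.8, 0.2

def posterior_conjunction(joint):
    """P(A & B | witness 1 reports A, witness 2 reports B) for a joint prior over (A, B)."""
    weights = {}
    for a, b in product([True, False], repeat=2):
        like_a = P_REP_TRUE if a else P_REP_FALSE
        like_b = P_REP_TRUE if b else P_REP_FALSE
        weights[(a, b)] = joint[(a, b)] * like_a * like_b
    return weights[(True, True)] / sum(weights.values())

# Two priors with the same probability (0.10) for the conjunction A & B:
maximally_coherent = {(True, True): 0.10, (True, False): 0.00,
                      (False, True): 0.00, (False, False): 0.90}   # A and B equivalent
less_coherent      = {(True, True): 0.10, (True, False): 0.30,
                      (False, True): 0.30, (False, False): 0.30}

p_max = posterior_conjunction(maximally_coherent)
p_less = posterior_conjunction(less_coherent)
print(f"posterior if maximally coherent: {p_max:.3f}")
print(f"posterior if less coherent:      {p_less:.3f}")
print(f"ratio (in the spirit of c_r):    {p_less / p_max:.3f}")
```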
The coherentist theory of justification provides a response to the sceptical challenge: even though the independent processes by which we gather information about the world may be of dubious quality, the internal coherence of the information provides the justification for our empirical beliefs. This central canon of the coherence theory of justification is tested within the framework of Bayesian networks, which is a theory of probabilistic reasoning in artificial intelligence. We interpret the independence of the information gathering processes (IGPs) in terms of conditional independences, construct a minimal sufficient condition for a coherence ranking of information sets and assess whether the confidence boost that results from receiving information through independent IGPs is indeed a positive function of the coherence of the information set. There are multiple interpretations of what constitute IGPs of dubious quality. Do we know our IGPs to be no better than randomization processes? Or, do we know them to be better than randomization processes but not quite fully reliable, and if so, what is the nature of this lack of full reliability? Or, do we not know whether they are fully reliable or not? Within the latter interpretation, does learning something about the quality of some IGPs teach us anything about the quality of the other IGPs? The Bayesian-network models demonstrate that the success of the coherentist canon is contingent on what interpretation one endorses of the claim that our IGPs are of dubious quality.
I investigate whether any plausible moral arguments can be made for ‘grandfathering’ emission rights (that is, for setting emission targets for developed countries in line with their present or past emission levels) on the basis of a Lockean theory of property rights.
Apologies. Luc Bovens - 2008 - Proceedings of the Aristotelian Society 108 (1pt3): 219-239.
There is a cognitive, an affective, a conative, and an attitudinal component to a genuine apology. In discussing these components, I address the following questions. Might apologies be due for non-culpable actions? Might apologies be due for choices in moral dilemmas? What is the link between sympathy, remorse and making amends? Is it meaningful for resilient akratics to apologize? How much moral renewal is required when one apologizes? Why should apologies be offered in a humble manner? And is there some truth to P. G. Wodehouse's dictum that 'the right sort of people do not want apologies'?
Bayesian Coherence Theory of Justification or, for short, Bayesian Coherentism, is characterized by two theses, viz. (i) that our degree of confidence in the content of a set of propositions is positively affected by the coherence of the set, and (ii) that coherence can be characterized in probabilistic terms. There has been a longstanding question of how to construct a measure of coherence. We will show that Bayesian Coherentism cannot rest on a single measure of coherence, but requires a vector whose components exhaustively characterize the coherence properties of the set. Our degree of confidence in the content of the information set is a function of the reliability of the sources and the components of the coherence vector. The components of this coherence vector are weakly but not strongly separable, which blocks the construction of a single coherence measure.
We appeal to the theory of Bayesian Networks to model different strategies for obtaining confirmation for a hypothesis from experimental test results provided by less than fully reliable instruments. In particular, we consider (i) repeated measurements of a single test consequence of the hypothesis, (ii) measurements of multiple test consequences of the hypothesis, (iii) theoretical support for the reliability of the instrument, and (iv) calibration procedures. We evaluate these strategies on their relative merits under idealized conditions and show some surprising repercussions on the variety-of-evidence thesis and the Duhem-Quine thesis.
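As a hedged illustration of the kind of comparison at stake (a toy model with assumed numbers, not the authors' Bayesian networks): a hypothesis has a probabilistic test consequence, and an instrument is either reliable, in which case it reports the consequence correctly, or unreliable, in which case it reports a positive result at random. The code compares strategy (i), two positive reports on a single consequence from a single instrument, with strategy (ii), positive reports on two distinct consequences from two independent instruments. With these particular numbers the varied strategy does better; the abstract's point is that such comparisons do not always come out the way the variety-of-evidence thesis leads one to expect.

```python
from itertools import product

# Assumed numbers: hypothesis prior 0.5; a test consequence is true with
# probability 0.9 given H and 0.1 given not-H; an instrument is reliable with
# probability 0.7 (a reliable instrument reports the consequence correctly,
# an unreliable one reports 'positive' with probability 0.5).
P_H, P_C_GIVEN_H, P_C_GIVEN_NOT_H, P_REL = 0.5, 0.9, 0.1, 0.7

def p_consequence(h):
    return P_C_GIVEN_H if h else P_C_GIVEN_NOT_H

def p_report_positive(c, rel):
    return (1.0 if c else 0.0) if rel else 0.5

def posterior_same_instrument():
    """Two positive reports on a single test consequence from one instrument."""
    num = den = 0.0
    for h, c, rel in product([True, False], repeat=3):
        p = (P_H if h else 1 - P_H) \
            * (p_consequence(h) if c else 1 - p_consequence(h)) \
            * (P_REL if rel else 1 - P_REL) \
            * p_report_positive(c, rel) ** 2          # same consequence, same reliability node
        den += p
        num += p if h else 0.0
    return num / den

def posterior_two_instruments():
    """Positive reports on two distinct consequences from two independent instruments."""
    num = den = 0.0
    for h in [True, False]:
        p_h = P_H if h else 1 - P_H
        # Marginal probability that one instrument reports positive, given h.
        p_pos = sum((p_consequence(h) if c else 1 - p_consequence(h))
                    * (P_REL if rel else 1 - P_REL)
                    * p_report_positive(c, rel)
                    for c, rel in product([True, False], repeat=2))
        p = p_h * p_pos ** 2                          # independent consequences and instruments
        den += p
        num += p if h else 0.0
    return num / den

print(f"P(H | two positives, same consequence, same instrument):   {posterior_same_instrument():.3f}")
print(f"P(H | two positives, distinct consequences & instruments): {posterior_two_instruments():.3f}")
```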
Utilitarianism, it has been said, is not sensitive to the distribution of welfare. In making risky decisions for others there are multiple sensitivities at work. I present examples of risky decision-making involving drug allocations, charitable giving, breast-cancer screening and C-sections. In each of these examples there is a different sensitivity at work that pulls away from the utilitarian prescription. Instances of saving fewer people at a greater risk to many are more complex because there are two distributional sensitivities at work that pull in opposite directions from the utilitarian calculus. I discuss objections to these sensitivities and conclude with some reflections on the value of formal modelling in thinking about societal risk.
Why does an offender find it upsetting when the victim of their wrongdoing refuses to accept their apologies? Why do they find it upsetting when the victim is unwilling to grant them the forgiveness that they are asking for? I present an account of apologising and accepting apologies that can explain why this distress is an apt emotion.
In “Judy Benjamin is a Sleeping Beauty” (2010) Bovens recognises a certain similarity between the Sleeping Beauty (SB) and the Judy Benjamin (JB). But he does not recognise the dissimilarity between the underlying protocols (as spelled out in Shafer 1985). Protocols are expressed in conditional probability tables that spell out the probability of coming to learn various propositions conditional on the actual state of the world. The principle of total evidence requires that we not update on the content of the proposition learned but rather on the fact that we learn the proposition in question. Now attention to protocols drives a wedge between the SB and the JB. We have shown that the solution to a close variant of the SB which involves a clear protocol is P*(Heads) = 1/3, and since Beauty has precisely the same information at her disposal in the original SB at the time that she is asked to state her credence for Heads, the same solution should hold. The solution to the JB, on the other hand, is dependent on Judy's probability distribution over protocols. One reasonable protocol yields P(Red) = 1/2, but Judy could also defend alternative values or a range of values in the interval [1/3, 1/2] depending on her probability distribution over protocols.
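The 1/3 value mentioned in the abstract can be given a familiar frequency gloss (this is only a heuristic illustration, not the specific protocol variant the paper constructs): run the experiment many times, wake Beauty once on Heads and twice on Tails, and ask what fraction of awakening-occasions follow Heads.

```python
import random

# Frequency heuristic behind the thirder value: among repeated runs of the
# experiment, count Heads-awakenings as a fraction of all awakenings.
rng = random.Random(1)
heads_awakenings = total_awakenings = 0
for _ in range(100_000):
    heads = rng.random() < 0.5
    awakenings = 1 if heads else 2          # wake once on Heads, twice on Tails
    total_awakenings += awakenings
    heads_awakenings += awakenings if heads else 0

print(f"fraction of awakenings that follow Heads: {heads_awakenings / total_awakenings:.3f}")
# tends to 1/3, matching the P*(Heads) = 1/3 value cited in the abstract
```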
Belgium has recently extended its euthanasia legislation to minors, making it the first legislation in the world that does not specify any age limit. I consider two strands in the opposition to this legislation. First, I identify five arguments in the public debate to the effect that euthanasia for minors is somehow worse than euthanasia for adults—viz. arguments from weightiness, capability of discernment, pressure, sensitivity and sufficient palliative care—and show that these arguments are wanting. Second, there is another position in the public debate that wishes to keep the current age restriction on the books and have ethics boards exercise discretion in euthanasia decisions for minors. I interpret this position on the background of Velleman’s “Against the Right to Die” and show that, although costs remain substantial, it actually can provide some qualified support against extending euthanasia legislation to minors.
If you believe more things you thereby run a greater risk of being in error than if you believe fewer things. From the point of view of avoiding error, it is best not to believe anything at all, or to have very uncommitted beliefs. But considering the fact that we all in fact do entertain many specific beliefs, this recommendation is obviously in flagrant dissonance with our actual epistemic practice. Let us call the problem raised by this apparent conflict the Addition Problem. In this paper we will find reasons to reject a particular premise used in the formulation of the Addition Problem, namely, the fundamental premise according to which believing more things increases the risk of error. As we will see, acquiring more beliefs need not decrease the probability of the whole, and hence need not increase the risk of error. In fact, more beliefs can mean an increase in the probability of the whole and a corresponding decrease in the risk of error. We will consider the Addition Problem as it arises in the context of the coherence theory of epistemic justification, while keeping firmly in mind that the point we wish to make is of epistemological importance also outside the specific coherentist dispute. The problem of determining exactly how the probability of the whole system depends on such factors as coherence, reliability and independence will be seen to open up an interesting area of research in which the theory of conditional independence structures is a helpful tool.
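A toy calculation (assumed numbers, not the authors') shows how the fundamental premise can fail once evidence from partially reliable sources is in view: adding a belief that coheres strongly with what one already believes can raise the probability of the whole belief system on one's total evidence.

```python
# Assumed joint prior over two strongly coherent propositions A and B, and
# partially reliable, independent reports (a report is given with probability
# 0.9 if its content is true and 0.3 if it is false).
joint = {(True, True): 0.28, (True, False): 0.02,
         (False, True): 0.02, (False, False): 0.68}
P_REP_TRUE, P_REP_FALSE = 0.9, 0.3
like = lambda v: P_REP_TRUE if v else P_REP_FALSE

# Belief system {A} on the evidence of a single report on A:
p_a_given_rep = (sum(p * like(a) for (a, b), p in joint.items() if a)
                 / sum(p * like(a) for (a, b), p in joint.items()))

# Larger system {A, B} on the evidence of reports on both A and B:
weights = {ab: p * like(ab[0]) * like(ab[1]) for ab, p in joint.items()}
p_ab_given_reps = weights[(True, True)] / sum(weights.values())

print(f"P(A | report on A)           = {p_a_given_rep:.3f}")
print(f"P(A and B | reports on A, B) = {p_ab_given_reps:.3f}")
# Adding the coherent belief B raises the probability of the whole system.
```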
If we receive information from multiple independent and partially reliable information sources, then whether we are justified to believe these information items is affected by how reliable the sources are, by how well the information coheres with our background beliefs and by how internally coherent the information is. We consider the following question. Is coherence a separable determinant of our degree of belief, i.e. is it the case that the more coherent the new information is, the more justified we are in believing the new information, ceteris paribus? We show that if we consider sets of information items of any size (Holism), and if we assume that there exists a coherence Ordering over such sets and that coherence is a function of the probability distribution over the propositions in such sets (Probabilism), then Separability fails to hold.
The Distribution View provides a model that integrates four distributional concerns in the evaluation of risky prospects. Starting from these concerns, we can generate an ordering over a set of risky prospects, or, starting from an ordering, we can extract a characterization of the underlying distributional concerns. Separability of States and/or Persons for multiple-person risky prospects, for single-person risky prospects and for multiple-person certain prospects is discussed within the model. The Distribution View sheds light on public health policies and provides a framework for the discussion of Parfit's Priority View for risky prospects.
I provide a taxonomy of the various circumstances under which one might reasonably say "P and I will believe that not-P" or violate the Reflection Principle.
Nancy Cartwright is one of the most distinguished and influential contemporary philosophers of science. Despite the profound impact of her work, there is neither a systematic exposition of Cartwright’s philosophy of science nor a collection of articles that contains in-depth discussions of the major themes of her philosophy. This book is devoted to a critical assessment of Cartwright’s philosophy of science and contains contributions from Cartwright's champions and critics. Broken into three parts, the book begins by addressing Cartwright's views on the practice of model building in science and the question of how models represent the world before moving on to a detailed discussion of methodologically and metaphysically challenging problems. Finally, the book addresses Cartwright's original attempts to clarify profound questions concerning the metaphysics of science. With contributions from leading scholars, such as Ronald N. Giere and Paul Teller, this unique volume will be extremely useful to philosophers of science the world over.
In ‘Corroborating Testimony, Probability and Surprise’, Erik J. Olsson ascribes to L. Jonathan Cohen the claims that if two witnesses provide us with the same information, then the less probable the information is, the more confident we may be that the information is true (C), and the stronger the information is corroborated (C*). We question whether Cohen intends anything like claims (C) and (C*). Furthermore, he discusses the concurrence of witness reports within a context of independent witnesses, whereas the witnesses in Olsson's model are not independent in the standard sense. We argue that there is much more than, in Olsson's words, ‘a grain of truth’ to claim (C), both on his own characterization as well as on Cohen's characterization of the witnesses. We present an analysis for independent witnesses in the contexts of decision-making under risk and decision-making under uncertainty and generalize the model for n witnesses. As to claim (C*), Olsson's argument is contingent on the choice of a particular measure of corroboration and is not robust in the face of alternative measures. Finally, we delimit the set of cases to which Olsson's model is applicable.
Considerations of objective-value freedom and status freedom do impose constraints on policies that restrict access to cigarettes. As to objective-value freedom, something of value is lost when anti-alcohol policies lead to pub closures interfering with valued life styles, and a similar, though weaker, argument can be made for cigarettes. As to status freedom, non-arbitrariness requires consultation with vulnerable populations to learn what might aid them with smoking cessation.
Some proponents of the pro-life movement argue against morning after pills, IUDs, and contraceptive pills on grounds of a concern for causing embryonic death. What has gone unnoticed, however, is that the pro-life line of argumentation can be extended to the rhythm method of contraception as well. Given certain plausible empirical assumptions, the rhythm method may well be responsible for a much higher number of embryonic deaths than some other contraceptive techniques.
I investigate what, if anything, can be said in defense of Volkswagen's decision to install a cheat device in their diesel engines to evade NOx emission testing.
Choice often proceeds in two stages: We construct a shortlist on the basis of limited and uncertain information about the options and then reduce this uncertainty by examining the shortlist in greater detail. The goal is to do well when making a final choice from the option set. I argue that we cannot realise this goal by constructing a ranking over the options at shortlisting stage which determines of each option whether it is more or less worthy of being included in a shortlist. This is relevant to the 2010 UK Equality Act. The Act requires that shortlists be constructed on grounds of candidate rankings and affirmative action is only permissible for equally qualified candidates. This is misguided: Shortlisting candidates with lower expected qualifications but higher variance may raise the chance of finding an exceptionally strong candidate. If it does, then shortlisting such candidates would make eminent business sense and there is nothing unfair about it. This observation opens up room for including more underrepresented candidates with protected characteristics, as they are more likely to display greater variance in the selector’s credence functions at shortlisting stage.
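A small simulation with made-up numbers (not the paper's own model) illustrates why a single ranking by expected qualification can misfire at the shortlisting stage: once the final choice is the best of the shortlist after detailed examination, a candidate with a lower mean but higher variance in the selector's credence function can raise both the expected quality of the eventual hire and the chance of an exceptionally strong hire.

```python
import random, statistics

# Assumed numbers: qualifications are revealed only after shortlisting, and the
# best shortlisted candidate is hired.  'Safe' candidates: quality ~ Normal(70, 5);
# a 'high-variance' candidate: quality ~ Normal(65, 15) -- lower mean, higher variance.
rng = random.Random(42)
TRIALS, BAR = 100_000, 85

def run(shortlist):
    """shortlist: list of (mean, sd) pairs.  Returns (expected hire quality,
    probability that the hire clears the 'exceptional' bar)."""
    hires = [max(rng.gauss(m, s) for m, s in shortlist) for _ in range(TRIALS)]
    return statistics.mean(hires), sum(h > BAR for h in hires) / TRIALS

all_safe = [(70, 5), (70, 5), (70, 5)]
with_risky = [(70, 5), (70, 5), (65, 15)]

for name, shortlist in [("three safe candidates", all_safe),
                        ("two safe + one high-variance", with_risky)]:
    mean_q, p_exceptional = run(shortlist)
    print(f"{name}: expected hire quality {mean_q:.1f}, "
          f"P(hire > {BAR}) = {p_exceptional:.3f}")
```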
Risky prospects represent policies that impose different types of risks on multiple people. I present an example from food safety. A utilitarian following Harsanyi's Aggregation Theorem ranks such prospects according to their mean expected utility or the expectation of the social utility. Such a ranking is not sensitive to any of four types of distributional concerns. I develop a model that lets the policy analyst rank prospects relative to the distributional concerns that she considers fitting in the context at hand. I name this model ‘the Distribution View’, which poses an alternative to Parfit's Priority View for risky prospects.
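The insensitivity claim can be seen in a two-line example (illustrative, not the paper's food-safety case): two policies with identical mean expected utility but very different distributions of risk across persons and states are ranked as equally good by the Harsanyi-style utilitarian.

```python
# Illustrative policies over two people; utilities: 0 if harmed, 1 if not.
def expected_utilities(policy):
    """policy: list of (probability, (u_person1, u_person2)) outcomes."""
    eu1 = sum(p * u[0] for p, u in policy)
    eu2 = sum(p * u[1] for p, u in policy)
    return eu1, eu2

# Policy A: a fair coin decides which ONE of the two people is harmed.
policy_a = [(0.5, (0, 1)), (0.5, (1, 0))]
# Policy B: a fair coin decides whether BOTH are harmed or NEITHER is.
policy_b = [(0.5, (0, 0)), (0.5, (1, 1))]

for name, pol in [("A (exactly one person harmed)", policy_a),
                  ("B (all-or-nothing)", policy_b)]:
    eu1, eu2 = expected_utilities(pol)
    print(f"Policy {name}: expected utilities {eu1:.2f}, {eu2:.2f}; "
          f"mean expected utility {(eu1 + eu2) / 2:.2f}")
# The mean-expected-utility ranking sees no difference, though the policies
# spread the risk of harm across persons and states in very different ways.
```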
The Puzzle of the Hats is a puzzle in social epistemology. It describes a situation in which a group of rational agents with common priors and common goals seems vulnerable to a Dutch book if they are exposed to different information and make decisions independently. Situations in which this happens involve violations of what might be called the Group-Reflection Principle. As it turns out, the Dutch book is flawed. It is based on the betting interpretation of the subjective probabilities, but ignores the fact that this interpretation disregards strategic considerations that might influence betting behavior. A lesson to be learned concerns the interpretation of probabilities in terms of fair bets and, more generally, the role of strategic considerations in epistemic contexts. Another lesson concerns Group-Reflection, which in its unrestricted form is highly counter-intuitive. We consider how this principle of social epistemology should be re-formulated so as to make it tenable.
We consider a special set of risky prospects in which the outcomes are either life or death. There are various alternatives to the utilitarian objective of minimizing the expected loss of lives in such prospects. We start off with the two-person case with independent risks and construct taxonomies of ex ante and ex post evaluations for such prospects. We examine the relationship between the ex ante and the ex post in this restrictive framework: There are more possibilities to respect ex ante and ex post objectives simultaneously than in the general framework, i.e. without the restriction to binary utilities. We extend our results to n persons and to dependent risks. We study optimal strategies for allocating risk reductions given different objectives. We place our results against the backdrop of various pro-poorly off value functions for the evaluation of risky prospects.
Evaluating Life or Death Prospects. Luc Bovens and Marc Fleurbaey - Volume 28, Issue 2 - DOI: https://doi.org/10.1017/S0266267112000235.
The Puzzle of the Hats is a betting arrangement which seems to show that a Dutch book can be made against a group of rational players with common priors who act in the common interest and have full trust in the other players’ rationality. But we show that appearances are misleading—no such Dutch book can be made. There are four morals. First, what can be learned from the puzzle is that there is a class of situations in which credences and betting rates diverge. Second, there is an analogy between ways of dealing with situations of this kind and different policies for sequential choice. Third, there is an analogy with strategic voting, showing that the common interest is not always served by expressing how things seem to you in social decision-making. And fourth, our analysis of the Puzzle of the Hats casts light on a recent controversy about the Dutch book argument for the Sleeping Beauty.
We construct a probabilistic coherence measure for information sets which determines a partial coherence ordering. This measure is applied in constructing a criterion for expanding our beliefs in the face of new information. A number of idealizations are made, which can be relaxed by an appeal to Bayesian Networks.
We develop a utilitarian framework to assess different decision rules for the European Council of Ministers. The proposals to be decided on are conceptualized as utility vectors and a probability distribution is assumed over the utilities. We first show what decision rules yield the highest expected utilities for different means of the probability distribution. For proposals with high mean utility, simple benchmark rules (such as majority voting with proportional weights) tend to outperform rules that have been proposed in the political arena. For proposals with low mean utility, it is the other way round. We then compare the expected utilities for smaller and larger countries and look for Pareto-dominance relations. Finally, we provide an extension of the model, discuss its restrictions, and compare our approach with assessments of decision rules that are based on the Penrose measure of voting power.
I argue in this paper that there are two considerations which govern the dynamics of a two-person bargaining game, viz. relative proportionate utility loss from conceding to one's opponent's proposal and relative non-proportionate utility loss from not conceding to one's opponent's proposal, if she were not to concede as well. The first consideration can adequately be captured by the information contained in vNM utilities. The second requires measures of utility which allow for an interpersonal comparison of utility differences. These considerations respectively provide for a justification of the Nash solution and the Kalai egalitarian solution. However, none of these solutions taken by themselves can provide for a full story of bargaining, since, if within a context of bargaining one such consideration is overriding, the solution which does not match this consideration will yield unreasonable results. I systematically present arguments to the effect that each justification from self-interest for respectively the Nash and the Kalai egalitarian solution is vulnerable to this kind of objection. I suggest that the search for an integrative model may be a promising line of research.
In this commentary on Yashar Saghai's article "Salvaging the Concept of Nudge" (JME 2013) I discuss his distinction between a ‘prod’ (which is ‘substantially controlling’) and a ‘nudge’ (which is ‘substantially non-controlling’).
Meijs and Douven (2005) present an interesting pair of alleged counterexamples, and an algorithm to generate such counterexamples, to our criterion for a coherence quasi-ordering over information sets as outlined in our 2003a and 2003b accounts. We agree that our criterion does not always provide an ordering when we would intuitively say that one set is more coherent than the other. Nonetheless, we think that our criterion can be salvaged.
Blackburn argues that moral supervenience in conjunction with the lack of entailments from naturalistic to moral judgments poses a challenge to moral realism. Klagge and McFetridge try to avert the challenge by appealing to synthetically necessary connections between natural and moral properties. Blackburn rejoins that, even if there are such connections, the challenge still remains. We remain agnostic on the question whether there are such connections, but argue against Blackburn that, if there are indeed such connections, then the challenge to moral realism, properly phrased, does not hold up.
We consider a decision board with representatives who vote on proposals on behalf of their constituencies. We look for decision rules that realize utilitarian and egalitarian ideals. We set up a simple model and obtain roughly the following results. If the interests of people from the same constituency are uncorrelated, then a weighted rule with square root weights does best in terms of both ideals. If there are perfect correlations, then the utilitarian ideal requires proportional weights, whereas the egalitarian ideal requires equal weights. We investigate correlations that are in between these extremes and provide analytic arguments to connect our results to Barberà and Jackson (2006: 317–339) and to Banzhaf voting power.
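A simplified simulation in the spirit of the uncorrelated case (assumed constituency sizes and parameters, not the paper's exact model): each citizen independently favours or opposes a proposal with probability 1/2, each representative votes her constituency's majority view, and the council adopts the proposal under a weighted majority rule. Total utility of adoption is the number of citizens in favour minus the number against; rejection yields zero. Square-root weights should come out best here, in line with the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)
sizes = np.array([9, 49, 121, 225, 441])       # assumed constituency populations (odd, to avoid ties)
trials = 200_000

# Each citizen favours the proposal with probability 1/2, independently.
favour = rng.binomial(sizes, 0.5, size=(trials, len(sizes)))
margins = 2 * favour - sizes                   # citizens in favour minus citizens against
rep_votes = np.sign(margins)                   # each representative votes her constituency's majority view
total_utility = margins.sum(axis=1)

rules = {
    "equal weights":        np.ones(len(sizes)),
    "proportional weights": sizes.astype(float),
    "square-root weights":  np.sqrt(sizes),
}
# Benchmark: adopt iff a majority of all citizens favours the proposal.
print(f"oracle benchmark: average total utility "
      f"{np.mean(np.where(total_utility > 0, total_utility, 0)):.2f}")
for name, w in rules.items():
    adopt = (rep_votes @ w) > 0                # strict weighted majority among representatives
    avg_utility = np.mean(np.where(adopt, total_utility, 0))
    print(f"{name}: average total utility {avg_utility:.2f}")
```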
Real Nudge. Luc Bovens - 2012 - European Journal of Risk Regulation 3 (1): 43-6.
The novelty in Adam Burgess’ paper is that he assesses nudge policies in the context of the shift in the UK government’s approach to risk from the nannying policies of Labour to the nudge policies of the Conservatives. There is a wealth of ideas in this paper. I find it useful to disentangle some of these ideas focusing on the following two questions: 1. In what respects do Labour’s nannying policies and the Conservatives’ nudge policies differ? 2. What is problematic about Labour’s nannying and the Conservatives’ nudge policies? Subsequently I will reflect on how a particular strand of research in the social sciences can be made relevant to designing a more responsible way of dealing with societal risk and show how this approach can evade some of Burgess’ concerns.
There are three slogans in the history of Socialism that are very close in wording, viz. the famous Cabet-Blanc-Marx slogan: "From each according to his ability; To each according to his needs"; the earlier Saint-Simon-Pecqueur slogan: "To each according to his ability; To each according to his works"; and the later slogan in Stalin’s Soviet Constitution: "From each according to his ability; To each according to his work." We will consider the following questions regarding these slogans: a) What are the earliest occurrences of each of these slogans? b) Where does the inspiration for each half of each slogan come from? c) What do the Saint-Simonians mean by “To each according to his ability”? d) What do they mean by “To each according to his works”? e) What motivates the shift from “To each according to his ability” to “From each according to his ability”? f) How should we envisage the progression toward “To each according to his needs”? g) What is the distinction between “To each according to his works” and “To each according to his work”?
There are two curious features about the backward induction argument (BIA) to the effect that repeated non-cooperation is the rational solution to the finite iterated prisoner’s dilemma (FIPD). First, however compelling the argument may seem, one remains hesitant either to recommend this solution to players who are about to engage in cooperation or to explain cooperation as a deviation from rational play in real-life FIPD’s. Second, there seems to be a similarity between the BIA for the FIPD and the surprise exam paradox (SEP) and one cannot help but wonder whether the former is indeed no more than an instance of the latter. I argue that there is an important difference between the BIA for the FIPD and the SEP, but that a comparison to the SEP can help us understand why the conclusion of the BIA for the FIPD strikes us as a counterintuitive solution to real-life FIPD’s.
I argue that by constructing an identity of Bohemian whim and spontaneity one can make what was previously an akratic action into a fully rational action, since in performing the action, one asserts one's identity.
In the novel "Het Been" by the Flemish writer Willem Elsschot, a businessman becomes obsessive over the fact that a victim of his unscrupulous business practices refuses to forgive him. This raises the following questions: Why does one find it upsetting when the victim of one's wrongdoing refuses to accept one's apologies? Why does one find it upsetting when the victim is unwilling to grant one the forgiveness that one is asking for?
Gender-neutral bathrooms are usually framed as an accommodation for trans and other gender-nonconforming individuals. In this paper, we show that the benefits of gender-neutral bathrooms are much broader. First, our simulations show that gender-neutral bathrooms reduce average waiting times: while waiting times for women go down invariably, waiting times for men either go down or slightly increase depending on usage intensity, occupancy-time differentials and the presence of urinals. Second, our result can be turned on its head: firms have an opportunity to reduce the number of facilities and cut costs by making them all gender-neutral without increasing waiting times. These observations can be used to reframe the gender-neutral bathrooms debate so that they appeal to a larger constituency, cutting across the usual dividing lines in the ‘bathroom wars’. Finally, there are improved designs and behavioural strategies that can help overcome resistance. We explore what strategies can be invoked to mitigate the objections that gender-neutral bathrooms (1) are unsafe, (2) elicit discomfort and (3) are unhygienic.
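A stylized queueing sketch of the first claim (illustrative parameters and service-time assumptions, not the paper's own simulations): Poisson arrivals, exponential occupancy times with women assumed to occupy a facility longer on average, no urinals, and first-come-first-served queues. With these numbers the pattern the abstract describes emerges: pooling the stalls into one gender-neutral facility cuts women's average wait sharply while men's changes little.

```python
import heapq, random

rng = random.Random(7)
HORIZON, RATE = 10_000.0, 1.4          # simulated minutes; arrivals per minute, per gender

def arrivals(rate, horizon):
    """Poisson arrival times with the given rate over [0, horizon)."""
    t, times = 0.0, []
    while True:
        t += rng.expovariate(rate)
        if t >= horizon:
            return times
        times.append(t)

def waits_fifo(customers, n_stalls):
    """FIFO multi-server queue: each arrival takes the earliest-free stall.
    customers: list of (arrival_time, service_time, label). Returns (label, wait) pairs."""
    free_at = [0.0] * n_stalls
    heapq.heapify(free_at)
    out = []
    for arrival, service, label in sorted(customers):
        start = max(arrival, heapq.heappop(free_at))
        out.append((label, start - arrival))
        heapq.heappush(free_at, start + service)
    return out

def mean(xs):
    return sum(xs) / len(xs)

# Assumed occupancy-time differential: women ~1.8 min on average, men ~1.0 min.
women = [(t, rng.expovariate(1 / 1.8), "W") for t in arrivals(RATE, HORIZON)]
men   = [(t, rng.expovariate(1 / 1.0), "M") for t in arrivals(RATE, HORIZON)]

separate = waits_fifo(women, 3) + waits_fifo(men, 3)   # two rooms, 3 stalls each
pooled   = waits_fifo(women + men, 6)                  # one gender-neutral room, 6 shared stalls
for name, results in [("separate rooms (3 + 3 stalls)", separate),
                      ("gender-neutral room (6 shared stalls)", pooled)]:
    w = mean([x for g, x in results if g == "W"])
    m = mean([x for g, x in results if g == "M"])
    print(f"{name}: women wait {w:.2f} min, men wait {m:.2f} min on average")
```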