The paper has a twofold aim. On the one hand, it provides what appears to be the first game-theoretic modeling of Napoleon’s last campaign, which ended dramatically on 18 June 1815 at Waterloo. It is specifically concerned with the decision Napoleon made on 17 June 1815 to detach part of his army against the Prussians he had defeated, though not destroyed, on 16 June at Ligny. Military historians agree that this decision was crucial but disagree about whether it was rational. Hypothesizing a zero-sum game between Napoleon and Blücher, and computing its solution, we show that it could have been a cautious strategy on the former's part to divide his army, a conclusion which runs counter to the charges of misjudgement commonly heard since Clausewitz. On the other hand, the paper addresses methodological issues. We defend its case study against the objections of irrelevance that have been raised elsewhere against “analytic narratives”, and conclude that military campaigns provide an opportunity for successful application of the formal theories of rational choice. Generalizing the argument, we finally investigate the conflict between narrative accounts – the historians' standard mode of expression – and mathematical modeling.
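To fix ideas, here is a purely illustrative two-by-two zero-sum game; the strategies and payoffs are our hypothetical stand-ins, not the paper's actual model. Let Napoleon choose between dividing his army (D) and keeping it united (U), and Blücher between marching to rejoin Wellington (W) and retreating eastwards (E), with payoffs to Napoleon:

\[
\begin{array}{c|cc}
 & W & E \\ \hline
D & 2 & 3 \\
U & 1 & 4
\end{array}
\]

Here D guarantees Napoleon at least $\min(2,3)=2$, while U guarantees only $\min(1,4)=1$; since Blücher's best reply W holds Napoleon to exactly 2, the pair (D, W) is a saddle point, and dividing the army is the cautious (maximin) choice in the sense the abstract invokes.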
The Pareto principle states that if the members of society express the same preference judgment between two options, this judgment is compelling for society. A building block of normative economics and social choice theory, and often borrowed by contemporary political philosophy, the principle has rarely been subjected to philosophical criticism. The paper objects to it on the ground that it applies indiscriminately to those cases in which the individuals agree on both their expressed preferences and their reasons for entertaining them, and those cases in which they agree on their expressed preferences, while differing on their reasons. The latter are cases of "spurious unanimity", and it is normatively inappropriate, or so the paper argues, to defend unanimity preservation at the social level for them; hence the Pareto principle is formulated much too broadly. The objection seems especially powerful when the principle is applied in an ex ante context of uncertainty, in which individuals can disagree on both their probabilities and utilities, and nonetheless agree on their preferences over prospects.
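A minimal numerical illustration of such ex ante spurious unanimity (the numbers are ours, not the paper's): let there be two states $s_1, s_2$ and two prospects, $f$ paying outcome $x$ in $s_1$ and $y$ in $s_2$, and $g$ paying $y$ in $s_1$ and $x$ in $s_2$. Suppose

\[
P_1(s_1) = 0.8,\; u_1(x) = 1,\; u_1(y) = 0, \qquad P_2(s_1) = 0.2,\; u_2(x) = 0,\; u_2(y) = 1 .
\]

Then $EU_1(f) = 0.8 > 0.2 = EU_1(g)$ and $EU_2(f) = 0.8 > 0.2 = EU_2(g)$: both individuals prefer $f$ to $g$, but from opposite beliefs and opposite tastes, so the unanimity is spurious in the paper's sense.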
Judgment aggregation theory, or rather, as we conceive of it here, logical aggregation theory, generalizes social choice theory by having the aggregation rule bear on judgments of all kinds instead of merely preference judgments. It derives from Kornhauser and Sager’s doctrinal paradox and List and Pettit’s discursive dilemma, two problems that we distinguish emphatically here. The current theory has developed from the discursive dilemma, rather than the doctrinal paradox, and the final objective of the paper is to give the latter its own theoretical development along the lines of recent work by Dietrich and Mongin. However, the paper also aims at reviewing logical aggregation theory as such, and it covers impossibility theorems by Dietrich, Dietrich and List, Dokow and Holzman, List and Pettit, Mongin, Nehring and Puppe, Pauly and van Hees, providing a uniform logical framework in which they can be compared with each other. The review goes through three historical stages: the initial paradox and dilemma, the scattered early results on the independence axiom, and the so-called canonical theorem, a collective achievement that provided the theory with its specific method of analysis. The paper goes some way towards philosophical logic, first by briefly connecting the aggregative framework of judgment with the modern philosophy of judgment, and second by thoroughly discussing and axiomatizing the ‘general logic’ built in this framework.
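The doctrinal paradox from which the theory starts can be stated in a three-judge example of the standard kind (our rendering, not a quotation from the paper). The defendant is liable (r) just in case there was a valid contract (p) and a breach of it (q), i.e., r if and only if p and q:

              p      q      r
  Judge 1    Yes    Yes    Yes
  Judge 2    Yes    No     No
  Judge 3    No     Yes    No
  Majority   Yes    Yes    No

Majority voting on the premisses p and q yields liability, while majority voting on the conclusion r yields no liability: the two aggregation procedures conflict even though every individual judge is consistent.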
We investigate judgment aggregation by assuming that some formulas of the agenda are singled out as premisses, and the Independence condition (formula-wise aggregation) holds for them, though perhaps not for others. Whether premiss-based aggregation thus defined is non-degenerate depends on how premisses are logically connected, both among themselves and with other formulas. We identify necessary and sufficient conditions for dictatorship or oligarchy on the premisses, and investigate when these results extend to the whole agenda. Our theorems recover or strengthen several existing ones and are formulated for infinite populations, an innovation of this paper. JEL identification numbers: D70, D71.
According to a theorem recently proved in the theory of logical aggregation, any nonconstant social judgment function that satisfies independence of irrelevant alternatives (IIA) is dictatorial. We show that the strong and not very plausible IIA condition can be replaced with a minimal independence assumption plus a Pareto-like condition. This new version of the impossibility theorem likens it to Arrow’s and arguably enhances its paradoxical value.
The paper discusses the sense in which the changes undergone by normative economics in the twentieth century can be said to be progressive. A simple criterion is proposed to decide whether a sequence of normative theories is progressive. This criterion is put to use on the historical transition from the new welfare economics to social choice theory. The paper reconstructs this classic case, and eventually concludes that the latter theory was progressive compared with the former. It also briefly comments on the recent developments in normative economics and their connection with the previous two stages.
The paper analyses economic evaluations by distinguishing evaluative statements from actual value judgments. On this basis, it compares four solutions to the value neutrality problem in economics. After rebutting the strong theses about neutrality (normative economics is illegitimate) and non-neutrality (the social sciences are value-impregnated), the paper settles the case between the weak neutrality thesis (common in welfare economics) and a novel, weak non-neutrality thesis that extends the realm of normative economics more widely than the other weak thesis does.
The paper extends a result in the theory of constrained egalitarianism initiated by Dutta and Ray (1989), relying on the concept of proportionate rather than absolute equality. We apply this framework to redistributive systems in which what the individuals get depends on what they receive or pay qua members of generally overlapping groups. We solve the constrained equalization problem for this class of models. The paper ends by comparing our solution with the alternative solution based on the Shapley value, which has been recommended in some distributive applications.
We show that several logics of common belief and common knowledge are not only complete, but also strongly complete, hence compact. These logics involve a weakened monotonicity axiom, and no other restriction on individual belief. The semantics is of the ordinary fixed-point type.
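For orientation, the fixed-point semantics referred to can be illustrated by the textbook fixed-point characterization of common belief (our formulation of the standard schema, not a quotation of the paper's weakened systems):

\[
C\varphi \;\leftrightarrow\; E(\varphi \wedge C\varphi), \qquad E\psi := \bigwedge_{i=1}^{n} B_i\psi ,
\]

that is, $\varphi$ is commonly believed exactly when everybody believes both $\varphi$ and that $\varphi$ is commonly believed, $C\varphi$ being the greatest fixed point of this equivalence.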
According to a long-standing philosophical tradition, impartiality is a distinctive and determining feature of moral judgments, especially in matters of distributive justice. This broad ethical tradition was revived in welfare economics by Vickrey, and above all, Harsanyi, under the form of the so-called Impartial Observer Theorem. The paper offers an analytical reconstruction of this argument and a step-wise philosophical critique of its premisses. It eventually provides a new formal version of the theorem based on subjective probability.
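For orientation, Harsanyi's impartial observer evaluates a social alternative $x$ as if she had an equal chance of occupying the position of any of the $n$ individuals, which yields the familiar utilitarian formula (stated here in its standard equal-chance form; the paper's new version replaces the equal chances with subjective probabilities):

\[
W(x) \;=\; \frac{1}{n}\sum_{i=1}^{n} u_i(x).
\]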
The relations between rationality and optimization have been widely discussed in the wake of Herbert Simon's work, with the common conclusion that the rationality concept does not imply the optimization principle. The paper is partly concerned with adding evidence for this view, but its main, more challenging objective is to question the converse implication from optimization to rationality, which is accepted even by bounded rationality theorists. We discuss three topics in succession: (1) rationally defensible cyclical choices, (2) the revealed preference theory of optimization, and (3) the infinite regress of optimization. We conclude that (1) and (2) provide evidence only for the weak thesis that rationality does not imply optimization. But (3) is seen to deliver a significant argument for the strong thesis that optimization does not imply rationality.
A reply to Francisco Vergara's attack on Halévy's interpretation of Bentham in Philosophy, January 1998. Vergara had argued that Halévy was mistaken in interpreting Bentham's principle of utility as a psychological law as well as the ethical greatest happiness principle. Mongin and Sigot show that Halévy correctly interpreted Bentham's texts and that the psychological law is necessary to Bentham's legal theory, economics and politics; they also argue that it is incorrect to confuse the principle of utility with a theory of universal selfishness, and that this misunderstanding underlies Vergara's mistaken picture of both Halévy and Bentham.
The paper surveys the currently available axiomatizations of common belief (CB) and common knowledge (CK) by means of modal propositional logics. (Throughout, knowledge, whether individual or common, is defined as true belief.) Section 1 introduces the formal method of axiomatization followed by epistemic logicians, especially the syntax-semantics distinction, and the notion of a soundness and completeness theorem. Section 2 explains the syntactical concepts, while briefly discussing their motivations. Two standard semantic constructions, Kripke structures and neighbourhood structures, are introduced in Sections 3 and 4, respectively. It is recalled that Aumann's partitional model of CK is a particular case of a definition in terms of Kripke structures. The paper also restates the well-known fact that Kripke structures can be regarded as particular cases of neighbourhood structures. Section 3 reviews the soundness and completeness theorems proved w.r.t. the former structures by Fagin, Halpern, Moses and Vardi, as well as related results by Lismont. Section 4 reviews the corresponding theorems derived w.r.t. the latter structures by Lismont and Mongin. A general conclusion of the paper is that the axiomatization of CB does not require as strong systems of individual belief as was originally thought: only monotonicity has thus far proved indispensable. Section 5 explains another consequence of general relevance: despite the “infinitary” nature of CB, the axiom systems of this paper admit of effective decision procedures, i.e., they are decidable in the logician's sense.
This article attempts to assess Jon Elster's contribution to rational choice in Ulysses and the Sirens and Sour Grapes. After reviewing Elster's analysis of functional versus intentional explanations, the essay moves on to the crucial distinction between the thin and broad theories of rationality. The former elaborates on the traditional economist's preference/feasible-set apparatus; the latter is the more demanding theory which inquires into the rationality of beliefs and preferences. Elster's approach to the broad theory normally consists in using the thin theory as a reference point and in making purposefully limited departures from it. The essay illustrates the method while commenting on Elster's discussion of autonomous preferences in Sour Grapes. It goes on to stress some important analogies between Elster's use of the thin and broad theories, on the one hand, and Weber's ideal-typical method, on the other. The final assessment is phrased in terms of these analogies; it is suggested that Elster is at his best when the ideal-typical method and his own separate from each other, that is, when he comes to grips with the broad theory in its own terms.
This note aims at critically assessing a little-noticed proposal made by Popper in the second edition of Objective Knowledge to the effect that verisimilitude of scientific theories should be made relative to the problems they deal with. Using a simple propositional calculus formalism, it is shown that the relativized definition fails for the very same reason that Popper's original concept of verisimilitude collapsed: only if one of two theories is true can they be compared in terms of the suggested definition of verisimilitude.
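The underlying difficulty, due independently to Miller and Tichý, can be sketched as follows (our summary of the well-known argument, not of this note's own proof). On Popper's definition, theory $A$ is closer to the truth than theory $B$ only if $A$'s true consequences include $B$'s and $B$'s false consequences include $A$'s. Suppose both theories are false and the first inclusion is strict, with $t$ a true consequence of $A$ but not of $B$, and let $f$ be any false consequence of $A$. Then $t \wedge f$ is a false consequence of $A$; but if $B$ entailed $t \wedge f$ it would entail $t$, contrary to assumption. So $A$ has a false consequence that $B$ lacks, and the comparison collapses unless one of the theories is true.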
The article discusses Friedman's classic claim that economics can be based on irrealistic assumptions. It exploits Samuelson's distinction between two "F-twists" (that is, "it is an advantage for an economic theory to use irrealistic assumptions" vs "the more irrealistic the assumptions, the better the economic theory"), as well as Nagel's distinction between three philosophy-of-science construals of the basic claim. On examination, only one of Nagel's construals seems promising enough. It involves the neo-positivistic distinction between theoretical and non-theoretical ("observable") terms; so Friedman would in some sense argue for the major role of theoretical terms in economics. The paper uses a model-theoretic apparatus to refine the selected construal and check whether it can be made to support the claim. This inquiry leads to essentially negative results for both F-twists, and the final conclusion is that they are left unsupported.