If called upon, would you fight in a war you thought unjust? This article attempts to explain why the majority of military officers and soldiers, when faced with this question, do fight despite the moral misgivings they may have. I will explain why, on the one hand, officers are morally obligated to refuse unjust orders in jus in bello cases, but, on the other hand, it can be argued that they are also obligated to follow orders they believe to be unjust concerning jus ad bellum. The war in Iraq has presented, and continues to present, military officers and their soldiers with a particularly difficult moral dilemma. As citizens of a democratic government, it is important that we understand why our military officers act in ways that may seem counter to what we believe is right.
This paper explores the central question of why soldiers in democratic societies might decide to fight in wars that they may have reason to believe are objectively or questionably unjust. First, I provide a framework for understanding the dilemma caused by an unjust war and a soldier's competing moral obligations; namely, the obligations to self and state. Next, I address a few key traditional positions concerning soldiers and jus ad bellum. This is followed by an exploration of the unique and conflicting moral problems that confront modern soldiers and their officers. I argue that although traditional positions such as invincible ignorance provide a rather dangerous ‘head-in-the-sand’ mentality, soldiers serving a democratic government are nonetheless very limited in their legal and moral ability to interpret what is a justifiable war. However, only a select few senior officers are in positions to make such legal and moral decisions concerning jus ad bellum.
In Zadig, published in 1748, Voltaire wrote of “the great principle that it is better to run the risk of sparing the guilty than to condemn the innocent.” At about the same time, Blackstone noted approvingly that “the law holds that it is better that ten guilty persons escape, than that one innocent suffer.” In 1824, Thomas Fielding cited the principle as an Italian proverb and a maxim of English law. John Stuart Mill endorsed it in an address to Parliament in 1868. General acceptance of this maxim continues into our own period, yet it is difficult to find systematic attempts to defend the maxim. It is treated as a truism in no need of defense. But the principle within it is not at all obvious; and since it undergirds many of our criminal justice policies, we should be sure that it is justifiable. First, however, we must clarify what the principle means.
Hobbes's conception of reason as computation or reckoning is significantly different in Part I of De Corpore from what I take to be the later treatment in Leviathan. In the later work, actual computation with words starts with making an affirmation, framing a proposition. Reckoning then has to do with the consequences of propositions, or how they connect the facts, states of affairs, or actions which they refer to or account for. Starting from this, it can be made clear how Hobbes understood the crucial application of this conception to natural law, identified as 'right reason'.
A fascinating study of moral languages and their discontents, Ethics after Babel explains the links that connect contemporary moral philosophy, religious ethics, and political thought in clear, cogent, even conversational prose. Princeton's paperback edition of this award-winning book includes a new postscript by the author that responds to the book's noted critics, Stanley Hauerwas and the late Alan Donagan. In answering his critics, Jeffrey Stout clarifies the book's arguments and offers fresh reasons for resisting despair over the prospects of democratic discourse.
The Epistemic Objection says that certain theories of time imply that it is impossible to know which time is absolutely present. Standard presentations of the Epistemic Objection are elliptical—and some of the most natural premises one might fill in to complete the argument end up leading to radical skepticism. But there is a way of filling in the details which avoids this problem, using epistemic safety. The new version has two interesting upshots. First, while Ross Cameron alleges that the Epistemic Objection applies to presentism as much as to theories like the growing block, the safety version does not overgeneralize this way. Second, the Epistemic Objection does generalize in a different, overlooked way. The safety objection is a serious problem for a widely held combination of views: “propositional temporalism” together with “metaphysical eternalism”.
Belief in propositions has had a long and distinguished history in analytic philosophy. Three of the founding fathers of analytic philosophy, Gottlob Frege, Bertrand Russell, and G. E. Moore, believed in propositions. Many philosophers since then have shared this belief; and the belief is widely, though certainly not universally, accepted among philosophers today. Among contemporary philosophers who believe in propositions, many, and perhaps even most, take them to be structured entities with individuals, properties, and relations as constituents. For example, the proposition that Glenn loves Tracy has Glenn, the loving relation, and Tracy as constituents. What is it, then, that binds these constituents together and imposes structure on them? And if the proposition that Glenn loves Tracy is distinct from the proposition that Tracy loves Glenn yet both have the same constituents, what is it about the way these constituents are structured or bound together that makes them two different propositions? In The Nature and Structure of Content, Jeffrey C. King formulates a detailed account of the metaphysical nature of propositions, and provides fresh answers to the above questions. In addition to explaining what it is that binds together the constituents of structured propositions and imposes structure on them, King deals with some of the standard objections to accounts of propositions: he shows that there is no mystery about what propositions are; that given certain minimal assumptions, it follows that they exist; and that on his approach, we can see how and why propositions manage to have truth conditions and represent the world as being a certain way. The Nature and Structure of Content also contains a detailed account of the nature of tense and modality, and provides a solution to the paradox of analysis. Scholars and students working in the philosophy of mind and language will find this book rewarding reading.
In 1929 Ernst Cassirer and Martin Heidegger participated in a momentous debate in Davos, Switzerland, which is widely held to have marked an important division in twentieth-century European thought. Peter E. Gordon’s recent book, Continental Divide: Heidegger, Cassirer, Davos, centers on this debate between these two philosophical adversaries. In his book Gordon examines the background of the debate, the issues that distinguished the respective positions of Cassirer and Heidegger, and the legacy of the debate for later decades. Throughout the work, Gordon concisely portrays the source of disagreement between the two adversaries in terms of a difference between Cassirer’s philosophy of spontaneity and Heidegger’s philosophy of receptivity, or of “thrownness,” into a situation that finite human beings can never hope to master. Although this review essay recognizes that Gordon's work provides an important contribution to our understanding of the Davos debate and to twentieth-century European thought, it subjects his manner of interpreting the distinction between Cassirer and Heidegger to critical scrutiny. Its purpose is to examine the possibility that important aspects of the debate, which do not conform to the grid imposed by Gordon’s interpretation, might have been set aside in the context of his analysis.
How should a group with different opinions (but the same values) make decisions? In a Bayesian setting, the natural question is how to aggregate credences: how to use a single credence function to naturally represent a collection of different credence functions. An extension of the standard Dutch-book arguments that apply to individual decision-makers recommends that group credences should be updated by conditionalization. This imposes a constraint on what aggregation rules can be like. Taking conditionalization as a basic constraint, we gather lessons from the established work on credence aggregation, and extend this work with two new impossibility results. We then explore contrasting features of two kinds of rules that satisfy the constraints we articulate: one kind uses fixed prior credences, and the other uses geometric averaging, as opposed to arithmetic averaging. We also prove a new characterisation result for geometric averaging. Finally we consider applications to neighboring philosophical issues, including the epistemology of disagreement.
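As a rough illustration of the contrast between the two kinds of pooling rules mentioned in the abstract, the following is a minimal Python sketch with toy numbers and function names of my own, not the paper's formalism. It contrasts arithmetic and geometric averaging of two credence functions and checks how each interacts with conditionalization: the geometric rule gives the same answer whether the group pools first and updates second or the reverse, while the arithmetic rule in general does not.

```python
# Toy sketch: arithmetic vs geometric aggregation of two credence functions over
# three exclusive hypotheses, and how each interacts with conditionalization.
# Numbers and names are illustrative assumptions, not the paper's own.

def arithmetic_pool(p, q, w=0.5):
    """Weighted linear average of two credence functions (dicts over hypotheses)."""
    return {h: w * p[h] + (1 - w) * q[h] for h in p}

def geometric_pool(p, q, w=0.5):
    """Weighted geometric average, renormalized so the values sum to one."""
    raw = {h: (p[h] ** w) * (q[h] ** (1 - w)) for h in p}
    total = sum(raw.values())
    return {h: v / total for h, v in raw.items()}

def conditionalize(p, evidence):
    """Bayesian conditionalization on the event `evidence` (a set of hypotheses)."""
    total = sum(p[h] for h in evidence)
    return {h: (p[h] / total if h in evidence else 0.0) for h in p}

alice = {"h1": 0.6, "h2": 0.3, "h3": 0.1}
bob   = {"h1": 0.2, "h2": 0.5, "h3": 0.3}
E = {"h1", "h2"}

# Pool-then-update vs update-then-pool: the geometric rule agrees either way;
# the arithmetic rule generally does not.
print(conditionalize(geometric_pool(alice, bob), E))
print(geometric_pool(conditionalize(alice, E), conditionalize(bob, E)))
print(conditionalize(arithmetic_pool(alice, bob), E))
print(arithmetic_pool(conditionalize(alice, E), conditionalize(bob, E)))
```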
For nearly half a century, Quentin Skinner has been the world's foremost interpreter of Thomas Hobbes. When the contextualist mode of intellectual history now known as the “Cambridge School” was first asserting itself in the 1960s, the life and writings of John Locke were the primary topic for pioneers such as Peter Laslett and John Dunn. At that time, Hobbes was still the plaything of philosophers and political scientists, virtually all of whom wrote in an ahistorical, textual-analytic manner. Hobbes had not been the subject of serious contextual research for decades, since the foundational writings of Ferdinand Tönnies. For Skinner, he was thus an ideal subject, providing a space for original research on a major figure, and an occasion for some polemically charged methodological manifestos. Both of these purposes animated his 1965 article “History and Ideology in the English Revolution,” and his 1966 article “The Ideological Context of Hobbes's Political Thought”. The latter of these remains to this day one of the most widely cited scholarly articles in the fifty-year run of Cambridge's Historical Journal. Among other results of these early efforts was the scholarly controversy during which Howard Warrender chided Skinner for having reduced the “classic texts in political philosophy” to mere “tracts for the times”.
We prove a representation theorem for preference relations over countably infinite lotteries that satisfy a generalized form of the Independence axiom, without assuming Continuity. The representing space consists of lexicographically ordered transfinite sequences of bounded real numbers. This result is generalized to preference orders on abstract superconvex spaces.
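Since lexicographic orderings may be unfamiliar, here is a purely illustrative Python sketch of lexicographic comparison on finite initial segments of utility sequences. The paper's representing space uses lexicographically ordered transfinite sequences of bounded reals, which a finite example like this can only gesture at; the vectors below are made-up assumptions.

```python
# Toy lexicographic comparison of finite utility vectors: earlier coordinates
# dominate later ones, so no later gain can compensate an earlier loss.

def lex_prefer(u, v):
    """Return True if u is lexicographically preferred (strictly) to v."""
    for a, b in zip(u, v):
        if a != b:
            return a > b
    return False  # equal on the compared initial segment

u = [1.0, 0.0, 0.0]
v = [0.9, 100.0, 100.0]
print(lex_prefer(u, v))  # True: the first coordinate settles the comparison
```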
David Lewis holds that a single possible world can provide more than one way things could be. But what are possible worlds good for if they come apart from ways things could be? We can make sense of this if we go in for a metaphysical understanding of what the world is. The world does not include everything that is the case—only the genuine facts. Understood this way, Lewis's “cheap haecceitism” amounts to a kind of metaphysical anti-haecceitism: it says there aren't any genuine facts about individuals over and above their qualitative roles.
This chapter presents the main formalism of the book, which is used in subsequent chapters to describe a variety of concepts in Husserlian phenomenology, and thereby unify them. A dynamical systems approach to Husserl is introduced, and several dynamical laws of Husserlian phenomenology are described. The first is an expectation rule according to which expectations are determined by what a person knows, sees, and does. The second is a learning rule according to which background knowledge is updated in a specific way when experiences fulfill or frustrate prior expectations. In addition to these rules, a “supervenience function” is described, which associates how a thing is seen with a “trail set,” the set of ways that thing is expected to look, relative to all possible ways of moving around it. This function further illustrates the explanatory dimension of phenomenology described in Chap. 2, whereby how we immanently experience things is determined by how we expect them to look relative to counterfactual movement patterns.
Philosophy, science, and common sense all refer to propositions--things we believe and say, and things which are true or false. But there is no consensus on what sorts of things these entities are. Jeffrey C. King, Scott Soames, and Jeff Speaks argue that commitment to propositions is indispensable, and each defends his own view in the debate.
I examine three ‘anti-object’ metaphysical views: nihilism, generalism, and anti-quantificationalism. After setting aside nihilism, I argue that generalists should be anti-quantificationalists. Along the way, I attempt to articulate what a ‘metaphysically perspicuous’ language might even be.
Resolving the Vexing Question of Credentialing: Finding the Aristotelian Mean. Journal article, HEC Forum, Volume 21, Number 3, pp. 263-273. DOI 10.1007/s10730-009-9100-2. Author: Jeffrey P. Spike, Director of the Campus Wide Ethics Program, Center for Health, Humanities, and the Human Spirit, University of Texas Health Science Center at Houston, 6431 Fannin, JJL 400, Houston, Texas 77030, USA. Online ISSN 1572-8498; Print ISSN 0956-2737.
Over the last two decades, scientific accounts of religion have received a great deal of scholarly and popular attention both because of their intrinsic interest and because they are widely seen as constituting a threat to the religion they analyse. The Believing Primate aims to describe and discuss these scientific accounts as well as to assess their implications. The volume begins with essays by leading scientists in the field, describing these accounts and discussing evidence in their favour. Philosophical and theological reflections on these accounts follow, offered by leading philosophers, theologians, and scientists. This diverse group of scholars addresses some fascinating underlying questions: Do scientific accounts of religion undermine the justification of religious belief? Do such accounts show religion to be an accidental by-product of our evolutionary development? And, whilst we seem naturally disposed toward religion, would we fare better or worse without it? Bringing together dissenting perspectives, this provocative collection will serve to freshly illuminate ongoing debate on these perennial questions.
“There are no gaps in logical space,” David Lewis writes, giving voice to a sentiment shared by many philosophers. But different natural ways of trying to make this sentiment precise turn out to conflict with one another. One is a *pattern* idea: “Any pattern of instantiation is metaphysically possible.” Another is a *cut and paste* idea: “For any objects in any worlds, there exists a world that contains any number of duplicates of all of those objects.” We use resources from model theory to show the inconsistency of certain packages of combinatorial principles and the consistency of others.
Like many, though of course not all, philosophers, I believe in propositions. I take propositions to be structured, sentence-like entities whose structures are identical to the syntactic structures of the sentences that express them; and I have defended a particular version of such a view of propositions elsewhere. In the present work, I shall assume that the structures of propositions are at least very similar to the structures of the sentences that express them. Further, I shall assume that ordinary names are devices of direct reference and contribute only their bearers to propositions, that n-place predicates contribute n-place properties or relations to propositions, and that verbs of propositional attitude contribute to propositions two-place relations between individuals and propositions. The broad outline of a framework that includes these assumptions is one that I think many, though again not all, philosophers of language find congenial. I am concerned here to investigate and explain, from the standpoint of this framework, a puzzling phenomenon. The explanation I give of the phenomenon could be adapted to fit with frameworks somewhat different from the one adopted here. I adopt the present framework in part simply for definiteness.
Richard Jeffrey is beyond dispute one of the most distinguished and influential philosophers working in the field of decision theory and the theory of knowledge. His work is distinctive in showing the interplay of epistemological concerns with probability and utility theory. Not only has he made use of standard probabilistic and decision theoretic tools to clarify concepts of evidential support and informed choice, he has also proposed significant modifications of the standard Bayesian position in order that it provide a better fit with actual human experience. Probability logic is viewed not as a source of judgment but as a framework for explaining the implications of probabilistic judgments and their mutual compatibility. This collection of essays spans a period of some 35 years and includes what have become some of the classic works in the literature. There is also one completely new piece, while in many instances Jeffrey includes afterthoughts on the older essays.
“Pragmatic encroachers” about knowledge generally advocate two ideas: (1) you can rationally act on what you know; (2) knowledge is harder to achieve when more is at stake. Charity Anderson and John Hawthorne have recently argued that these two ideas may not fit together so well. I extend their argument by working out what “high stakes” would have to mean for the two ideas to line up, using decision theory.
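To make the stakes idea concrete, here is a hedged toy expected-utility calculation in Python, with numbers of my own rather than the author's or Anderson and Hawthorne's. It shows the tension the abstract describes: with the same credence in p, acting on p maximizes expected utility at low stakes but not at high stakes.

```python
# Toy decision problem: act on p (payoff 1 if p is true, lose `stakes` if false)
# versus a cautious act normalized to utility 0. Values are illustrative only.

def expected_utility(credence_p, act_on_p, stakes):
    """Expected utility of acting on p (or not) given a credence and a penalty for error."""
    if act_on_p:
        return credence_p * 1 + (1 - credence_p) * (-stakes)
    return 0.0

credence = 0.95
for stakes in (1, 100):          # low-stakes vs high-stakes penalty for error
    act = expected_utility(credence, True, stakes)
    hold = expected_utility(credence, False, stakes)
    print(stakes, act > hold)    # True at stakes=1, False at stakes=100
```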
Neuropsychological research on the neural basis of behaviour generally posits that brain mechanisms will ultimately suffice to explain all psychologically described phenomena. This assumption stems from the idea that the brain is made up entirely of material particles and fields, and that all causal mechanisms relevant to neuroscience can therefore be formulated solely in terms of properties of these elements. Thus, terms having intrinsic mentalistic and/or experiential content (e.g. ‘feeling’, ‘knowing’ and ‘effort’) are not included as primary causal factors. This theoretical restriction is motivated primarily by ideas about the natural world that have been known to be fundamentally incorrect for more than three-quarters of a century. Contemporary basic physical theory differs profoundly from classic physics on the important matter of how the consciousness of human agents enters into the structure of empirical phenomena. The new principles contradict the older idea that local mechanical processes alone can account for the structure of all observed empirical data. Contemporary physical theory brings directly and irreducibly into the overall causal structure certain psychologically described choices made by human agents about how they will act. This key development in basic physical theory is applicable to neuroscience, and it provides neuroscientists and psychologists with an alternative conceptual framework for describing neural processes. Indeed, owing to certain structural features of ion channels critical to synaptic function, contemporary physical theory must in principle be used when analysing human brain dynamics. The new framework, unlike its classic-physics-based predecessor, is erected directly upon, and is compatible with, the prevailing principles of physics. It is able to represent more adequately than classic concepts the neuroplastic mechanisms relevant to the growing number of empirical studies of the capacity of directed attention and mental effort to systematically alter brain function.
Could space consist entirely of extended regions, without any regions shaped like points, lines, or surfaces? Peter Forrest and Frank Arntzenius have independently raised a paradox of size for space like this, drawing on a construction of Cantor’s. I present a new version of this argument and explore possible lines of response.
Retributive restrictions are principles of justice according to which what a criminal deserves on account of his individual conduct and character restricts how states are morally permitted to treat him. The main arguments offered in defense of retributive restrictions involve thought experiments in which the state punishes the innocent, a practice known as telishment. In order to derive retributive restrictions from the wrongness of telishment, one must engage in moral argument from generalization. I show how generalization arguments of the same form can be used subversively to derive morally unacceptable conclusions from other scenarios in which the state intentionally inflicts undeserved coercion. For example, our considered moral convictions approve of punishment policies that inflict collateral damage, such as the ubiquitous policy of excluding the family members of inmates from prison facilities outside visiting hours. I present a generalization argument for the conclusion that these policies are seriously unjust. If we firmly believe that these policies are not unjust, then we should put less stock in generalization arguments. We should not use them to support retributive restrictions. This conclusion has broad implications for the theory and practice of criminal justice.
That believing truly as a matter of luck does not generally constitute knowing has become an epistemic commonplace. Accounts of knowledge incorporating this anti-luck idea frequently rely on either a safety condition or a sensitivity condition. Sensitivity-based accounts of knowledge have a well-known problem with necessary truths, to wit, that any believed necessary truth trivially counts as knowledge on such accounts. In this paper, we argue that safety-based accounts similarly trivialize knowledge of necessary truths and that two ways of responding to this problem for safety, issuing from work by Williamson and Pritchard, are of dubious success.
What is it to "value" something, in the semi-technical sense of the term that Gary Watson establishes? I argue that valuing something consists in caring about it. Caring involves not only emotional dispositions of the sort that Agnieszka Jaworska has elaborated, but also a distinctive cognitive disposition – namely, a (defeasible) disposition to believe the object cared about to be a source of agent-relative reasons for action and for emotion. Understood in this way, an agent's carings have a stronger claim (...) to "speak for" her as her values than do other attitudes that have been proposed for this role. In particular, an agent's carings establish more robust psychological continuities and cross-temporal connections than do self-governing policies of the sort that Michael Bratman has described; and they forge diachronic coherence not just in her deliberation and action, as self-governing policies do, but also in her cognitive and emotional life. An agent's carings thus help to constitute her identity as a temporally persisting subject . Self-governing policies are at best ersatz -values, which an agent may choose to adopt when she finds that her proper values – her cares – leave her course underdetermined. (shrink)
Famous results by David Lewis show that plausible-sounding constraints on the probabilities of conditionals or evaluative claims lead to unacceptable results, by standard probabilistic reasoning. Existing presentations of these results rely on stronger assumptions than they really need. When we strip these arguments down to a minimal core, we can see both how certain replies miss the mark, and also how to devise parallel arguments for other domains, including epistemic “might,” probability claims, claims about comparative value, and so on. A popular reply to Lewis's results is to claim that conditional claims, or claims about subjective value, lack truth conditions. For this strategy to have a chance of success, it needs to give up basic structural principles about how epistemic states can be updated—in a way that is strikingly parallel to the commitments of the project of dynamic semantics.
Suppose that several individuals who have separately assessed prior probability distributions over a set of possible states of the world wish to pool their individual distributions into a single group distribution, while taking into account jointly perceived new evidence. They have the option of first updating their individual priors and then pooling the resulting posteriors or first pooling their priors and then updating the resulting group prior. If the pooling method that they employ is such that they arrive at the same final distribution in both cases, the method is said to be externally Bayesian, a property first studied by Madansky. We show that a pooling method for discrete distributions is externally Bayesian if and only if it commutes with Jeffrey conditioning, parameterized in terms of certain ratios of new to old odds, as in Wagner, rather than in terms of the posterior probabilities of members of the disjoint family of events on which such conditioning originates.
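The following Python sketch is an illustrative check, with made-up numbers rather than the paper's own examples, of the externally Bayesian property described here. Jeffrey conditioning on a two-cell partition is parameterized by a ratio of new to old odds (a Bayes factor) on the cell E, and with geometric pooling the group arrives at the same distribution whether it pools first and updates second or updates first and pools second.

```python
# Sketch: Jeffrey conditioning specified by a new-to-old odds ratio `beta` on a
# partition cell E, plus a check that geometric pooling commutes with it.
# Numbers and function names are illustrative assumptions.

def jeffrey_update(p, E, beta):
    """Multiply the credence of each hypothesis in E by beta, then renormalize.
    This is Jeffrey conditioning on {E, not-E} given the odds ratio beta,
    rather than a fixed posterior probability for E."""
    raw = {h: p[h] * (beta if h in E else 1.0) for h in p}
    total = sum(raw.values())
    return {h: v / total for h, v in raw.items()}

def geometric_pool(p, q, w=0.5):
    """Weighted geometric average of two distributions, renormalized."""
    raw = {h: (p[h] ** w) * (q[h] ** (1 - w)) for h in p}
    total = sum(raw.values())
    return {h: v / total for h, v in raw.items()}

alice = {"h1": 0.6, "h2": 0.3, "h3": 0.1}
bob   = {"h1": 0.2, "h2": 0.5, "h3": 0.3}
E, beta = {"h1", "h2"}, 3.0   # shared evidence: the odds on E triple

pool_then_update = jeffrey_update(geometric_pool(alice, bob), E, beta)
update_then_pool = geometric_pool(jeffrey_update(alice, E, beta),
                                  jeffrey_update(bob, E, beta))
print(pool_then_update)
print(update_then_pool)       # identical up to rounding: externally Bayesian
```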
In this original and compelling book, Jeffrey P. Bishop, a philosopher, ethicist, and physician, argues that something has gone sadly amiss in the care of the dying by contemporary medicine and in our social and political views of death, as shaped by our scientific successes and ongoing debates about euthanasia and the "right to die"--or to live. __The Anticipatory Corpse: Medicine, Power, and the Care of the Dying__, informed by Foucault's genealogy of medicine and power as well as by a thorough grasp of current medical practices and medical ethics, argues that a view of people as machines in motion--people as, in effect, temporarily animated corpses with interchangeable parts--has become epistemologically normative for medicine. The dead body is subtly anticipated in our practices of exercising control over the suffering person, whether through technological mastery in the intensive care unit or through the impersonal, quasi-scientific assessments of psychological and spiritual "medicine." The result is a kind of nihilistic attitude toward the dying, and troubling contradictions and absurdities in our practices. Wide-ranging in its examples, from organ donation rules in the United States, to ICU medicine, to "spiritual surveys," to presidential bioethics commissions attempting to define death, and to high-profile cases such as Terri Schiavo's, __The Anticipatory Corpse__ explores the historical, political, and philosophical underpinnings of our care of the dying and, finally, the possibilities of change. A ground-breaking work in bioethics, this book will provoke thought and argument for all those engaged in medicine, philosophy, theology, and health policy. "With extraordinary philosophical sophistication as well as knowledge of modern medicine, Bishop argues that the body that shapes the work of modern medicine is a dead body. He defends this claim decisively and with urgency. I know of no book that is at once more challenging and informative than __The Anticipatory Corpse__. To say this book is the most important one written in the philosophy of medicine in the last twenty-five years would not do it justice. This book is destined to change the way we think and, hopefully, practice medicine." --_Stanley Hauerwas, Duke Divinity School_ "Jeffrey Bishop carefully builds a detailed, scholarly case that medicine is shaped by its attitudes toward death. Clinicians, ethicists, medical educators, policy makers, and administrators need to understand the fraught relationship between clinical practices and death, and __The Anticipatory Corpse__ is an essential text. Bishop's use of the writings of Michel Foucault is especially provocative and significant. This book is the closest we have to a genealogy of death." --_Arthur W. Frank, University of Calgary_ "Jeffrey Bishop has produced a masterful study of how the living body has been placed within medicine's metaphysics of efficient causality and within its commitment to a totalizing control of life and death, which control has only been strengthened by medicine's taking on the mantle of a bio-psycho-socio-spiritual model. This volume's treatment of medicine's care of the dying will surely be recognized as a cardinal text in the philosophy of medicine." --_H. Tristram Engelhardt, Jr., Rice University, Baylor College of Medicine_.
The counterpart theorist has a problem: there is no obvious way to understand talk about actuality in terms of counterparts. Fara and Williamson have charged that this obstacle cannot be overcome. Here I defend the counterpart theorist by offering systematic interpretations of a quantified modal language that includes an actuality operator. Centrally, I disentangle the counterpart relation from a related notion, a ‘representation relation’. The relation of possible things to the actual things they represent is variable, and an adequate account of modal language must keep track of the way it is systematically shifted by modal operators. I apply my account to resolve several puzzles about counterparts and actuality. In technical appendices, I prove some important logical results about this ‘representational’ counterpart system and its relationship to other modal systems.
Suppose that all non-qualitative facts are grounded in qualitative facts. I argue that this view naturally comes with a picture in which trans-world identity is indeterminate. But this in turn leads to either pervasive indeterminacy in the non-qualitative, or else contingency in what facts about modality and possible worlds are determinate.
Some hold that the lesson of Russell’s paradox and its relatives is that mathematical reality does not form a ‘definite totality’ but rather is ‘indefinitely extensible’. There can always be more sets than there ever are. I argue that certain contact puzzles are analogous to Russell’s paradox this way: they similarly motivate a vision of physical reality as iteratively generated. In this picture, the divisions of the continuum into smaller parts are ‘potential’ rather than ‘actual’. Besides the intrinsic interest of this metaphysical picture, it has important consequences for the debate over absolute generality. It is often thought that ‘indefinite extensibility’ arguments at best make trouble for mathematical platonists; but the contact arguments show that nominalists face the same kind of difficulty, if they recognize even the metaphysical possibility of the picture I sketch.
Two distinguished social and political philosophers take opposing positions in this highly engaging work. Louis P. Pojman justifies the practice of execution by appealing to the principle of retribution, while Jeffrey Reiman argues that although the death penalty is a just punishment for murder, we are not morally obliged to execute murderers.
Jeffrey E. Brower presents and explains the hylomorphic conception of the material world developed by Thomas Aquinas, according to which material objects are composed of both matter and form. In addition to presenting and explaining Aquinas's views, Brower seeks wherever possible to bring them into dialogue with the best recent literature on related topics. Along the way, he highlights the contribution that Aquinas's views make to a host of contemporary metaphysical debates, including the nature of change, composition, material constitution, the ontology of stuff vs. things, the proper analysis of ordinary objects, the truthmakers for essential vs. accidental predication, and the metaphysics of property possession.
Data science, and the related field of big data, is an emerging discipline involving the analysis of data to solve problems and develop insights. This rapidly growing domain promises many benefits to both consumers and businesses. However, the use of big data analytics can also introduce many ethical concerns, stemming from, for example, the possible loss of privacy or the harming of a sub-category of the population via a classification algorithm. To help address these potential ethical challenges, this paper maps and describes the main ethical themes that were identified via systematic literature review. It then identifies a possible structure to integrate these themes within a data science project, thus helping to provide some structure in the on-going debate with respect to the possible ethical situations that can arise when using data science analytics.
Richard Jeffrey's generalization of Bayes' rule of conditioning follows, within the theory of belief functions, from Dempster's rule of combination and the rule of minimal extension. Both Jeffrey's rule and the theory of belief functions can and should be construed constructively, rather than normatively or descriptively. The theory of belief functions gives a more thorough analysis of how beliefs might be constructed than Jeffrey's rule does. The inadequacy of Bayesian conditioning is much more general than Jeffrey's examples of uncertain perception might suggest. The "parameter α" that Hartry Field has introduced into Jeffrey's rule corresponds to the "weight of evidence" of the theory of belief functions.
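For readers unfamiliar with Jeffrey's generalization of Bayes' rule, here is a small numerical Python sketch of Jeffrey conditioning on a two-cell partition, with numbers of my own chosen loosely in the spirit of Jeffrey's uncertain-perception examples. Ordinary Bayesian conditioning is recovered as the special case in which the new probability of the evidence cell is 1.

```python
# Jeffrey conditioning on the partition {E, not-E}: instead of learning E for
# certain, the agent's probability for E shifts to a new value q_E, and credences
# within each cell are rescaled proportionally. Numbers are illustrative only.

def jeffrey_rule(p_A_given_E, p_A_given_notE, q_E):
    """P_new(A) = q_E * P(A|E) + (1 - q_E) * P(A|not-E)."""
    return q_E * p_A_given_E + (1 - q_E) * p_A_given_notE

p_A_given_E, p_A_given_notE = 0.8, 0.2
print(jeffrey_rule(p_A_given_E, p_A_given_notE, q_E=0.7))  # uncertain perception: 0.62
print(jeffrey_rule(p_A_given_E, p_A_given_notE, q_E=1.0))  # ordinary conditioning: 0.8
```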