This brief paperback is designed for symbolic/formal logic courses. It features the tree method proof system developed by Jeffrey. The new edition contains many more examples and exercises and is reorganized for greater accessibility.
Richard Jeffrey is beyond dispute one of the most distinguished and influential philosophers working in the field of decision theory and the theory of knowledge. His work is distinctive in showing the interplay of epistemological concerns with probability and utility theory. Not only has he made use of standard probabilistic and decision-theoretic tools to clarify concepts of evidential support and informed choice, he has also proposed significant modifications of the standard Bayesian position in order that it provide a better fit with actual human experience. Probability logic is viewed not as a source of judgment but as a framework for explaining the implications of probabilistic judgments and their mutual compatibility. This collection of essays spans a period of some 35 years and includes what have become some of the classic works in the literature. There is also one completely new piece, while in many instances Jeffrey includes afterthoughts on the older essays.
The misunderstanding of philosopher Immanuel Kant's principle of morality - the categorical imperative - by journalism professionals, professors, and students comes in many forms. To better understand Kant's ethical theory, however, one must go beyond Kant's Groundwork for the Metaphysics of Morals and study his Doctrine of Virtue: Part 2 of The Metaphysics of Morals; to apply the categorical imperative, one must also understand the importance Kant placed on moral education.
"To some people, life is very simple . . . no shadings and grays, all blacks and whites. . . . Now, others of us find that good, bad, right, wrong, are many-sided, complex things. We try to see every side; but the more we see, the less sure we are."
We review three possible theoretical mechanisms for the placebo effect: conditioning, expectancy, and endogenous opiates, and consider the implications of the first two for clinical research and practice in the area of pain management. Methodological issues in the use of placebos as controls are discussed, including subtractive versus additive expectancy effects, no-treatment controls, active placebo controls, the balanced placebo design, between- versus within-group designs, triple-blind methodology, and the double expectancy design. Therapeutically, the possibility of shaping negative placebo responses through placebo sag, overservicing, and the use of placebos on their own is explored. Suggestions for using conditioned placebos strategically in conjunction with nonplacebos are made, and ways of maximizing the placebo component of nonplacebo treatments are examined. Finally, the importance of investigating the placebo effect in its own right is advocated in order to better understand the long-neglected psychological aspects of the therapeutic transaction.
From a point of view like de Finetti's, what is the judgmental reality underlying the objectivistic claim that a physical magnitude X determines the objective probability that a hypothesis H is true? When you have definite conditional judgmental probabilities for H given the various unknown values of X, a plausible answer is sufficiency, i.e., invariance of those conditional probabilities as your probability distribution over the values of X varies. A different answer, in terms of conditional exchangeability, is offered for use when such definite conditional probabilities are absent.
Computer simulation has become important in ecological modeling, but there have been few assessments of how complex simulation models differ from more traditional analytic models. In Part I of this paper, I review the challenges faced in complex ecological modeling and how models have been used to gain theoretical purchase for understanding natural systems. I compare the use of traditional analytic models and computer simulation models and point out that the two methods require different kinds of practical engagement. I examine a case study of three models from the insect resistance literature in transgenic crops to illustrate and explore differences in analytic and computer simulation models. I argue that analyzing simulation models has often been inappropriately managed with expectations derived from handling analytic models. In Part II, I look at simulation as a hermeneutic practice. I argue that simulation models are a practice or techné. I then explore five aspects of philosophical hermeneutics that may be useful in complex ecological simulation: (1) an openness to multiple perspectives, allowing multiple levels of scientific pluralism; (2) the hermeneutic circle, a back and forth in active communication among both modelers and ecologists; (3) the recognition of human factors and the nature of human practices as such, including recognizing the role of judgments and choices in the modeling enterprise; (4) the importance of play in modeling; and (5) the non-closed nature of hermeneutic engagement, continued dialogue, and recognizing the situatedness, incompleteness, and tentative nature of simulation models.
Isaac Levi and I have different views of probability and decision making. Here, without addressing the merits, I will try to answer some questions recently asked by Levi (1985) about what my view is, and how it relates to his.
The approach to decision theory floated in my 1965 book is reviewed (I), challenged in various related ways (II–V) and defended, first ad hoc (II–IV) and then by a general argument of Ellery Eells's (VI). Finally, causal decision theory (in a version sketched in VII) is exhibited as a special case of my 1965 theory, according to the Eellsian argument.
Logicism Lite counts number‐theoretical laws as logical for the same sort of reason for which physical laws are counted as empirical: because of the character of the data they are responsible to. In the case of number theory these are the data verifying or falsifying the simplest equations, which Logicism Lite counts as true or false depending on the logical validity or invalidity of first‐order argument forms in which no number‐theoretical notation appears.
Jonathan Weisberg has argued that Jeffrey Conditioning (JC) is inherently “anti-holistic.” By this he means, inter alia, that JC does not allow us to take proper account of after-the-fact defeaters for our beliefs. His central example concerns the discovery that the lighting in a room is red-tinted and the relationship of that discovery to the belief that a jelly bean in the room is red. Weisberg’s argument that the rigidity required for JC blocks the defeating role of the red-tinted light rests on the strong assumption that all posteriors within the distribution in this example are rigid on a partition over the proposition that the jelly bean is actually red. But individual JC updates of propositions do not require such a broad rigidity assumption. Jeffrey conditionalizers should consider the advantages of a modest project of targeted updating focused on particular propositions rather than seeking to update the entire distribution using one obvious partition. Although Weisberg’s example fails to show JC to be irrelevant or useless, other problems he raises for JC (the commutativity and inputs problems) remain and actually become more pressing when we recognize the important role of background information.
This paper discusses simultaneous belief updates. I argue that modeling such belief updates using the Principle of Minimum Information can be regarded as applying Jeffrey conditionalization successively, and that, contrary to what many probabilists have thought, simultaneous belief updates can be successfully modeled by means of Jeffrey conditionalization.
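The Jeffrey conditionalization rule that several of these abstracts rely on can be sketched concretely. In this toy example (the world states, partition, new probabilities, and function name are illustrative assumptions, not taken from any of the papers above), a partition {E_i} receives new probabilities q_i, and each proposition A is updated by P_new(A) = Σ_i q_i · P_old(A | E_i):

```python
def jeffrey_update(prior, partition, new_probs):
    """Jeffrey conditionalization over a partition of world states.

    prior: dict mapping world states to prior probabilities.
    partition: list of sets of world states (the cells E_i).
    new_probs: new probability q_i assigned to each cell.
    """
    posterior = {}
    for cell, q in zip(partition, new_probs):
        cell_mass = sum(prior[w] for w in cell)
        for w in cell:
            # Rigidity: within each cell, relative probabilities are unchanged;
            # the cell's total mass is simply rescaled to q.
            posterior[w] = q * prior[w] / cell_mass
    return posterior

# Toy example: four equiprobable worlds, two cells; uncertain evidence shifts
# the first cell's probability from 0.5 to 0.8.
prior = {"w1": 0.25, "w2": 0.25, "w3": 0.25, "w4": 0.25}
partition = [{"w1", "w2"}, {"w3", "w4"}]
post = jeffrey_update(prior, partition, [0.8, 0.2])
# post["w1"] == 0.4 and post["w3"] == 0.1; the posterior still sums to 1.
```

Strict conditionalization is the special case where one cell gets probability 1, which is why JC is the natural generalization discussed in the abstracts above.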
Bayesian decision theory can be viewed as the core of psychological theory for idealized agents. To get a complete psychological theory for such agents, you have to supplement it with input and output laws. On a Bayesian theory that employs strict conditionalization, the input laws are easy to give. On a Bayesian theory that employs Jeffrey conditionalization, there appears to be a considerable problem with giving the input laws. However, Jeffrey conditionalization can be reformulated so that the problem disappears, and in fact the reformulated version is more natural and easier to work with on independent grounds.
Since the beginning of the eighties of the present century, a circle of relatively young American sociologists who are followers of Jeffrey Alexander has been making energetic and spectacular efforts to supply sociology with a uniform and comprehensive theoretical framework by continuing Talcott Parsons' lifework. The present article is an appreciation of Alexander's achievements in the justification of a general sociological theory (especially a theory of action and social order), while pointing to objections that can be raised against the character of his theory. A scrutiny of Alexander's metatheoretical deliberations and of his interpretations of sociological classics such as Marx, Durkheim, Weber, and Parsons reveals that Alexander's metatheoretical frame is not flexible enough to actually reconstruct the problem situation of the classics. Pointers are given toward a theory of action that is not subject to the antinomy of utilitarianism and normativism, so that it is more adequate and appropriate to the heritage of the sociological classics, both from a theoretical and an interpretative angle.