Part 1: Background on de Finetti's twin criteria of coherence. Coherence1: two-sided previsions free from dominance through a Book. Coherence2: forecasts free from dominance under the Brier (squared-error) score. Part 2: IP theory based on a scoring rule.
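The second criterion can be checked numerically. Below is a minimal sketch (my own example, not part of the outline above): a forecast pair for A and not-A that sums to more than 1 is incoherent, and its projection onto the probability simplex dominates it under Brier score in every state.

```python
# Numeric check of Coherence2: an incoherent forecast is dominated,
# under Brier (squared-error) score, in every state of the world.
def brier(forecast, state):
    """Sum of squared errors; forecast = (p(A), p(not-A)); state = 1 if A occurs."""
    indicators = (state, 1 - state)
    return sum((f - i) ** 2 for f, i in zip(forecast, indicators))

incoherent = (0.6, 0.6)  # previsions for A and not-A sum to 1.2: incoherent
coherent = (0.5, 0.5)    # its projection onto the probability simplex

for state in (1, 0):
    # coherent scores 0.50 in each state; incoherent scores 0.52
    assert brier(coherent, state) < brier(incoherent, state)
```

The numbers are illustrative only; de Finetti's result is that *every* incoherent forecast is dominated this way by some coherent one.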
Force Fields collects the recent essays of Martin Jay, an intellectual historian and cultural critic internationally known for his extensive work on the history of Western Marxism and the intellectual migration from Germany to America.
Experimenters sometimes insist that it is unwise to examine data before determining how to analyze them, as it creates the potential for biased results. I explore the rationale behind this methodological guideline from the standpoint of an error statistical theory of evidence, and I discuss a method of evaluating evidence in some contexts when this predesignation rule has been violated. I illustrate the problem of potential bias, and the method by which it may be addressed, with an example from the search for the top quark. A point in favor of the error statistical theory is its ability, demonstrated here, to explicate such methodological problems and suggest solutions, within the framework of an objective theory of evidence.
When real-valued utilities for outcomes are bounded, or when all variables are simple, it is consistent with expected utility to have preferences defined over probability distributions or lotteries. That is, under such circumstances two variables with a common probability distribution over outcomes – equivalent variables – occupy the same place in a preference ordering. However, if strict preference respects uniform, strict dominance in outcomes between variables, and if indifference between two variables entails indifference between their difference and the status quo, then preferences over rich sets of unbounded variables, such as variables used in the St. Petersburg paradox, cannot preserve indifference between all pairs of equivalent variables. In such circumstances, preference is not a function only of probability and utility for outcomes. Then the preference ordering is not defined in terms of lotteries.
Customer orientation (CO) and the development of long-term relationships with customers are known conditions for growth and profit sustainability. Businesses use special treatments, inducements, and personal gestures to show their appreciation to customers. However, there are concerns about whether these inducements really create the right perceptions in customers' minds. This study suggests that when customers believe that the firm is ethical, the inducements and special treatments received are seen in a positive light and can help develop loyalty. The hypotheses were tested with responses from 299 customers of financial institutions in Chile. Results support the hypothesis that a firm's ethical reputation helps in retaining customers. Managerial implications are provided.
The popularity of films like Titanic betokens a massive shift in the nature of aesthetic spectatorship in our time. The contemplative, distanced viewer who is able to judge from afar the spectacle before him or her has been replaced by a more proximate, involved "kinaesthetic" subject whose body is stimulated as much as his or her eye. This is evident not only in mass culture with amusement thrill rides and the return of what has been called the "cinema of attractions"; this new spectator can also be discerned in avant-garde culture, as shown by the Sensation exhibition of Young British Artists which caused such a stir in London and New York. This spectator is especially attracted to simulacral scenes of destruction and catastrophe, in which he or she is virtually immersed. If aesthetic judgement is to be a model for its political counterpart, as has been argued by theorists like Lyotard and Arendt, it cannot do so on the basis of this aesthetics of violent immersion.
We discuss several features of coherent choice functions, where the admissible options in a decision problem are exactly those that maximize expected utility for some probability/utility pair in a fixed set S of probability/utility pairs. In this paper we consider, primarily, normal form decision problems under uncertainty, where only the probability component of S is indeterminate and utility for two privileged outcomes is determinate. Coherent choice distinguishes between each pair of sets of probabilities regardless of the "shape" or "connectedness" of the sets of probabilities. We axiomatize the theory of choice functions and show these axioms are necessary for coherence. The axioms are sufficient for coherence using a set of probability/almost-state-independent utility pairs. We give sufficient conditions under which a choice function satisfying our axioms is represented by a set of probability/state-independent utility pairs with a common utility.
We extend de Finetti's (1974) theory of coherence to apply also to unbounded random variables. We show that for random variables with mandated infinite prevision, such as for the St. Petersburg gamble, coherence precludes indifference between equivalent random quantities. That is, we demonstrate when the prevision of the difference between two such equivalent random variables must be positive. This result conflicts with the usual approach to theories of Subjective Expected Utility, where preference is defined over lotteries. In addition, we explore similar results for unbounded variables when their previsions, though finite, exceed their expected values, as is permitted within de Finetti's theory. In such cases, the decision maker's coherent preferences over random quantities are not even a function of probability and utility. One upshot of these findings is to explain further the differences between Savage's theory (1954), which requires bounded utility for non-simple acts, and de Finetti's theory, which does not. And it raises the question whether there is a theory that fits between these two.
Statistical decision theory, whether based on Bayesian principles or other concepts such as minimax or admissibility, relies on the idea of minimizing expected loss or maximizing expected utility. Loss and utility functions are generally treated as unitless numerical measures of how costly or valuable the various consequences of potential decisions are. In this paper, we address directly the issue of the units in which loss and utility are settled and the implications that those units have on the rankings of potential decisions. The simplest example is to imagine that the loss will be paid in units of some currency. If there are multiple currencies available for paying the loss, one must take explicit account of which currency is used as well as the exchange rates between the various available currencies.
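The dependence of rankings on the currency of settlement can be made concrete with a toy example of my own (not drawn from the paper): two sure losses, one in dollars and one in euros, rank differently depending on the currency in which expected loss is computed, because the exchange rate is random.

```python
# Two acts: A pays a sure loss of $10; B pays a sure loss of EUR 9.
# The dollar/euro rate is random: $0.50 or $2.00 per euro, each with prob 1/2.
rates = [0.5, 2.0]  # dollars per euro

loss_a_usd = 10.0
loss_b_eur = 9.0

# Expected losses when everything is converted to dollars:
exp_a_in_usd = loss_a_usd                                        # 10.0
exp_b_in_usd = sum(loss_b_eur * r for r in rates) / len(rates)   # 11.25

# Expected losses when everything is converted to euros:
exp_a_in_eur = sum(loss_a_usd / r for r in rates) / len(rates)   # 12.5
exp_b_in_eur = loss_b_eur                                        # 9.0

print(exp_a_in_usd < exp_b_in_usd)  # A preferred when settling in dollars
print(exp_b_in_eur < exp_a_in_eur)  # B preferred when settling in euros
```

The flip is possible because, by Jensen's inequality, E[1/R] can exceed 1/E[R]: a risk-free loss in one currency is a risky loss in the other.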
The "Dutch Book" argument, tracing back to Ramsey and to de Finetti, offers prudential grounds for action in conformity with personal probability. Under several structural assumptions about combinations of stakes (that is, assumptions about the combination of wagers), your betting policy is coherent only if your fair odds are probabilities. The central question posed here is the following one: Besides providing an operational test of coherent betting, does the "Book" argument also provide for adequate measurement (elicitation) of the agent's degrees of belief? That is, are an agent's fair odds also his/her personal probabilities for those events? We argue the answer is "No!" The problem is caused by the possibility of state-dependent utilities.
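The "Book" itself is easy to exhibit numerically. The following is a toy illustration of my own (not the paper's example): an agent whose fair prices for A and for not-A sum to more than 1 can be sold a combination of individually "fair" bets that loses in every state.

```python
# An agent's stated fair prices for $1 tickets on complementary events.
# Incoherent: the prices sum to 1.2, yet exactly one ticket can pay off.
fair_prices = {"A": 0.6, "not_A": 0.6}

def agent_net(prices, occurred):
    """Agent buys a $1 ticket on each event at the stated fair price;
    the ticket on the event that occurs pays $1, the other pays $0."""
    cost = sum(prices.values())
    payout = sum(1.0 for event in prices if event == occurred)
    return payout - cost

for outcome in ("A", "not_A"):
    print(outcome, round(agent_net(fair_prices, outcome), 2))  # -0.2 either way
```

In both states the agent is down $0.20: a sure loss, i.e. a Book. Coherence requires the prices for complementary events to sum to exactly 1.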
Sometimes conducting an experiment to ascertain the state of a system changes the state of the system being measured. Kahneman & Tversky modelled this effect with "support theory". Quantum physics models this effect with probability amplitude mechanics. As this paper shows, probability amplitude mechanics is similar to support theory. Additionally, Viscusi's proposed generalized expected utility model has an analogy in quantum mechanics.
Methodology for conducting clinical trials of new drugs and treatments on people need not be regarded as fixed. After reviewing the currently most popular method (randomization) and its ethical problems, this paper explores the possibilities of a new method for conducting such trials. It relies on new Bayesian technology for eliciting the opinions of medical experts. These opinions are conditioned on specific predictor variables, and are held in a computer. At any stage in a trial, these opinions can be updated in the computer using the information collected in the trial up to that point. Consider as an admissible treatment for a patient having specific values of predictor variables only those treatments that at least one expert regards as best (in the computer model) for this patient. It is proposed that only admissible treatments, so defined, be allowed to be assigned to the patient. The ethical and statistical consequences of this principle are explored. Experience to date with a trial at Johns Hopkins designed on this principle is reported. Keywords: Bayesian statistics, information, clinical trial
When can a Bayesian investigator select an hypothesis H and design an experiment (or a sequence of experiments) to make certain that, given the experimental outcome(s), the posterior probability of H will be lower than its prior probability? We report an elementary result which establishes sufficient conditions under which this reasoning to a foregone conclusion cannot occur. Through an example, we discuss how this result extends to the perspective of an onlooker who agrees with the investigator about the statistical model for the data but who holds a different prior probability for the statistical parameters of that model. We consider, specifically, one-sided and two-sided statistical hypotheses involving i.i.d. Normal data with conjugate priors. In a concluding section, using an "improper" prior, we illustrate how the preceding results depend upon the assumption that probability is countably additive.
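The no-foregone-conclusion phenomenon can be checked by simulation. The sketch below uses a setup of my own choosing (one Normal observation, a conjugate N(0,1) prior, and the one-sided hypothesis H: theta <= 0, not necessarily the paper's cases): by the martingale property of Bayesian updating, the posterior probability of H averages out to the prior, so no design can guarantee a posterior below the prior in every outcome.

```python
import math
import random

random.seed(0)

def normal_cdf(z):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Assumed setup: H: theta <= 0, prior theta ~ N(0, 1), one obs x ~ N(theta, 1).
# Conjugacy gives theta | x ~ N(x/2, 1/2), so
# P(H | x) = Phi((0 - x/2) / sqrt(1/2)).
prior_prob = 0.5
posteriors = []
for _ in range(20_000):
    theta = random.gauss(0.0, 1.0)   # draw a parameter from the prior
    x = random.gauss(theta, 1.0)     # draw the datum given that parameter
    posteriors.append(normal_cdf(-(x / 2.0) / math.sqrt(0.5)))

avg_posterior = sum(posteriors) / len(posteriors)
print(avg_posterior)  # close to the prior, 0.5
```

Individual outcomes can push P(H | x) below 0.5, but the average over outcomes equals the prior, which is exactly why a foregone conclusion is blocked under countable additivity.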
The degree of incoherence, when previsions are not made in accordance with a probability measure, is measured by either of two rates at which an incoherent bookie can be made a sure loser. Each bet is considered as an investment from the points of view of both the bookie and a gambler who takes the bet. From each viewpoint, we define an amount invested (or escrowed) for each bet, and the sure loss of incoherent previsions is divided by the escrow to determine the rate of incoherence. Potential applications include the treatment of arbitrage opportunities in financial markets and the degree of incoherence of classical statistical procedures. We illustrate the latter with the example of hypothesis testing at a fixed size.
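The loss-divided-by-escrow idea can be sketched in a few lines. The example below is mine and uses a deliberately simple choice of escrow (the gambler's total stake on $1 tickets for complementary events), which need not match the paper's exact definitions; it only illustrates the normalization.

```python
# Incoherent previsions for complementary events: they sum to 1.2, not 1.
prices = {"A": 0.7, "not_A": 0.5}

# The gambler buys a $1 ticket on each event; exactly one ticket pays $1.
sure_loss = sum(prices.values()) - 1.0  # guaranteed loss in every state: 0.2
escrow = sum(prices.values())           # total staked (one choice of escrow): 1.2

rate_of_incoherence = sure_loss / escrow
print(round(rate_of_incoherence, 4))  # 0.1667
```

Dividing by the escrow makes the index scale-free: doubling all stakes doubles the sure loss and the escrow alike, leaving the rate unchanged.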
The laissez-faire attitude towards dishonesty in research has simply created an environment for widespread escalation of the problem. Can we now believe anything we read? Why should we have confidence in an author because of his eminence? Should we automatically accept that clinical trials are always conducted with total integrity? Why have we been afraid to tackle this crisis head-on?
A donation paradox occurs when a player gives an apparently valuable prerogative to another player, but "does better", according to some criterion. Peremptory challenges, used in choosing an American jury, permit each side to veto a certain number of potential jurors. With even a very simple model of jury selection, it is shown that for one side to give a peremptory challenge to the other side may lead to a more favorable jury, an instance of the donation paradox. Both a theorem and examples are given concerning the existence of the donation paradox in the optimal use of peremptory challenges.
Taking on the stigma of inauthenticity : Adorno's critique of genuineness -- Is experience still in crisis? : reflections on a Frankfurt school lament -- Mourning a metaphor : the revolution is over -- Cultural relativism and the visual turn -- Scopic regimes of modernity revisited -- No state of grace : violence in the garden -- Visual parrhesia? : Foucault and the truth of the gaze -- The Kremlin of modernism -- Phenomenology and lived experience -- Aesthetic experience and historical experience : a twenty-first-century constellation -- Still waiting to hear from Derrida -- Pseudology : Derrida on Arendt and lying in politics -- The menace of consilience : keeping the disciplines unreconciled -- Can there be national philosophies in a transnational world? -- Straddling a watershed? -- Allons enfants de l'humanité : the French and human rights -- Intellectual family values : William Phillips, Hannah Arendt, and the Partisan Review -- Still sleeping rough : Colin Wilson's The Outsider at fifty.
Note: The Simpsons, television's popular prime-time cartoon known for its satirical commentary on various social issues, recently took a shot at the creation-evolution debate by featuring Stephen Jay Gould prominently in one of its episodes. Here is Bill Dembski's review and observations of that episode.
‘Experience is the best teacher’ goes the cliché, without ever making clear just what is meant by that slippery first term. ‘Experience is never remembered unaltered’ goes another. Is experience something to be undergone, like a journey, or is it perhaps the relational immediacy between organism and environment? What do we reference when we use the term experience? Martin Jay, renowned intellectual historian from UC Berkeley, here examines these questions in a grand survey of the term's use throughout the intellectual history of what was once called Western Civilization. Beginning with the ancient Greeks (of course), he reviews the surprising number of variations employed and assumed by philosophers, theologians, critical theorists, right up to the poststructuralists. Jay knows his territory, and reading this survey of it — for anyone with any sort of background in the history of philosophy — is often as pleasant as hearing a familiar symphony well-played in a unique way.
Stephen Jay Gould's monumental The Structure of Evolutionary Theory "attempts to expand and alter the premises of Darwinism, in order to build an enlarged and distinctive evolutionary theory . . . while remaining within the tradition, and under the logic, of Darwinian argument." The three branches or "fundamental principles of Darwinian logic" are, according to Gould: agency (natural selection acting on individual organisms), efficacy (producing new species adapted to their environments), and scope (accumulation of changes that through geological time yield the living world's panoply of diversity and morphological complexity). Gould's efforts to contribute something important to each of these three fundamental components of Darwinian theory are far from successful.
Stephen Jay Gould's Wonderful Life: The Burgess Shale and the Nature of History has become something of a watershed for those who study contingency and complexity, especially applied to organisms, societies, and history, and discussions of it can be found in many works. Walter Fontana and Leo Buss, for example, ask in the title of their chapter "What Would Be Conserved If 'The Tape Were Played Twice'?" This is a direct reference to Gould's suggestion in Wonderful Life that if the tape of life were rewound to the time of the organisms found in the Canadian outcrop known as the Burgess Shale, dated to about 530 million years ago, and replayed with a few contingencies tweaked here and there, humans would most likely never have evolved.
In response to Jay Gallagher's criticism, I emphasize that my article "The Dilemma Faced by Chinese Feminists" (2000) is aimed at showing how both the level of economic development and sexual difference are relevant to the realization of sexual equality. It is a much more serious theoretical attempt than to argue that men have a physical advantage in a society where heavy labor is still in great demand.
P.F. Strawson's work on moral responsibility is well-known. However, an important implication of the landmark "Freedom and Resentment" has gone unnoticed. Specifically, a natural development of Strawson's position is that we should understand being morally responsible as having externalistically construed pragmatic criteria, not individualistically construed psychological ones. This runs counter to the contemporary ways of studying moral responsibility. I show the deficiencies of such contemporary work in relation to Strawson by critically examining the positions of John Martin Fischer and Mark Ravizza, R. Jay Wallace, and Philip Pettit for problems due to individualistic assumptions.
I begin by warmly thanking Professors Garfield and Hansen for participating in this dialogue. I greatly value the work of both and appreciate having the opportunity to engage in a dialogue with them. Aside from the many important insights I gain from their replies, I believe that both Garfield and Hansen misrepresent my position. In response, I shall clarify the argument contained in my preceding comment, and will consider the objections as they bear on this clarified position. Both Garfield and Hansen characterize the central argument of my comment as presupposing a relatively mainstream Western account of action. They suggest that, with a mainstream Western account in hand, I challenge Classical Chinese and Indo ..
A familiar feature of our moral responsibility practices is pleas: considerations, such as "That was an accident", or "I didn't know what else to do", that attempt to get agents accused of wrongdoing off the hook. But why do these pleas have the normative force they do in fact have? Why does physical constraint excuse one from responsibility, while forgetfulness or laziness does not? I begin by laying out R. Jay Wallace's (Responsibility and the Moral Sentiments, 1994) theory of the normative force of excuses and exemptions. For each category of plea, Wallace offers a single governing moral principle that explains their normative force. The principle he identifies as governing excuses is the Principle of No Blameworthiness without Fault: an agent is blameworthy only if he has done something wrong. The principle he identifies as governing exemptions is the Principle of Reasonableness: an agent is morally accountable only if he is normatively competent. I argue that Wallace's theory of exemptions is sound, but that his account of the normative force of excuses is problematic, in that it fails to explain the full range of excuses we offer in our practices, especially the excuses of addiction and extreme stress. I then develop a novel account of the normative force of excuses, which employs what I call the "Principle of Reasonable Opportunity," that can explain the full range of excuses we offer and that is deeply unified with Wallace's theory of the normative force of exemptions. An important implication of the theory I develop is that moral responsibility requires free will.
Stephen Jay Gould argued that replaying the "tape of life" would result in a radically different evolutionary outcome. Some biologists and philosophers, however, have pointed to convergent evolution as evidence for robust replicability in macroevolution. These authors interpret homoplasy, or the independent origination of similar biological forms, as evidence for the power of natural selection to guide form toward certain morphological attractors, notwithstanding the diversionary tendencies of drift and the constraints of phylogenetic inertia. In this paper, I consider the implications of homoplasy for the debate over the nature of macroevolution. I argue that once the concepts of contingency and convergence are fleshed out, it becomes clear that many instances of homoplasy fail to negate Gould's overarching thesis, and may in fact support a Gouldian view of life. My argument rests on the distinction between parallelism and convergence, which I defend against a recent challenge from developmental biology. I conclude that despite the difficulties in defining and identifying parallelism, the concept remains useful and relevant to the contingency controversy insofar as it underscores the common developmental origins of iterated evolution.
I outline Gould's conception of evolutionary theory and his ways of contrasting it with contemporary Darwinism, a Darwinism that focuses on the natural selection of individual organisms. Gould argues for a hierarchical conception of the living world and of the evolutionary processes that have built that living world: organisms are built from smaller components (genes, cells) and are themselves components of groups, populations, species, lineages. Selection, drift and constraint are important to all of these levels of biological organization, not just that of individual organisms. Moreover, both drift and constraint are more important than orthodoxy supposes. While having some sympathy for both of these lines of argument, I argue that they are more problematic than Gould supposes, and that he understates the power and the heterogeneity of orthodox conceptions of life's evolution.
This pleasantly written book has two related themes. The first is a statistical argument which Gould believes has great generality, uniting baseball, a moving personal response to the serious illness from which, thankfully, the author has now recovered, and his second theme: that of whether evolution is progressive.
Early in December of 1981, the federal courtroom in Little Rock, Arkansas, was packed. It was the first week of a trial brought on by the American Civil Liberties Union to challenge the constitutionality of a state law passed earlier that year. The law mandated "balanced treatment," in the publicly supported schools, between evolutionary ideas and so-called Creation Science, better known as the early chapters of Genesis taken absolutely literally (Ruse 1988). By the end of the third day, the case for the plaintiffs was going well. Theologians had testified that Christianity had long interpreted the Bible metaphorically; a philosopher (me!) had argued that Creation Science fails every criterion of demarcation between science and pseudo-science; and the scientists were pointing to error after error in the claims of the literalists.
In the pioneering days of radio, my grandfather's job was to lecture to young engineers who were joining Marconi's company. To illustrate that any complex wave form can be broken down into summed simple waves of different frequencies (important in both radio and acoustics), he took wheels of different diameters and attached them with pistons to a clothesline. When the wheels went round, the clothesline was jerked up and down, causing waves of movement to snake along it. The wriggling clothesline was a model of a radio wave, giving the students a more vivid picture of wave summation than mathematical equations could ever have done.
Theories postulating saltational evolution are a necessary consequence of essentialism. If one believes in constant types, only the sudden production of a new type can lead to evolutionary change. That such saltations can occur and indeed that their occurrence is a necessity is an old belief. Almost all of the theories of evolution described by H. F. Osborn (1894) in his From the Greeks to Darwin were saltational theories, that is, theories of the sudden origin of new kinds. The Darwinian revolution (Darwin, 1859) did not end this tradition, which continued to flourish in the writings of Thomas H. Huxley, William Bateson, Hugo De Vries, J. C. Willis, Richard Goldschmidt, and Otto Schindewolf. Traces of this idea can even be found in the writings of some of the punctuationists.
A little more than a year ago, I participated in a meeting, organized by the National Academy of Sciences, on the subject of enhancing public understanding of science by encouraging greater collaboration between scientists and the media. Most of the scientists present were members of the academy, which serves both as an elected honor society and as an official adviser on science policy to the U.S. government. Across the room I spotted a slim man who seemed somehow familiar. His deliberate movements suggested an inner passion concealed beneath a subdued exterior. When I came close enough to read his name tag, I saw that he was the famous astronomer Carl Sagan, whom I had corresponded with but never met.