Part 1: Background on de Finetti's twin criteria of coherence. Coherence1: two-sided previsions free from dominance through a Book. Coherence2: forecasts free from dominance under the Brier (squared-error) score. Part 2: IP theory based on a scoring rule.
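The Brier-score criterion (Coherence2) can be illustrated with a toy sketch; the forecasts and numbers below are my own, not from the text. An incoherent forecast for an event and its complement is dominated, in every state, by its projection onto the probability simplex.

```python
# Hedged sketch (assumed example): an incoherent forecast for {A, not-A}
# is dominated under Brier (squared-error) score by its projection onto
# the probability simplex {p + q = 1}.

def brier(forecast, outcome):
    """Sum of squared errors between a forecast and a 0/1 outcome vector."""
    return sum((f - o) ** 2 for f, o in zip(forecast, outcome))

incoherent = (0.6, 0.6)           # announced probabilities sum to 1.2
coherent = (0.5, 0.5)             # projection of (0.6, 0.6) onto the simplex

for outcome in [(1, 0), (0, 1)]:  # state where A is true; state where A is false
    assert brier(coherent, outcome) < brier(incoherent, outcome)
```

In both states the coherent forecast scores 0.5 against the incoherent forecast's 0.52, so the dominance is uniform.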
Force Fields collects the recent essays of Martin Jay, an intellectual historian and cultural critic internationally known for his extensive work on the history of Western Marxism and the intellectual migration from Germany to America.
Experimenters sometimes insist that it is unwise to examine data before determining how to analyze them, as it creates the potential for biased results. I explore the rationale behind this methodological guideline from the standpoint of an error statistical theory of evidence, and I discuss a method of evaluating evidence in some contexts when this predesignation rule has been violated. I illustrate the problem of potential bias, and the method by which it may be addressed, with an example from the search for the top quark. A point in favor of the error statistical theory is its ability, demonstrated here, to explicate such methodological problems and suggest solutions, within the framework of an objective theory of evidence.
When real-valued utilities for outcomes are bounded, or when all variables are simple, it is consistent with expected utility to have preferences defined over probability distributions or lotteries. That is, under such circumstances two variables with a common probability distribution over outcomes – equivalent variables – occupy the same place in a preference ordering. However, if strict preference respects uniform, strict dominance in outcomes between variables, and if indifference between two variables entails indifference between their difference and the status quo, then preferences over rich sets of unbounded variables, such as variables used in the St. Petersburg paradox, cannot preserve indifference between all pairs of equivalent variables. In such circumstances, preference is not a function only of probability and utility for outcomes, and the preference ordering cannot be defined in terms of lotteries.
Customer orientation (CO) and the development of long-term relationships with customers are known conditions for growth and profit sustainability. Businesses use special treatments, inducements, and personal gestures to show their appreciation to customers. However, there are concerns about whether these inducements really create the right perceptions in customers' minds. This study suggests that when customers believe that the firm is ethical, the inducements and special treatments received are seen in a positive light and can help develop loyalty. The hypotheses were tested with responses from 299 customers of financial institutions in Chile. Results support the hypothesis that a firm's ethical reputation helps in retaining customers. Managerial implications are provided.
The popularity of films like Titanic betokens a massive shift in the nature of aesthetic spectatorship in our time. The contemplative, distanced viewer who is able to judge from afar the spectacle before him or her has been replaced by a more proximate, involved "kinaesthetic" subject whose body is stimulated as much as his or her eye. This is evident not only in mass culture, with amusement thrill rides and the return of what has been called the "cinema of attractions"; this new spectator can also be discerned in avant-garde culture, as shown by the Sensation exhibition of Young British Artists which caused such a stir in London and New York. This spectator is especially attracted to simulacral scenes of destruction and catastrophe, in which he or she is virtually immersed. If aesthetic judgement is to be a model for its political counterpart, as has been argued by theorists like Lyotard and Arendt, it cannot do so on the basis of this aesthetics of violent immersion.
We discuss several features of coherent choice functions, where the admissible options in a decision problem are exactly those that maximize expected utility for some probability/utility pair in a fixed set S of probability/utility pairs. In this paper we consider, primarily, normal form decision problems under uncertainty, where only the probability component of S is indeterminate and utility for two privileged outcomes is determinate. Coherent choice distinguishes between each pair of sets of probabilities regardless of the "shape" or "connectedness" of the sets of probabilities. We axiomatize the theory of choice functions and show these axioms are necessary for coherence. The axioms are sufficient for coherence using a set of probability/almost-state-independent utility pairs. We give sufficient conditions when a choice function satisfying our axioms is represented by a set of probability/state-independent utility pairs with a common utility.
We extend de Finetti's (1974) theory of coherence to apply also to unbounded random variables. We show that for random variables with mandated infinite prevision, such as for the St. Petersburg gamble, coherence precludes indifference between equivalent random quantities. That is, we demonstrate when the prevision of the difference between two such equivalent random variables must be positive. This result conflicts with the usual approach to theories of Subjective Expected Utility, where preference is defined over lotteries. In addition, we explore similar results for unbounded variables when their previsions, though finite, exceed their expected values, as is permitted within de Finetti's theory. In such cases, the decision maker's coherent preferences over random quantities are not even a function of probability and utility. One upshot of these findings is to explain further the differences between Savage's theory (1954), which requires bounded utility for non-simple acts, and de Finetti's theory, which does not. And it raises the question whether there is a theory that fits between these two.
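Why the St. Petersburg gamble mandates an infinite prevision can be sketched directly from its standard payoffs (2^k with probability 2^-k); this illustration is my own, not code from the paper. Every truncation of the expectation adds one unit per term, so the partial expectations grow without bound.

```python
# Sketch (my own illustration): the St. Petersburg gamble pays 2^k with
# probability 2^-k, so each term of the expectation contributes exactly 1
# and the series diverges -- the mandated prevision is infinite.

def partial_expectation(n_terms):
    """Expected payoff from the first n_terms outcomes of the gamble."""
    return sum((2 ** k) * (2 ** -k) for k in range(1, n_terms + 1))

assert partial_expectation(10) == 10.0
assert partial_expectation(100) == 100.0   # grows linearly in the cutoff
```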
The "Dutch Book" argument, tracing back to Ramsey and to de Finetti, offers prudential grounds for action in conformity with personal probability. Under several structural assumptions about combinations of stakes (that is, assumptions about the combination of wagers), your betting policy is coherent only if your fair odds are probabilities. The central question posed here is the following one: Besides providing an operational test of coherent betting, does the "Book" argument also provide for adequate measurement (elicitation) of the agent's degrees of belief? That is, are an agent's fair odds also his/her personal probabilities for those events? We argue the answer is "No!" The problem is caused by the possibility of state-dependent utilities.
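The Book construction itself can be sketched concretely; the prices below are hypothetical, chosen only for illustration. When a bookie's fair odds for an event and its complement sum to less than one, a gambler who buys both tickets guarantees the bookie a loss in every state.

```python
# Hedged sketch (assumed prices): a bookie whose fair odds for A and
# not-A are subadditive can be made a sure loser.

prices = {"A": 0.3, "not_A": 0.5}    # fair odds sum to 0.8 < 1: incoherent

# The gambler buys a $1-payout ticket on each event at the bookie's prices.
cost = sum(prices.values())          # 0.8 paid up front
for state in ("A", "not_A"):
    payout = 1.0                     # exactly one ticket wins in each state
    assert payout - cost > 0         # sure gain for the gambler,
                                     # hence sure loss for the bookie
```

The gambler nets about 0.2 no matter which state obtains, which is the sense in which the subadditive odds are dominated.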
Methodology for conducting clinical trials of new drugs and treatments on people need not be regarded as fixed. After reviewing the currently most popular method (randomization) and its ethical problems, this paper explores the possibilities of a new method for conducting such trials. It relies on new Bayesian technology for eliciting the opinions of medical experts. These opinions are conditioned on specific predictor variables, and are held in a computer. At any stage in a trial, these opinions can be updated in the computer using the information collected in the trial up to that point. Consider as an admissible treatment for a patient having specific values of predictor variables only those treatments that at least one expert regards as best (in the computer model) for this patient. It is proposed that only admissible treatments, so defined, be allowed to be assigned to the patient. The ethical and statistical consequences of this principle are explored. Experience to date with a trial at Johns Hopkins designed on this principle is reported. Keywords: Bayesian statistics, information, clinical trial.
Statistical decision theory, whether based on Bayesian principles or other concepts such as minimax or admissibility, relies on minimizing expected loss or maximizing expected utility. Loss and utility functions are generally treated as unit-less numerical measures of value for consequences. Here, we address the issue of the units in which loss and utility are settled and the implications that those units have on the rankings of potential decisions. When multiple currencies are available for paying the loss, one must take explicit account of which currency is used as well as the exchange rates between the various available currencies.
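A minimal two-state sketch (the numbers, acts, and currencies are hypothetical, not from the paper) shows how a state-dependent exchange rate can reverse the ranking of two acts by expected loss:

```python
# Hypothetical example: with a state-dependent exchange rate, the act
# that minimizes expected loss in dollars differs from the act that
# minimizes expected loss in euros.

p = [0.5, 0.5]          # probabilities of states s1, s2
rate = [1.0, 3.0]       # euros per dollar in each state (state-dependent)
loss_A = [4.0, 0.0]     # dollar losses of act A, by state
loss_B = [1.0, 2.0]     # dollar losses of act B, by state

def expected(losses, weights=None):
    """Expected loss, optionally converted state-by-state via weights."""
    w = weights or [1.0, 1.0]
    return sum(pi * li * wi for pi, li, wi in zip(p, losses, w))

assert expected(loss_B) < expected(loss_A)              # settled in dollars: B wins
assert expected(loss_A, rate) < expected(loss_B, rate)  # settled in euros: A wins
```

In dollars the expected losses are 1.5 versus 2.0; converted to euros they become 3.5 versus 2.0, so the preferred act depends on the settlement currency.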
Sometimes conducting an experiment to ascertain the state of a system changes the state of the system being measured. Kahneman & Tversky modelled this effect with "support theory". Quantum physics models this effect with probability amplitude mechanics. As this paper shows, probability amplitude mechanics is similar to support theory. Additionally, Viscusi's proposed generalized expected utility model has an analogy in quantum mechanics.
When can a Bayesian investigator select an hypothesis H and design an experiment (or a sequence of experiments) to make certain that, given the experimental outcome(s), the posterior probability of H will be lower than its prior probability? We report an elementary result which establishes sufficient conditions under which this reasoning to a foregone conclusion cannot occur. Through an example, we discuss how this result extends to the perspective of an onlooker who agrees with the investigator about the statistical model for the data but who holds a different prior probability for the statistical parameters of that model. We consider, specifically, one-sided and two-sided statistical hypotheses involving i.i.d. Normal data with conjugate priors. In a concluding section, using an "improper" prior, we illustrate how the preceding results depend upon the assumption that probability is countably additive.
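The "no foregone conclusions" phenomenon can be illustrated in a conjugate example of my own choosing (a Beta-Bernoulli setup, simpler than the Normal models the paper analyzes): under a countably additive prior, the expectation of the posterior probability of H equals its prior probability, so no experiment can make the posterior fall with certainty.

```python
# Hedged sketch (assumed Beta-Bernoulli example): theta ~ Beta(1,1),
# H = "theta > 1/2", one Bernoulli(theta) observation. The posterior
# probability of H averages back to the prior over the predictive
# distribution, so it cannot be lower under every outcome.

prior_H = 0.5            # P(theta > 1/2) under the uniform Beta(1,1) prior
pred_heads = 0.5         # predictive probability of observing heads
post_H_heads = 0.75      # P(theta > 1/2 | heads): Beta(2,1) has density 2t
post_H_tails = 0.25      # P(theta > 1/2 | tails): Beta(1,2) has density 2(1-t)

expected_posterior = pred_heads * post_H_heads + (1 - pred_heads) * post_H_tails
assert expected_posterior == prior_H   # martingale property of the posterior
assert post_H_heads > prior_H > post_H_tails  # some outcome must raise P(H)
```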
The degree of incoherence, when previsions are not made in accordance with a probability measure, is measured by either of two rates at which an incoherent bookie can be made a sure loser. Each bet is considered as an investment from the points of view of both the bookie and a gambler who takes the bet. From each viewpoint, we define an amount invested (or escrowed) for each bet, and the sure loss of incoherent previsions is divided by the escrow to determine the rate of incoherence. Potential applications include the treatment of arbitrage opportunities in financial markets and the degree of incoherence of classical statistical procedures. We illustrate the latter with the example of hypothesis testing at a fixed size.
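As an illustration of the idea only, with a crude assumed definition of escrow as the gambler's total stake (the paper itself develops refined, viewpoint-dependent escrows), one such rate can be computed for a simple subadditive book:

```python
# Illustrative sketch (assumed escrow definition): divide the bookie's
# guaranteed loss by the amount escrowed, here taken naively as the
# gambler's total stake.

prices = {"A": 0.3, "not_A": 0.5}        # incoherent: previsions sum to 0.8
sure_loss = 1.0 - sum(prices.values())   # bookie pays out 1 in every state
escrow = sum(prices.values())            # gambler's total stake (crude choice)
rate_of_incoherence = sure_loss / escrow

assert abs(rate_of_incoherence - 0.25) < 1e-9
```

Normalizing by the escrow is what makes the measure a rate: doubling all stakes doubles the sure loss but leaves the degree of incoherence unchanged.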
The laissez-faire attitude towards dishonesty in research has simply created an environment for widespread escalation of the problem. Can we now believe anything we read? Why should we have confidence in an author because of his eminence? Should we automatically accept that clinical trials are always conducted with total integrity? Why have we been afraid to tackle this crisis head-on?
In this paper I do three things. Firstly, I defend the view that in his most familiar arguments about morality and the theological postulates, the arguments which appeal to the epistemological doctrines of the first Critique, Kant is as much of a fictionalist as anybody not working explicitly with that conceptual apparatus could be: his notion of faith as subjectively and not objectively grounded is precisely what fictionalists are concerned with in their talk of nondoxastic attitudes. Secondly, I reconstruct a logically distinct argument to a fictionalist conclusion which I argue Kant also gives us, this time an argument to the conclusion that it is a good thing if our commitment to the existence of God is nondoxastic. And finally, I argue that this argument is of continuing interest, to Kantians and non-Kantians alike, not only because it raises interesting questions about the relation of morality to belief in God (which go in the opposite direction to most discussions, which focus on whether and to what extent belief in God can be an aid to morality), but also because this ‘Moral Hazard Argument’ seems to be available to religious realists and non-realists alike, thus suggesting that religious fictionalism is not by any means just an interesting version of religious non-realism.
A donation paradox occurs when a player gives an apparently valuable prerogative to another player, but "does better", according to some criterion. Peremptory challenges, used in choosing an American jury, permit each side to veto a certain number of potential jurors. With even a very simple model of jury selection, it is shown that for one side to give a peremptory challenge to the other side may lead to a more favorable jury, an instance of the donation paradox. Both a theorem and examples are given concerning the existence of the donation paradox in the optimal use of peremptory challenges.
Traditional combinatory logic uses combinators S and K to represent all Turing-computable functions on natural numbers, but there are Turing-computable functions on the combinators themselves that cannot be so represented, because they access internal structure in ways that S and K cannot. Much of this expressive power is captured by adding a factorisation combinator F. The resulting SF-calculus is structure complete, in that it supports all pattern-matching functions whose patterns are in normal form, including a function that decides structural equality of arbitrary normal forms. A general characterisation of the structure complete, confluent combinatory calculi is given along with some examples. These are able to represent all their Turing-computable functions whose domain is limited to normal forms. The combinator F can be typed using an existential type to represent internal type information.
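The classical S and K rules mentioned above (K x y → x, and S x y z → x z (y z)) can be sketched in a few lines; the tuple encoding below is my own illustration, not the paper's SF-calculus, and it shows S K K behaving as the identity.

```python
# Minimal sketch of untyped S/K reduction (standard rules, not code from
# the paper). Applications are encoded as 2-tuples (function, argument).

def app(f, *xs):
    """Left-associated application: app(f, a, b) encodes ((f a) b)."""
    for x in xs:
        f = (f, x)
    return f

def spine(t):
    """Split a term into its head combinator and its argument list."""
    args = []
    while isinstance(t, tuple):
        t, a = t
        args.append(a)
    return t, args[::-1]

def reduce_head(t):
    """Contract head redexes (K x y -> x; S x y z -> x z (y z)) until none remains."""
    while True:
        head, args = spine(t)
        if head == 'K' and len(args) >= 2:
            t = app(args[0], *args[2:])
        elif head == 'S' and len(args) >= 3:
            x, y, z = args[:3]
            t = app(app(x, z), (y, z), *args[3:])
        else:
            return t

identity = app('S', 'K', 'K')                    # S K K
assert reduce_head(app(identity, 'v')) == 'v'    # S K K v ->> v
```

The point of the abstract is precisely that a reducer like this can inspect the tuple structure of its argument, whereas S and K terms themselves cannot; the factorisation combinator F restores that power inside the calculus.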
Taking on the stigma of inauthenticity : Adorno's critique of genuineness -- Is experience still in crisis? : reflections on a Frankfurt school lament -- Mourning a metaphor : the revolution is over -- Cultural relativism and the visual turn -- Scopic regimes of modernity revisited -- No state of grace : violence in the garden -- Visual parrhesia? : Foucault and the truth of the gaze -- The Kremlin of modernism -- Phenomenology and lived experience -- Aesthetic experience and historical experience : a twenty-first-century constellation -- Still waiting to hear from Derrida -- Pseudology : Derrida on Arendt and lying in politics -- The menace of consilience : keeping the disciplines unreconciled -- Can there be national philosophies in a transnational world? -- Straddling a watershed? -- Allons enfants de l'humanité : the French and human rights -- Intellectual family values : William Phillips, Hannah Arendt, and the Partisan Review -- Still sleeping rough : Colin Wilson's The Outsider at fifty.
It is customary to begin essays of this kind with an arresting quotation from an eminent source, a practice that both displays the author's ostensible erudition and covertly betrays his need to draw on an external authority to support the argument he is about to make. In order to remain true to this time-honored convention, I have chosen as my opening text for today the following passage from Theodor Adorno's Negative Dialectics, written in 1966: “All culture after Auschwitz, including its urgent critique, is garbage. In restoring itself after the things that happened without resistance in its own countryside, culture has turned entirely into the ideology it had been potentially — had been ever since it presumed, in opposition to material existence, to inspire that existence with the light denied it by the separation of the mind from manual labor.”
Ever since Clifford Geertz urged the “blurring of genres” in the social sciences, many scholars have considered the crossing of disciplinary boundaries a healthy alternative to rigidly maintaining them. But what precisely does the metaphor of “blurring” imply? By unpacking the varieties of visual experiences that are normally grouped under this rubric, this essay seeks to provide some precision to our understanding of the implications of fuzziness. It extrapolates from the blurring caused by differential focal distances, velocities of objects in the visual field, and competing perspectival vantage points to comparable effects in the intersection of different scholarly disciplines. Arguing against the holistic implications of Geertz's metaphor, as well as the even more totalizing concept of “consilience” introduced by E. O. Wilson, it suggests that blurring implies new types of complexity between or among those disciplines.