In this paper we present a continuous extension of the recently proposed Covariate Balancing Propensity Score (CBPS) methodology to longitudinal analysis settings. While extensions of the CBPS methodology to both marginal structural models and general treatment regimes have been proposed, these extensions have been kept separate. We propose to bring them together using the generalized method of moments to estimate inverse probability weights such that, after weighting, the association between time-varying covariates and the treatment is minimized. A simulation analysis confirms the correlation-breaking performance of the proposed technique. As an empirical application we examine the impact that the gradual roll-out of Seguro Popular, a universal health insurance program, has had on the resources available for the provision of healthcare services in Mexico.
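To make the weighting idea concrete, here is a minimal sketch in Python. It is not the authors' CBPS/GMM estimator: it uses a single cross-sectional binary treatment, a plain logistic propensity model, and simulated data (all illustrative assumptions), and it simply checks that the weighted covariate-treatment association is smaller than the unweighted one.

# Illustrative only: stabilized inverse probability weights from a logistic
# propensity model, and a check that weighting weakens the covariate-treatment
# association. This is a simplified stand-in for the CBPS/GMM approach above.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000
x = rng.normal(size=n)                            # a confounding covariate
p_true = 1.0 / (1.0 + np.exp(-0.8 * x))           # true propensity depends on x
t = rng.binomial(1, p_true)                       # binary treatment assignment

# Estimated propensity scores and stabilized weights.
e_hat = LogisticRegression().fit(x.reshape(-1, 1), t).predict_proba(x.reshape(-1, 1))[:, 1]
p_marg = t.mean()
w = np.where(t == 1, p_marg / e_hat, (1.0 - p_marg) / (1.0 - e_hat))

def weighted_corr(a, b, weights):
    a_c = a - np.average(a, weights=weights)
    b_c = b - np.average(b, weights=weights)
    cov = np.average(a_c * b_c, weights=weights)
    return cov / np.sqrt(np.average(a_c**2, weights=weights) * np.average(b_c**2, weights=weights))

print("covariate-treatment correlation, unweighted:", round(float(np.corrcoef(x, t)[0, 1]), 3))
print("covariate-treatment correlation, weighted:  ", round(float(weighted_corr(x, t, w)), 3))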
We propose a discrete time branching process to model the appearance of drug resistance under treatment. Under our assumptions, at every discrete time step a pathogen either dies with probability 1−p or divides in two with probability p. Each newborn pathogen is drug resistant with probability μ. We start with N drug-sensitive pathogens and no drug-resistant pathogens. We declare the treatment successful if all pathogens are eradicated before drug resistance appears. The model predicts that success is possible only if p<1/2. Even in this case, the probability of success decreases exponentially with the parameter m=μN. In particular, even with a very potent drug (i.e. p very small) drug resistance is likely if m is large.
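A minimal Monte Carlo sketch of the stated model, assuming that a dividing pathogen is replaced by its two newborn offspring and that a run counts as a failure as soon as any resistant newborn appears. The parameter values below are arbitrary and only illustrate how success becomes unlikely as m = μN grows.

# Illustrative simulation of the discrete-time branching process above.
import numpy as np

def success_probability(N, p, mu, n_runs=20000, seed=0):
    """Estimate P(all pathogens eradicated before any resistant one appears)."""
    rng = np.random.default_rng(seed)
    successes = 0
    for _ in range(n_runs):
        sensitive = N
        failed = False
        while sensitive > 0:
            dividing = rng.binomial(sensitive, p)   # each pathogen divides w.p. p, else dies
            newborns = 2 * dividing                 # each division yields two newborns
            if rng.binomial(newborns, mu) > 0:      # any resistant newborn => failure
                failed = True
                break
            sensitive = newborns
        if not failed:
            successes += 1
    return successes / n_runs

# p < 1/2 (subcritical), so eradication is certain absent resistance;
# the success probability still falls as m = mu * N increases.
for N in (10, 100, 1000):
    print(f"N={N:5d}, m={1e-3 * N:.3f}: success ~ {success_probability(N, p=0.3, mu=1e-3):.3f}")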
Women are commonly stereotyped as more risk averse than men in financial decision making. In this paper we examine whether this stereotype reflects gender differences in actual risk-taking behavior by means of a laboratory experiment with monetary incentives. Gender differences in risk taking may be due to differences in valuations of outcomes or in probability weights. The results of our experiment indicate that value functions do not differ significantly between men and women. Men and women differ in their probability weighting schemes, however. In general, women tend to be less sensitive to probability changes. They also tend to underestimate large probabilities of gains more strongly than do men. This effect is particularly pronounced when the decisions are framed in investment terms. As a result, women appear to be more risk averse than men in specific circumstances.
This paper is devoted to a logical and algebraic treatment of conditional probability. The main ideas are the use of non-standard probabilities and of some kind of standard part function in order to deal with the case where the conditioning event has probability zero, and the use of a many-valued modal logic in order to treat the probability of an event φ as the truth value of the sentence "φ is probable", along the lines of Hájek's book [H98] and of [EGH96]. To this purpose, we introduce a probabilistic many-valued logic, called FP, which is sound and complete with respect to a class of structures having a non-standard extension [0,1]⋆ of [0,1] as their set of truth values. We also prove that the coherence of an assessment of conditional probabilities is equivalent to the coherence of a suitably defined theory over FP whose proper axioms reflect the assessment itself.
I examine Hume’s proposal about rationally considering testimonial evidence for miracles. He proposes that we compare the probability of the miracle (independently of the testimony) with the probability that the testimony is false, rejecting whichever has the lower probability. However, this superficially plausible proposal is massively ignored in our treatment of testimonial evidence in nonreligious contexts. I argue that it should be ignored, because in many cases, including the resurrection of Jesus, neither we nor Hume have any experience which is at all relevant to assigning a prior probability to the alleged event.
If people believe that some property is true of all members of a class such as sofas, then they should also believe that the same property is true of all members of a conjunctively defined subset of that class such as uncomfortable handmade sofas. A series of experiments demonstrated a failure to observe this constraint, leading to what is termed the inverse conjunction fallacy. Not only did people often express a belief in the more general statement but not in the more specific, but also, when they accepted both beliefs, they were inclined to give greater confidence to the more general. It is argued that this effect underlies a number of other demonstrations of fallacious reasoning, particularly in category-based induction. Alternative accounts of the phenomenon are evaluated, and it is concluded that the effect is best interpreted in terms of intensional reasoning [Tversky, A., & Kahneman, D. (1983). Extensional versus intuitive reasoning: The conjunction fallacy in probability judgment. Psychological Review, 90, 293–315.].
Two-stage voting is prone to majority inversions, a situation in which the outcome of an election is not backed by a majority of popular votes. We study the probability of majority inversion in a model with two candidates, three states and uniformly distributed fractions of supporters for each candidate. The model encompasses equal or distinct population sizes, with equal, population-based or arbitrary voting weights in the second stage. We prove that, when no state can dictate the outcome of the election by commanding a voting weight in excess of one half, the probability of majority inversion increases with the size disparity among the states.
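A minimal Monte Carlo sketch of the simplest case described above: two candidates, three equal-sized states with winner-take-all and equal second-stage weights, and each state's share of supporters for candidate A drawn uniformly on (0, 1). This reproduces only the special case; the paper's analysis also covers distinct populations and arbitrary weights.

# Illustrative estimate of the probability of majority inversion in two-stage
# voting with three equal-sized states and equal (winner-take-all) weights.
import numpy as np

rng = np.random.default_rng(0)
n_runs = 200_000
shares = rng.uniform(size=(n_runs, 3))         # A's vote share in each of the 3 states

popular_a = shares.mean(axis=1) > 0.5          # A wins the popular vote (equal populations)
electoral_a = (shares > 0.5).sum(axis=1) >= 2  # A carries a majority of the states

inversion_rate = np.mean(popular_a != electoral_a)
print("estimated probability of majority inversion:", round(float(inversion_rate), 4))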
We explore ways in which purely qualitative belief change in the AGM tradition throws light on options in the treatment of conditional probability. First, by helping us see why it can be useful to go beyond the ratio rule defining conditional probability from one-place probability. Second, by clarifying what is at stake in different ways of doing that. Third, by suggesting novel forms of conditional probability corresponding to familiar variants of qualitative belief change, and conversely. Likewise, we explain how recent work on the qualitative part of probabilistic inference leads to a very broad class of 'proto-probability' functions.
When a doctor tells you there’s a one percent chance that an operation will result in your death, or a scientist claims that his theory is probably true, what exactly does that mean? Understanding probability is clearly very important, if we are to make good theoretical and practical choices. In this engaging and highly accessible introduction to the philosophy of probability, Darrell Rowbottom takes the reader on a journey through all the major interpretations of probability, with reference to real-world situations. In lucid prose, he explores the many fallacies of probabilistic reasoning, such as the ‘gambler’s fallacy’ and the ‘inverse fallacy’, and shows how we can avoid falling into these traps by using the interpretations presented. He also illustrates the relevance of the interpretation of probability across disciplinary boundaries, by examining which interpretations of probability are appropriate in diverse areas such as quantum mechanics, game theory, and genetics. Using entertaining dialogues to draw out the key issues at stake, this unique book will appeal to students and scholars across philosophy, the social sciences, and the natural sciences.
A class of probability functions is studied. This class contains the probability functions of half-spin particles and spinning classical objects. A notion of realisability for these functions is defined. In terms of this notion two versions of Bell's theorem and their inverses are stated and proved.
Jonathan Cohen has claimed that in cases of witness agreement there is an inverse relationship between the prior probability and the posterior probability of what is being agreed: the posterior rises as the prior falls. As is demonstrated in this paper, this contention is not generally valid. In fact, in the most straightforward case exactly the opposite is true: a lower prior also means a lower posterior. This notwithstanding, there is a grain of truth to what Cohen is saying, as there are special circumstances under which a thesis similar to his holds good. What characterises these circumstances is that they allow for the fact of agreement to be surprising. In making this precise, I draw on Paul Horwich's probabilistic analysis of surprise. I also consider a related claim made by Cohen concerning the effect of lowering the prior on the strength of corroboration.
The problem of inferring probability comparisons between events from an initial set of comparisons arises in several contexts, ranging from decision theory to artificial intelligence to formal semantics. In this paper, we treat the problem as follows: beginning with a binary relation ≥ on events that does not preclude a probabilistic interpretation, in the sense that ≥ has extensions that are probabilistically representable, we characterize the extension ≥+ of ≥ that is exactly the intersection of all probabilistically representable extensions of ≥. This extension ≥+ gives us all the additional comparisons that we are entitled to infer from ≥, based on the assumption that there is some probability measure of which ≥ gives us partial qualitative information. We pay special attention to the problem of extending an order on states to an order on events. In addition to the probabilistic interpretation, this problem has a more general interpretation involving measurement of any additive quantity: e.g., given comparisons between the weights of individual objects, what comparisons between the weights of groups of objects can we infer?
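As a toy instance of the kind of inference licensed by the extension described above (our example, not the paper's), take four distinct states a, b, c, d:

\[
\{a\} \geq \{b\} \ \text{ and } \ \{c\} \geq \{d\}
\quad\Longrightarrow\quad
\{a, c\} \geq^{+} \{b, d\},
\]
since every probability measure $P$ representing the initial comparisons satisfies $P(\{a,c\}) = P(\{a\}) + P(\{c\}) \geq P(\{b\}) + P(\{d\}) = P(\{b,d\})$; read with weights of objects in place of probabilities, this is exactly the additive-measurement question raised at the end of the abstract.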
In Hailperin 1996, in addition to its formal development of Probability Logic, there are many sections devoted to historical origins, illustrative examples, and discussion of related work by other authors. Here selected portions of its formal treatment are summarized and then used as a basis for a probability logic treatment of combining evidence.
Sometimes different partitions of the same space each seem to divide that space into propositions that call for equal epistemic treatment. Famously, equal treatment in the form of equal point-valued credence leads to incoherence. Some have argued that equal treatment in the form of equal interval-valued credence solves the puzzle. This paper shows that, once we rule out intervals with extreme endpoints, this proposal also leads to incoherence.
Numerous studies have convincingly shown that prospect theory can better describe risky choice behavior than the classical expected utility model because it makes the plausible assumption that risk aversion is driven not only by the degree of sensitivity toward outcomes, but also by the degree of sensitivity toward probabilities. This article presents the results of an experiment aimed at testing whether agents become more sensitive toward probabilities over time when they repeatedly face similar decisions, receive feedback on the consequences of their decisions, and are given ample incentives to reflect on their decisions, as predicted by Plott’s Discovered Preference Hypothesis (DPH). The results of a laboratory experiment with N = 62 participants support this hypothesis. The elicited subjective probability weighting function converges significantly toward linearity when respondents are asked to make repeated choices and are given direct feedback after each choice. Such convergence to linearity is absent in an experimental treatment where respondents are asked to make repeated choices but do not experience the resolution of risk directly after each choice, as predicted by the DPH.
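For orientation, one commonly used one-parameter probability weighting function (Tversky and Kahneman's form; the abstract does not say which specification the study fits) is

\[
w(p) \;=\; \frac{p^{\gamma}}{\bigl(p^{\gamma} + (1-p)^{\gamma}\bigr)^{1/\gamma}}, \qquad 0 < \gamma \le 1,
\]
which reduces to linear weighting, $w(p) = p$, exactly when $\gamma = 1$; "convergence toward linearity" then corresponds to the estimated $\gamma$ drifting toward 1 as respondents gain experience and feedback.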
L. Jonathan Cohen has written a number of important books and articles in which he argues that mathematical probability provides a poor model of much of what paradigmatically passes for sound reasoning, whether this be in the sciences, in common discourse, or in the law. In his book, The Probable and the Provable, Cohen elaborates six paradoxes faced by advocates of mathematical probability (PM) when treating issues of evidence as they would arise in a court of law. He argues that his system of inductive probability (PI) satisfactorily handles the issues that proved paradoxical for mathematical probability, and consequently PI deserves to be thought of as an important standard of rational thinking. I argue that a careful look at each of the alleged paradoxes shows that there is no conflict between mathematical probability and the law, except when for reasons of policy we opt for values in addition to accuracy maximization. Recognizing the role of such policies provides no basis for questioning the adequacy of PM. The significance of this critical treatment of Cohen's work is that those interested in revising the laws of evidence to allow for more explicitly mathematical approaches ought to feel that such revisions will not violate the spirit of forensic rationality.
Interest in the Keynesian concept of evidential weight has led to divergent views concerning the burden of proof in adjudication. It is argued that Keynes's concept is properly engaged only in the context of one special kind of decision, the decision whether or not the evidence is ripe for a decision on the underlying merits, whether the latter decision is based on probability, relative plausibility, coherence or otherwise. As a general matter, this question of ripeness is appropriately assigned to the judiciary for resolution as part of the burden of production, rather than to the jury or other factfinder as part of the burden of persuasion.
The aim of this paper is to distinguish between, and examine, three issues surrounding Humphreys's paradox and interpretation of conditional propensities. The first issue involves the controversy over the interpretation of inverse conditional propensities — conditional propensities in which the conditioned event occurs before the conditioning event. The second issue is the consistency of the dispositional nature of the propensity interpretation and the inversion theorems of the probability calculus, where an inversion theorem is any theorem of probability that makes explicit (or implicit) appeal to a conditional probability and its corresponding inverse conditional probability. The third issue concerns the relationship between the notion of stochastic independence which is supported by the propensity interpretation, and various notions of causal independence. In examining each of these issues, it is argued that the dispositional character of the propensity interpretation provides a consistent and useful interpretation of the probability calculus.
The project of constructing a logic of scientific inference on the basis of mathematical probability theory was first undertaken in a systematic way by the mid-nineteenth-century British logicians Augustus De Morgan, George Boole and William Stanley Jevons. This paper sketches the origins and motivation of that effort, the emergence of the inverse probability (IP) model of theory assessment, and the vicissitudes which that model suffered at the hands of its critics. Particular emphasis is given to the influence which competing interpretations of probability had on the project, and to the role of the 'lottery' or 'ballot box' metaphor in the philosophical imagination of the proponents of the IP model.
Appropriate for upper-level undergraduates and graduate students, this volume includes a variety of Boole's writings on logical subjects, along with papers on related questions of probability. His earlier work, The Mathematical Analysis of Logic, appears here, together with an account of the notes Boole made on his own interleaved copy. In addition, the appendices contain relevant papers by contemporaries with whom the author engaged in discussion, making it possible to trace interesting developments in Boolean reasoning, particularly in regard to his extended treatment of the relation between formal logic and the theory of probabilities. 1952 ed.
The logical treatment of the nature of religious belief (here I will concentrate on belief in Christianity) has been distorted by the acceptance of a false dilemma. On the one hand, many (e.g., Braithwaite, Hare) have placed the significance of religious belief entirely outside the realm of intellectual cognition. According to this view, religious statements do not express factual propositions: they are not made true or false by the way things are. Religious belief consists in a certain attitude toward the world, life, or other human beings, or in what sorts of things one values. On the other hand, others (such as Swinburne, 1981, Chapters 1 and 4) have taken religious belief to include (at least) being certain of the truth of particular factual religious propositions. The strength of a person's religious belief is identified with his degree of confidence in the truth of those propositions, measured by the "subjective probability" which those propositions have for that person. I propose a third alternative, according to which, (1) contrary to the first view, religious belief does involve a relation to factual religious propositions, such as that God exists, that Jesus was God and man, etc. -- propositions which are made true or false by the way things actually are -- but, (2) contrary to the second view, the strength of religious belief is measured, not by the degree of one's confidence in the truth of these propositions, but rather by the way in which the value or desirability to oneself of the various ways the world could be is affected by their including or not including the truth of these religious propositions. Thus, religious belief does consist in what one values or prizes, not in what...
Intergenerational impartiality requires putting the welfare of future generations on a par with that of our own. However, rational choice requires weighting all welfare values by the respective probabilities of realization. As the risk of non-survival of mankind is strictly positive for all time periods and as the probability of non-survival is cumulative, the probability weights operate like discount factors, though they are justified on a completely different and morally defensible ground. Impartial intertemporal welfare maximization is acceptable, though the welfare of people in the very far future has a smaller effect, as the probabilities of their existence are also lower. However, the effective discount rate on future welfare values (distinct from monetary values) justified on this ground is likely to be less than 0.1 per annum. Such discounting does not compromise environmental protection and sustainability unduly. The finiteness of our universe implies that the sum of our expected welfare to infinity remains finite, solving the paradox of having to compare different infinite values in optimal growth/conservation theories.
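A minimal formalization of the weighting described above, in our own notation rather than the paper's: with per-period survival probabilities $1 - \varepsilon_s$, impartial expected-welfare maximization weights period-$t$ welfare $u_t$ by the cumulative probability that anyone exists to enjoy it,

\[
W \;=\; \sum_{t=0}^{\infty} q_t\, u_t, \qquad q_t \;=\; \prod_{s=1}^{t} (1 - \varepsilon_s),
\]
so with a constant risk $\varepsilon$ the weight $q_t = (1-\varepsilon)^t \approx e^{-\varepsilon t}$ acts like an exponential discount factor with rate $\varepsilon$, and the sum stays finite whenever $\varepsilon > 0$ and the per-period welfare values $u_t$ are bounded.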
Mann–Whitney-type causal effects are clinically relevant, easy to interpret, and readily applicable to a wide range of study settings. This article considers estimation of such effects when the outcome variable is a survival time subject to right censoring. We derive and discuss several methods: an outcome regression method based on a regression model for the survival outcome, an inverse probability weighting method based on models for treatment assignment and censoring, and two doubly robust methods that involve both types of models and that remain valid under correct specification of the outcome model or the other two models. The methods are compared in a simulation study and applied to an observational study of hospitalized pneumonia.
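For orientation (our notation, and setting aside the censoring adjustment the article builds in), a Mann–Whitney-type effect for potential survival times $T^{(1)}$ and $T^{(0)}$ is commonly defined as

\[
\theta \;=\; P\bigl(T^{(1)} > T^{(0)}\bigr) \;+\; \tfrac{1}{2}\, P\bigl(T^{(1)} = T^{(0)}\bigr),
\]
and an inverse probability weighting estimator reweights treated subjects by $1/\hat e(X)$ and controls by $1/(1-\hat e(X))$, with $\hat e(X)$ the estimated propensity score, before forming the pairwise comparison statistic; handling right censoring additionally requires weights from a model for the censoring distribution, as the abstract indicates.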
Philosophers can learn a lot about scientific methodology when great scientists square off to debate the foundations of their discipline. The Leibniz/Newton controversy over the nature of physical space and the Einstein/Bohr exchanges over quantum theory provide paradigm examples of this phenomenon. David Howie’s splendid recent book describes another philosophically laden dispute of this sort. Throughout the 1930s, R. A. Fisher and Harold Jeffreys squabbled over the methodology for the nascent discipline of statistics. Their debate has come to symbolize the controversy between the “frequentist” and “Bayesian” schools of statistical thought. Though much has been written about the Fisher/Jeffreys exchange, Howie’s book is now the definitive treatment of the subject. Though billed as a piece of history of science, it brims with philosophical insights.
Tyler Andrew Wunder, in his article, “Alvin Plantinga on Paul Draper’s evolutionary atheology: implications of theism’s non-contingency,” argues that Plantinga makes a serious error regarding probabilities in his critique of Draper. Properly modified, Wunder believes the argument “works,” but only in a trivial sense. This paper argues that Wunder’s objection, based on an assumed probability calculus, is merely asserted; whereas, there are other competing axiomatic systems consistent with Plantinga’s treatment of probability. As to the modified argument, it is demonstrated that Wunder mistakenly concludes that two key propositions are contradictory. The consequence of this is not that Plantinga’s argument “works” in a trivial sense, but rather that the argument becomes incoherent. Lastly, this paper will explore the consequences of both Wunder’s and Plantinga’s assumptions concerning conditional probability for Draper’s evidentiary argument and Plantinga’s Evolutionary Argument Against Naturalism.
The problem of simultaneous measurement of incompatible observables in quantum mechanics is studied on the one hand from the viewpoint of an axiomatic treatment of quantum mechanics and on the other hand starting from a theory of measurement. It is argued that it is precisely such a theory of measurement that should provide a meaning to the axiomatically introduced concepts, especially to the concept of observable. Defining an observable as a class of measurement procedures yielding a certain prescribed result for the probability distribution of the set of values of some quantity (to be described by the set of eigenvalues of some Hermitian operator), this notion is extended to joint probability distributions of incompatible observables. It is shown that such an extension is possible on the basis of a theory of measurement, under the proviso that in simultaneously measuring such observables there is a disturbance of the measurement results of the one observable, caused by the presence of the measuring instrument of the other observable. This has as a consequence that the joint probability distribution cannot obey the marginal distribution laws usually imposed. This result is of great importance in exposing quantum mechanics as an axiomatized theory, since overlooking it seems to prohibit an axiomatic description of simultaneous measurement of incompatible observables by quantum mechanics.
Wittgenstein was not only an inspirational figure for Schlick but also contributed to scientific philosophy as Neurath demanded. His verificationism is one instance of this, but it is also shown in his treatment of probability (where his ideas were developed further by Waismann). Wittgenstein revived Bolzano's logical interpretation of probability, anticipating Carnap and many moderns. He construed laws of nature as hypotheses that we had to assume. It is the general form of these hypotheses (what he later called a worldview) and not (pace von Wright) relative frequency that provides the basis for judgements of probability.
Intrauterine insemination (IUI) is one of many treatments provided to infertility patients. Many factors, including but not limited to semen quality, the woman's age, and reproductive hormone levels, contribute to infertility. Therefore, the aim of our study is to establish statistical predictions of which groups of patients have a very good or a poor prognosis for pregnancy after IUI. For that purpose, we compare the results of two analyses: Cluster Analysis and Kohonen Neural Networks. Among the clustering methods, the k-means algorithm was the best for selecting patients with a good prognosis, but the Kohonen Neural Network was better for selecting groups of patients with the lowest chances of pregnancy.
Several philosophers of science have claimed that the conceptual difficulties of quantum mechanics can be resolved by appealing to a particular interpretation of probability theory. For example, Popper bases his treatment of quantum mechanics on the propensity interpretation of probability, and Margenau bases his treatment of quantum mechanics on the frequency interpretation of probability. The purpose of this paper is (i) to consider and reject such claims, and (ii) to discuss the question of whether the ψ-function refers to an individual system or to an ensemble of systems.
Wittgenstein was not only an inspirational figure for Schlick but also contributed to scientific philosophy as Neurath demanded. His verificationism is one instance of this, but it is also shown in his treatment of probability. Wittgenstein revived Bolzano's logical interpretation of probability, anticipating Carnap and many moderns. He construed laws of nature as hypotheses that we had to assume. It is the general form of these hypotheses and not relative frequency that provides the basis for judgements of probability.
In this slim but excessively priced volume, Paul Horwich attempts "to exhibit a unified approach to philosophy of science, based on the concept of subjective probability... by offering new treatments of several problems... and... by providing a more complete probabilistic account of scientific methods and assumptions than has been given before". Starting with the view that beliefs are not all-or-nothing matters but rather are susceptible to varying degrees of intensity, and interpreting this via a modified Bayesian use of subjective probability, Horwich treats well-known puzzles in philosophy of science by considering the following topics: accommodation of data, statistical evidence, severe tests, surprising predictions, paradox of confirmation, the "grue" problem, simplicity, ad hoc hypotheses, diverse evidence, prediction vs. accommodation, desirability of further evidence, and realism vs. instrumentalism. Preceding the treatment of these topics is an acceptance of the Bayesian principle that "the degrees of belief of an ideally rational person conform to the mathematical principles of probability theory" and a treatment of probability theory and its various standard interpretations which comprises a quarter of the text.
Elicitation methods in decision-making under risk allow us to infer the utilities of outcomes as well as the probability weights from the observed preferences of an individual. An optimally efficient elicitation method is proposed, which takes the inevitable distortion of preferences by random errors into account and minimizes the effect of such errors on the inferred utility and probability weighting functions. Under mild assumptions, the optimally efficient method for eliciting utilities and probability weights is the following three-stage procedure. First, a probability is elicited whose subjective weight is one half. Second, the utility function is elicited through the midpoint chaining certainty equivalent method using the probability elicited at the first stage. Finally, the probability weighting function is elicited through the probability equivalent method.
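A schematic reconstruction of how the three stages fit together under rank-dependent utility (our reading of the abstract, not the authors' exact formulation), for gambles $(x, p; y)$ paying $x > y$ with probability $p$:

\begin{align*}
\text{Stage 1:}\ & \text{elicit } p_{1/2} \text{ such that } w(p_{1/2}) = \tfrac{1}{2}.\\
\text{Stage 2:}\ & \text{the certainty equivalent } c \text{ of } (x, p_{1/2}; y) \text{ then satisfies } u(c) = \tfrac{1}{2}u(x) + \tfrac{1}{2}u(y),\\
& \text{so chaining such midpoints traces out } u \text{ without knowing the rest of } w.\\
\text{Stage 3:}\ & \text{with } u \text{ in hand, indifference between a sure } c \text{ and } (x, q; y) \text{ gives } w(q) = \frac{u(c) - u(y)}{u(x) - u(y)}.
\end{align*}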
Data envelopment analysis has proven to be a powerful technique for assessing the relative performance of a set of homogeneous decision-making units. A critical feature of conventional DEA approaches is that only one or several sets of optimal virtual weights are used to aggregate the ratio performance efficiencies, and thus, the efficiency scores might be too extreme or even unrealistic. Alternatively, this paper aims at developing a new performance dominance probability approach and applying it to analyze the banking operations in China. Towards that purpose, we first propose an extended eco-inefficiency model based on the DEA methodology to address banking activities and their possible relative performances. Since the eco-inefficiency will be obtained using a set of optimal weights, we further build a performance dominance structure by considering all sets of feasible weights from a data-driven perspective. Then, we develop two pairwise eco-inefficiency dominance concepts and propose the inefficiency dominance probability model. Finally, we illustrate the eco-inefficiency dominance probability approach with 32 Chinese listed banks from 2014 to 2018 to demonstrate the usefulness and efficacy of the proposed method.
I offer an argument regarding chances that appears to yield a dilemma: either the chances at time t must be determined by the natural laws and the history through t of instantiations of categorical properties, or the function ch(•) assigning chances need not satisfy the axioms of probability. The dilemma's first horn might seem like a remnant of determinism. On the other hand, this horn might be inspired by our best scientific theories. In addition, it is entailed by the familiar view that facts about chances at t are ontologically reducible to facts about the laws and the categorical history through t. However, that laws are ontologically prior to chances stands in some tension with the view that chances are governed by laws just as categorical-property instantiations are. The dilemma's second horn entails that if chances are in fact probabilities, then this is a matter of natural law rather than logical or conceptual necessity. I conclude with a suggestion for going between the horns of the dilemma. This suggestion involves a generalization of the notion that chances evolve by conditionalization.
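The principle that "chances evolve by conditionalization", referred to at the end of the abstract, is standardly written as follows (a standard formulation, not a quotation from the paper):

\[
ch_{t'}(A) \;=\; ch_{t}\bigl(A \mid H_{t \to t'}\bigr) \qquad \text{for } t' > t,
\]
where $H_{t \to t'}$ is the complete (categorical) history of the world between $t$ and $t'$; the suggested way between the horns of the dilemma involves generalizing this principle.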
A logic for classical conditional events was investigated by Dubois and Prade. In their approach, the truth value of a conditional event may be undetermined. In this paper we extend the treatment to many-valued events. Then we support the thesis that probability over partially undetermined events is a conditional probability, and we interpret it in terms of bets in the style of de Finetti. Finally, we show that the whole investigation can be carried out in a logical and algebraic setting, and we find a logical characterization of coherence for assessments of partially undetermined events.
In intensive care, disputes sometimes arise when patients or surrogates strongly desire treatment, yet health professionals regard it as potentially inappropriate. While professional guidelines confirm that physicians are not always obliged to provide requested treatment, determining when treatment would be inappropriate is extremely challenging. One potential reason for refusing to provide a desired and potentially beneficial treatment is that doing so would harm other patients. Elsewhere in public health systems, cost-effectiveness analysis is sometimes used to decide between different priorities for funding. In this paper, we explore whether cost-effectiveness could be used to determine the appropriateness of providing intensive care. We explore a set of treatment thresholds: the probability threshold, the cost threshold, the duration threshold, and the quality threshold. One common objection to cost-effectiveness analysis is that it might lead to rationing of life-saving treatment. The analysis in this paper might be used to inform debate about the implications of applying cost-effectiveness thresholds to clinical decisions around potentially inappropriate treatment.
Making up your mind can include making up your mind about how to change your mind. Here a suggestion for coding imputations of influence into the kinematics of judgmental probabilities is applied to the treatment of Newcomb problems in The Logic of Decision framework. The suggestion is that what identifies you as treating judgmental probabilistic covariance of X and Y as measuring an influence of X on Y is constancy of your probabilities for values of Y conditionally on values of X as your judgmental probability distribution for values of X changes.
From a leading figure in the field of psychotherapy, this new book is the first dedicated to the topic of the fear of contamination. The fear of contamination is the driving force behind compulsive washing, the most common manifestation of obsessive compulsive disorder. It is one of the most extraordinary of all human fears. People who have an abnormally elevated fear of contamination over-estimate the probability and the potential seriousness of becoming contaminated. They believe that they are more susceptible than other people to contamination. People who labour under the illusion that they are particularly vulnerable to contamination are persistently anxious, excessively vigilant and highly avoidant. The fear is complex, powerful, probably universal, easily provoked, intense, and difficult to control. Usually it is caused by physical contact with a contaminant and spreads rapidly and widely. When a person feels contaminated it drives a strong urge to remove the contamination, usually by washing. The fear and subsequent urges over-ride other behaviour. A fear of contamination can also be established mentally and without physical contact. The fear can arise after exposure to violation, physical or non-physical, and from self-contamination. The book starts by defining the disorder, before considering the various manifestations of this fear, examining both mental contamination and contact contamination, and feelings of disgust. Most significantly, it develops a theory for how this problem can be treated, providing clinical guidelines based around cognitive behavioural techniques.
We discuss Herzberg’s (Theory and Decision 78:319–337, 2015) treatment of linear aggregation for profiles of infinitely many finitely additive probabilities and suggest a natural alternative to his definition of linear continuous aggregation functions. We then prove generalizations of well-known characterization results due to McConway (Journal of the American Statistical Association 76:410–414, 1981). We also characterize linear aggregation of probabilities in terms of a Pareto condition, de Finetti’s notion of coherence, and convexity.