Jenann’s central metaphysical thesis is that there is an objective conditional probability function PrG(A/B), the domain of which includes a great many, perhaps all, pairs of contingent propositions. The pair can be synchronic or diachronic: A and B can concern how things are at the same time, or at different times. Jenann’s central epistemological thesis is antiskepticism about PrG, in the following sense: prima facie, the subjective credence functions of epistemically reasonable agents converge on PrG: roughly, if you’ve done a lot of science, then for all A, B, your C(A/B) is similar to PrG(A/B). (Compare antiskepticism about perceptual knowledge: prima facie, if circumstances are good and one’s visual experience represents that p, then p.) These theses have two cool consequences: first, the possibility of a novel approach to objective Bayesianism; second, a way of doing away with dynamical laws.
The inflation of Type I error rates is thought to be one of the causes of the replication crisis. Questionable research practices such as p-hacking are thought to inflate Type I error rates above their nominal level, leading to unexpectedly high levels of false positives in the literature and, consequently, unexpectedly low replication rates. In this article, I offer an alternative view. I argue that questionable and other research practices do not usually inflate relevant Type I error rates. I begin with an introduction to Type I error rates that distinguishes them from theoretical errors. I then illustrate my argument with respect to model misspecification, multiple testing, selective inference, forking paths, exploratory analyses, p-hacking, optional stopping, double dipping, and HARKing. In each case, I demonstrate that relevant Type I error rates are not usually inflated above their nominal level, and in the rare cases that they are, the inflation is easily identified and resolved. I conclude that the replication crisis may be explained, at least in part, by researchers’ misinterpretation of statistical errors and their underestimation of theoretical errors.
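To make the notion of a "nominal level" concrete, here is a minimal Monte Carlo sketch (my own illustration, not taken from the article): under a true null hypothesis, a fixed-sample-size two-sided test at alpha = 0.05 rejects in roughly 5% of repeated experiments, which is what it means for the Type I error rate to sit at its nominal level.

```python
# Minimal Monte Carlo sketch (illustrative only): estimating a Type I error rate.
# Under a true null, a fixed-n two-sided z-test at alpha = 0.05 rejects in ~5%
# of repeated experiments -- the "nominal level" referred to above.
import numpy as np

rng = np.random.default_rng(0)
alpha, n, n_experiments = 0.05, 30, 20_000
z_crit = 1.96  # two-sided critical value for alpha = 0.05

rejections = 0
for _ in range(n_experiments):
    sample = rng.normal(loc=0.0, scale=1.0, size=n)   # null is true: mean = 0
    z = sample.mean() / (1.0 / np.sqrt(n))            # z statistic, sigma known
    if abs(z) > z_crit:
        rejections += 1

print(f"Estimated Type I error rate: {rejections / n_experiments:.3f}")  # ~0.05
```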
The major competing statistical paradigms share a common remarkable but unremarked thread: in many of their inferential applications, different probability interpretations are combined. How this plays out in different theories of inference depends on the type of question asked. We distinguish four question types: confirmation, evidence, decision, and prediction. We show that Bayesian confirmation theory mixes what are intuitively “subjective” and “objective” interpretations of probability, whereas the likelihood-based account of evidence melds three conceptions of what constitutes an “objective” probability.
I argue that when we use ‘probability’ language in epistemic contexts—e.g., when we ask how probable some hypothesis is, given the evidence available to us—we are talking about degrees of support, rather than degrees of belief. The epistemic probability of A given B is the mind-independent degree to which B supports A, not the degree to which someone with B as their evidence believes A, or the degree to which someone would or should believe A if they had B as their evidence. My central argument is that the degree-of-support interpretation lets us better model good reasoning in certain cases involving old evidence. Degree-of-belief interpretations make the wrong predictions not only about whether old evidence confirms new hypotheses, but about the values of the probabilities that enter into Bayes’ Theorem when we calculate the probability of hypotheses conditional on old evidence and new background information.
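A standard way of stating the old-evidence worry the abstract alludes to (the formulation is mine, not the author's): on a degree-of-belief reading, an agent who already fully believes the old evidence E assigns it probability 1, and then Bayes' Theorem leaves every hypothesis H untouched by conditioning on E,

\[
P(H \mid E) \;=\; \frac{P(E \mid H)\,P(H)}{P(E)} \;=\; P(H) \qquad \text{when } P(E) = 1,
\]

since \(P(E) = 1\) forces \(P(E \mid H) = 1\). A degree-of-support reading is not tied to the agent's actual credence in E, so it need not collapse in this way.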
In this paper, we illustrate some serious difficulties involved in conveying information about uncertain risks and securing informed consent for risky interventions in a clinical setting. We argue that in order to secure informed consent for a medical intervention, physicians often need to do more than report a bare, numerical probability value. When probabilities are given, securing informed consent generally requires communicating how probability expressions are to be interpreted and communicating something about the quality and quantity of the evidence for the probabilities reported. Patients may also require guidance on how probability claims may or may not be relevant to their decisions, and physicians should be ready to help patients understand these issues.
One popular approach to statistical mechanics understands statistical mechanical probabilities as measures of rational indifference. Naive formulations of this “indifference approach” face reversibility worries: while they yield the right prescriptions regarding future events, they yield the wrong prescriptions regarding past events. This paper begins by showing how the indifference approach can overcome the standard reversibility worries by appealing to the Past Hypothesis. But, the paper argues, positing a Past Hypothesis doesn’t free the indifference approach from all reversibility worries. For while appealing to the Past Hypothesis allows it to escape one kind of reversibility worry, it makes it susceptible to another: the Meta-Reversibility Objection. And there is no easy way for the indifference approach to escape the Meta-Reversibility Objection. As a result, reversibility worries pose a steep challenge to the viability of the indifference approach.
This paper develops a probabilistic analysis of conditionals which hinges on a quantitative measure of evidential support. In order to spell out the interpretation of ‘if’ suggested, we will compare it with two more familiar interpretations, the suppositional interpretation and the strict interpretation, within a formal framework which rests on fairly uncontroversial assumptions. As it will emerge, each of the three interpretations considered exhibits specific logical features that deserve separate consideration.
Current planetary defense policy prioritizes a probability assessment of risk of Earth impact by an asteroid or a comet in the planning of detection and mitigation strategies and in setting the levels of urgency and budgeting to operationalize them. The result has been a focus on asteroids of Tunguska size, which could destroy a city or a region, since this is the most likely sort of object we would need to defend against. However, a complete risk assessment would consider not only the probability of an impact but also the magnitude of its consequences, which in the case of an object of Chicxulub size could be the end of civilization or even human extinction. This paper argues that a planetary defense policy based on a complete (or one could say genuine) risk assessment would justify expenditures much higher than at present.
Why is the concept of truth so important to us? After all, it is not at all obvious why human intelligence would have evolved to do anything other than to dissimulate, deceive, cheat, and trick. Pragmatic genealogies like the genealogies of the value of truth told by Nietzsche and Williams can help us grasp why we think as we do. But instead of explaining concepts by tracing them to antecedent objects in reality, they trace them to practical needs and reverse-engineer the functions performed by the concepts.
The epistemic probability of A given B is the degree to which B evidentially supports A, or makes A plausible. This paper is a first step in answering the question of what determines the values of epistemic probabilities. I break this question into two parts: the structural question and the substantive question. Just as an object’s weight is determined by its mass and gravitational acceleration, some probabilities are determined by other, more basic ones. The structural question asks which probabilities are not determined in this way—these are the basic probabilities which determine values for all other probabilities. The substantive question asks how the values of these basic probabilities are determined. I defend an answer to the structural question on which basic probabilities are the probabilities of atomic propositions conditional on potential direct explanations. I defend this against the view, implicit in orthodox mathematical treatments of probability, that basic probabilities are the unconditional probabilities of complete worlds. I then apply my answer to the structural question to clear up common confusions in expositions of Bayesianism and shed light on the “problem of the priors.”
Many researchers want to unify probability and logic by defining logical probability or probabilistic logic in a reasonable way. This paper tries to unify statistics and logic so that we can use both statistical probability and logical probability at the same time. For this purpose, the paper proposes the P–T probability framework, which is assembled from Shannon’s statistical probability framework for communication, Kolmogorov’s probability axioms for logical probability, and Zadeh’s membership functions used as truth functions. The two kinds of probabilities are connected by an extended Bayes’ theorem, with which we can convert a likelihood function and a truth function from one to the other. Hence, we can train truth functions (in logic) by sampling distributions (in statistics). This probability framework was developed in the author’s long-term studies on semantic information, statistical learning, and color vision. The paper first proposes the P–T probability framework and explains the different probabilities in it through its applications to semantic information theory. Then, the framework and the semantic information methods are applied to statistical learning, statistical mechanics, hypothesis evaluation (including falsification), confirmation, and Bayesian reasoning. These theoretical applications illustrate the reasonableness and practicability of the framework. The framework is helpful for interpretable AI, although further study is needed to apply it to the interpretation of neural networks.
A historical review and philosophical look at the introduction of “negative probability” as well as “complex probability” is suggested. The generalization of “probability” is forced by mathematical models in physical or technical disciplines. Initially, these quantities are involved only as an auxiliary tool, complementing mathematical models so that the corresponding operations are closed. Afterwards, they acquire ontological status, especially in quantum mechanics and its formulation as a natural information theory, as “quantum information”, after the experimental confirmation of the phenomena of “entanglement”. Philosophical interpretations appear. A generalization of them is suggested: ontologically, they correspond to a relevant generalization of the relation of a part and its whole, in which the whole is a subset of the part rather than vice versa. The structure of “vector space” is necessarily involved in order to distinguish the part “by itself” from the part in relation to the whole, as a projection within it. That difference is reflected in the new dimension of the vector space, both mathematically and conceptually. Then, “negative or complex probability” is interpreted as a quantity corresponding to the generalized case in which the part can be “bigger” than the whole and is, in general, represented only partly within the whole.
A new article, published on 19 May 2020 with doctoral candidate Nguyễn Minh Hoàng, a researcher at the ISR Centre, as corresponding author, presents a Bayesian statistical approach to the study of social science data. It is an outcome of the research direction of the SDAG research group, set out as early as 18 May 2019.
Tom Stoppard’s “Rosencrantz and Guildenstern Are Dead” opens with a puzzling scene in which the title characters are betting on coin throws and observe a seemingly astonishing run of 92 heads in a row. Guildenstern grows uneasy and proposes a number of unsettling explanations for what is occurring. Then, in a sudden change of heart, he appears to suggest that there is nothing surprising about what they are witnessing, and nothing that needs any explanation. He says ‘…each individual coin spun individually is as likely to come down heads as tails and therefore should cause no surprise each time it does.’ In this article I argue that Guildenstern is right – there is nothing surprising about throwing 92 heads in a row. I go on to consider the relationship between surprise, probability and belief.
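For the record, the probability in question is easy to compute (a routine calculation, not taken from the article): for a fair coin,

\[
P(\text{92 heads in a row}) \;=\; \left(\tfrac{1}{2}\right)^{92} \;\approx\; 2 \times 10^{-28},
\]

and exactly the same value attaches to every other specific sequence of 92 tosses, which is one way of putting Guildenstern’s point that no particular outcome is, by itself, any more surprising than another.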
A measurement result is never absolutely accurate: it is affected by an unknown “measurement error” which characterizes the discrepancy between the obtained value and the “true value” of the quantity intended to be measured. As a consequence, to be acceptable a measurement result cannot take the form of a unique numerical value, but has to be accompanied by an indication of its “measurement uncertainty”, which enunciates a state of doubt. What, though, is the value of measurement uncertainty? What is its numerical value: how does one calculate it? What is its epistemic value: how should one interpret a measurement result? Firstly, we describe the statistical models that scientists make use of in contemporary metrology to perform an uncertainty analysis, and we show that the issue of the interpretation of probabilities is vigorously debated. This debate brings out epistemological issues about the nature and function of physical measurements, metrologists insisting in particular on the subjective aspect of measurement. Secondly, we examine the philosophical elaboration of metrologists in their technical works, where they criticize the use of the notion of “true value” of a physical quantity. We then challenge this elaboration and defend such a notion. The third part turns to a specific use of measurement uncertainty in order to address our theme from the perspective of precision physics, considering the activity of the adjustment of physical constants. In the course of this activity, physicists have developed a dynamic conception of the accuracy of their measurement results, oriented towards a future progress of knowledge and underlining the epistemic virtues of a never-ending process of identification and correction of measurement errors.
Assumptions of stochastic independence are crucial to statistical models in science. Under what circumstances is it reasonable to suppose that two events are independent? When they are not causally or logically connected, so the standard story goes. But scientific models frequently treat causally dependent events as stochastically independent, raising the question whether there are kinds of causal connection that do not undermine stochastic independence. This paper provides one piece of an answer to this question, treating the simple case of two tossed coins with and without a midair collision.
Some Stochastic Processes with Jumps. Hoàng Thị Phương Thảo. Doctoral dissertation. VNU University of Science, Vietnam National University, Hanoi. Hanoi, 2015.
This doctoral dissertation investigates the notion of physical necessity. Specifically, it studies whether it is possible to account for non-accidental regularities without the standard assumption of a pre-existent set of governing laws. It thus sides with the so-called deflationist accounts of laws of nature, such as the Humean or the antirealist. The specific aim is to complement such accounts by providing a missing explanation of the appearance of physical necessity. In order to provide an explanation, I turn to fields that have not so far been appealed to in discussions about the metaphysics of laws, namely complex systems theory and the foundations of statistical mechanics. The explanation proposed is inspired by how complex systems theory has elucidated the way patterns emerge, and by the probabilistic explanations of the second law of thermodynamics. More specifically, this thesis studies how some constraints that make no direct reference to the dynamics can be a sufficient condition for obtaining, in the long run and with high probability, stable regular behavior. I hope to show how certain metaphysical accounts of laws might benefit from the insights achieved in these other fields. According to the proposal studied in this thesis, some regularities are non-accidental, but not in virtue of an underlying physical necessity: the non-accidental character of certain regular behavior is due only to its overwhelming stability. From this point of view, the goal becomes to explain the stability of temporal patterns without assuming a set of pre-existent guiding laws. It is argued that this stability can be the result of a process of convergence to simpler and stable regularities from a more complex lower level. If the project is successful, there is no need to postulate a (mysterious) intermediate category between logical necessity and pure contingency, and similarly no need to postulate a (mysterious) set of pre-existent governing laws. Part I of the thesis motivates Part II, mostly by arguing why further explanation of the notions of physical necessity and governing laws should be welcomed (chapter 1), and by studying the plausibility of a lawless fundamental level (chapters 2 and 3). Part II then develops the explanation of the formation of simpler and stable behavior from a more complex underlying level.
A possible event always seems to be more probable than an impossible event. Although this constraint, usually alluded to as regularity, is prima facie very attractive, it cannot hold for standard probabilities. Moreover, in a recent paper Timothy Williamson has challenged even the idea that regularity can be integrated into a comparative conception of probability, by showing that the standard comparative axioms conflict with certain cases if regularity is assumed. In this note, we suggest that there is a natural weakening of the standard comparative axioms. It is shown that these axioms are consistent both with the regularity condition and with the essential feature of Williamson’s example.
The opening section outlines probabilism in twentieth-century philosophy and briefly discusses the major accomplishments of Polish probabilist thinkers. A concise characterization of Bayesianism as the major recent form of probabilism follows; it builds upon the core personalist version of Bayesianism and moves towards more objectively oriented versions thereof. The problem of a priori probability is briefly discussed. A tentative characterization of Kazimierz Ajdukiewicz’s standpoint regarding inductive inference is cast in Bayesian terms, and his objections against it presented in Pragmatic Logic are discussed. His 1958 paper on the justification of non-deductive inference, as amply demonstrated by K. Szaniawski and I. Niiniluoto, extends his earlier Bayesian position from the 1928 monograph. In the closing section Ajdukiewicz’s standpoint is presented as a characteristically pragmatist and empiricist version of Bayesianism, which remains an unexplored and stimulating position.
The formal representation of the strength of witness testimony has been historically tied to a formula — proposed by Condorcet — that uses a factor representing the reliability of an individual witness. This approach encourages a false dilemma between hyper-scepticism about testimony, especially testimony to extraordinary events such as miracles, and an overly sanguine estimate of reliability based on insufficiently detailed evidence. Because Condorcet’s formula does not have the resources for representing numerous epistemically relevant details of the unique situation in which testimony is given, many late nineteenth-century thinkers like Venn turned away from the probabilistic analysis of testimony altogether. But a more nuanced approach using Bayes factors provides a better, more flexible formalism for representing the evidential force of testimony.
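In its simplest form the Bayes-factor approach works as follows (a standard textbook formulation, in my own notation; the paper may develop it differently). Where H is the reported event and T the fact that the witness testified to it,

\[
\underbrace{\frac{P(H \mid T)}{P(\lnot H \mid T)}}_{\text{posterior odds}}
\;=\;
\underbrace{\frac{P(H)}{P(\lnot H)}}_{\text{prior odds}}
\times
\underbrace{\frac{P(T \mid H)}{P(T \mid \lnot H)}}_{\text{Bayes factor}} .
\]

The Bayes factor can be assessed in light of whatever details of the particular testimonial situation are relevant, rather than being fixed by a single witness-reliability parameter, which is what gives the formalism its extra flexibility.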
Hans Reichenbach was not only one of the founding fathers of logical empiricism but also one of the most prominent figures in the philosophy of science of the past century. While some of his ideas continue to be of interest in current philosophical programs, an important part of his early work has been neglected, and some of it has been unavailable to English readers. Among Reichenbach’s overlooked (and untranslated) early works, his doctoral thesis of 1915, The Concept of Probability in the Mathematical Representation of Reality, deserves special attention, both for the topics covered and for its significance for a proper understanding of his intellectual trajectory. This volume anticipates most of the fundamental themes of his later philosophy. In particular, it addresses the issue of the application of probability statements to reality, as well as the relationship between probability and causality—questions that were at the core of his research throughout his life.
The Bayesian model has been used in psychology as the standard reference for the study of probability revision. In the first part of this paper we show that this traditional choice restricts the scope of the experimental investigation of revision to a stable universe. This is the situation that, technically, is known as focusing. We argue that it is essential for a better understanding of human probability revision to consider another situation called updating (Katsuno & Mendelzon, 1992), in which the universe is evolving. In that case the structure of the universe has definitely been transformed, and the revision message conveys information on the resulting universe. The second part of the paper presents four experiments based on the Monty Hall puzzle that aim to show that updating is a natural frame for individuals to revise their beliefs.
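As a reminder of the puzzle's probabilistic structure (my own illustrative sketch; the paper's experiments concern variants of the scenario, not this code), a short simulation shows why switching doors wins about two thirds of the time:

```python
# Minimal Monty Hall simulation: switching wins ~2/3 of games, staying ~1/3.
import random

def play(switch: bool, doors=(0, 1, 2)) -> bool:
    car = random.choice(doors)                 # door hiding the car
    pick = random.choice(doors)                # contestant's initial pick
    # Host opens a door that is neither the pick nor the car.
    opened = random.choice([d for d in doors if d != pick and d != car])
    if switch:
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == car

trials = 100_000
print("stay  :", sum(play(False) for _ in range(trials)) / trials)  # ~0.33
print("switch:", sum(play(True)  for _ in range(trials)) / trials)  # ~0.67
```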
Objective interpretations of probability are usually discussed in two varieties: frequency and propensity accounts. But there is a third, neglected possibility, namely, probabilities as deriving from ranges in suitably structured initial state spaces. Roughly, the probability of an event is the proportion of initial states that lead to this event in the space of all possible initial states, provided that this proportion is approximately the same in any not too small interval of the initial state space. This idea can also be expressed by saying that in the types of situations that give rise to probabilistic phenomena we may expect to find an initial state space such that any “reasonable” density function over this space leads to the same probabilities for the possible outcomes. This “method of arbitrary functions” was introduced by Poincaré, studied and extended by Hopf, and more recently by Eduardo Engel, Jan von Plato and Michael Strevens. The natural-range, or method-of-arbitrary-functions, approach to probabilities is usually treated as an explanation for the occurrence of probabilistic patterns, whereas I examine its prospects for an objective interpretation of probability, in the sense of providing truth conditions for probability statements that do not depend on our state of mind or information. The main objection to such a proposal is that it is circular, i.e. presupposes the concept of probability, because a measure on the initial state space has to be introduced, and density functions over the space are considered. I try to argue that this objection can be successfully met.
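A toy numerical illustration of the range idea (my own sketch, not from the paper): take a wheel of fortune whose outcome alternates rapidly between red and black as the initial spin speed varies; any reasonably smooth density over initial speeds then assigns red a probability close to 1/2.

```python
# Illustrative sketch of the "method of arbitrary functions" idea:
# the outcome (red/black) alternates over many fine cells of initial speed,
# so any smooth density over initial speeds gives red probability ~1/2.
import numpy as np

def prob_red(density, n_cells=1000, pts_per_cell=50):
    # Initial speeds in [0, 1); even-numbered fine cells land on red.
    x = np.linspace(0.0, 1.0, n_cells * pts_per_cell, endpoint=False)
    red = (np.floor(x * n_cells) % 2 == 0)
    w = density(x)
    return float(np.sum(w * red) / np.sum(w))

uniform = lambda x: np.ones_like(x)
skewed  = lambda x: 0.2 + 1.6 * x          # a quite different, but smooth, density

print(prob_red(uniform))  # ~0.5
print(prob_red(skewed))   # ~0.5
```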
How are we to understand the use of probability in corroboration functions? Popper says logically, but does not show we could have access to, or even calculate, probability values in a logical sense. This makes the logical interpretation untenable, as Ramsey and van Fraassen have argued. If corroboration functions only make sense when the probabilities employed therein are subjective, however, then what counts as impressive evidence for a theory might be a matter of convention, or even whim. So isn’t so-called ‘corroboration’ just a matter of psychology? In this paper, I argue that we can go some way towards addressing this objection by adopting an intersubjective interpretation, of the form advocated by Gillies, with respect to corroboration. I show why intersubjective probabilities are preferable to subjective ones when it comes to decision making in science: why group decisions are liable to be superior to individual ones, given a number of plausible conditions. I then argue that intersubjective corroboration is preferable to intersubjective confirmation of a Bayesian variety, because there is greater opportunity for principled agreement concerning the factors involved in the former.
I shall argue that there is no such property of an event as its “probability.” This is why standard interpretations cannot give a sound definition in empirical terms of what “probability” is, and this is why empirical sciences like physics can manage without such a definition. “Probability” is a collective term, the meaning of which varies from context to context: it means different — dimensionless, [0, 1]-valued — physical quantities characterising the different particular situations. In other words, probability is a reducible concept, supervening on physical quantities characterising the state of affairs corresponding to the event in question. On the other hand, however, these “probability-like” physical quantities correspond to objective features of the physical world, and are objectively related to measurable quantities like relative frequencies of physical events based on finite samples — no matter whether the world is objectively deterministic or indeterministic.
Much is asked of the concept of chance. It has been thought to play various roles, some in tension with or even incompatible with others. Chance has been characterized negatively, as the absence of causation; yet also positively—the ancient Greek τύχη reifies it—as a cause of events that are not governed by laws of nature, or as a feature of the laws themselves. Chance events have been understood epistemically as those whose causes are unknown; yet also objectively as a distinct ontological kind, sometimes called ‘pure’ chance events. Chance gives rise to individual unpredictability and disorder; yet it yields collective predictability and order—stable long-run statistics, and in the limit, aggregate behavior susceptible to precise mathematical theorems. Some authors believe that to posit chances is to abjure explanation; yet others think that chances are themselves explanatory. During the Enlightenment, talk of ‘chance’ was regarded as unscientific, unphilosophical, the stuff of superstition or ignorance; yet today it is often taken to be a fundamental notion of our most successful scientific theory, quantum mechanics, and a central concept of contemporary metaphysics. Chance has both negative and positive associations in daily life. The old word in English for it, hazard, which derives from French and originally from Arabic, still has unwelcome connotations of risk; ‘chance’ evokes uncertainty, uncontrollability, and chaos. Yet chance is also allied with luck, fortune, freedom from constraint, and diversity. And it apparently has various practical uses and benefits. It forms the basis of randomized trials in statistics, and of mixed strategies in decision theory and game theory; it is appealed to in order to resolve problems of fair division and other ethical problems.
The essays that constitute this dissertation explore three strategies for understanding the role of modality in philosophical accounts of propensities, randomness, and causation. In Chapter 1, I discuss how the following essays are to be considered as illuminating the prospects for these strategies, which I call reductive essentialism, subjectivism and pragmatism. The discussion is framed within a survey of approaches to modality more broadly construed. In Chapter 2, I argue that any broadly dispositional analysis of probability as a physical property will either fail to give an adequate explication of probability, or else will fail to provide an explication that can be gainfully employed elsewhere. The diversity and number of arguments suggests that there is little prospect of any successful analysis along these lines. The concept of randomness has been unjustly neglected in recent philosophical literature, and when philosophers have thought about it, they have usually acquiesced in views about the concept that are fundamentally flawed. In Chapter 3 I try to redress this. After indicating the ways in which the existing accounts are flawed, I propose that randomness is to be understood as a special case of the epistemic concept of the unpredictability of a process. This proposal arguably captures the intuitive desiderata for the concept of randomness; at least it should suggest that the commonly accepted accounts cannot be the whole story and more philosophical attention needs to be paid. Russell famously argued that causation should be dispensed with. He gave two explicit arguments for this conclusion, which can be defused if we loosen the ties between causation and determinism. In Chapter 4, I define a concept of causation which meets Russell’s conditions but does not reduce to triviality. Unfortunately, a further serious problem is implicit beneath the details of Russell’s arguments, which I call the causal exclusion problem. Meeting this problem involves deploying a pragmatic account of the nature and function of modal concepts. Russell’s scruples about causation can be accommodated, even as we partially legitimise the pervasiveness of causal explanations in folk and scientific practice.
In this paper the strategy for the eliminative reduction of the alethic modalities suggested by John Venn is outlined, and it is shown to anticipate certain related contemporary empiricistic and nominalistic projects. Venn attempted to reduce the alethic modalities to probabilities, and thus suggested a promising solution to the nagging issue of the inclusion of modal statements in empiricistic philosophical systems. However, despite the promise that this suggestion held for laying the ‘ghost of modality’ to rest, this general approach, tempered modal eliminativism, is shown to be inadequate for that task.
This dissertation looks at a set of interconnected questions concerning the foundations of probability, and gives a series of interconnected answers. At its core is a piece of old-fashioned philosophical analysis, working out what probability is, or, equivalently, investigating the semantic question: what is the meaning of ‘probability’? Like Keynes and Carnap, I say that probability is degree of reasonable belief. This immediately raises an epistemological question: which degrees count as reasonable? To solve that in its full generality would mean the end of human inquiry, so it won’t be attempted here. Rather, I will follow tradition and merely investigate which sets of partial beliefs are coherent. The standard answer to this question, commonly called the Bayesian answer, says that degrees of belief are coherent iff they form a probability function. I disagree with the way this is usually justified, but subject to an important qualification I accept the answer. The important qualification is that degrees of belief may be imprecise, or vague. Part one of the dissertation, chapters 1 to 6, looks largely at the consequences of this qualification for the semantic and epistemological questions already mentioned. It turns out that when we allow degrees of belief to be imprecise, we can discharge potentially fatal objections to some philosophically attractive theses. Two of these, that probability is degree of reasonable belief and that the probability calculus provides coherence constraints on partial beliefs, have been mentioned. Others include the claim, defended in chapter 4, that chance is probability given total history. As well as these semantic and epistemological questions, studies of the foundations of probability usually include a detailed discussion of decision theory. For reasons set out in chapter 2, I deny that we can gain epistemological insights from decision theory. Nevertheless, it is an interesting field to study on its own, and it might be expected that there would be decision-theoretic consequences of allowing imprecise degrees of belief. As I show in part two, this expectation seems to be mistaken. Chapter 9 shows that there aren’t interesting consequences of this theory for decision theory proper, and chapters 10 and 11 show that Keynes’s attempt to use imprecision in degrees of belief to derive a distinctive theory of interest rates is unsound. Chapters 7 and 8 provide a link between these two parts. In chapter 7 I look at some previous philosophical investigations into the effects of imprecision. In chapter 8 I develop what I take to be the best competitor to the theory defended here: a constructivist theory of probability. On this view degrees of belief are precise, but the relevant coherence constraint is a constructivist probability calculus. This view is, I think, mistaken, but the calculus has some intrinsic interest, and there are at least enough arguments for it to warrant a chapter-length examination.
My title is intended to recall Terence Fine’s excellent survey, Theories of Probability [1973]. I shall consider some developments that have occurred in the intervening years, and try to place some of the theories he discussed in what is now a slightly longer perspective. Completeness is not something one can reasonably hope to achieve in a journal article, and any selection is bound to reflect a view of what is salient. In a subject as prone to dispute as this, there will inevitably be many who will disagree with any author’s views, and I take the opportunity to apologize in advance to all such people for what they will see as the narrowness and distortion of mine.
Philosophy of Probability provides a comprehensive introduction to theoretical issues that occupy a central position in disciplines ranging from philosophy of ...
Probability is not an unambiguous concept within the sciences or in vernacular language, yet it is fundamental to much of behavior analysis. The present paper examines some problems this ambiguity creates in general, as well as within the experimental analysis of behavior in particular. As background material, we first introduce the three most common theories of probability in mathematics and science, discussing their advantages and disadvantages, and their relevance to behavior analysis. Next, we discuss the concept of probability as encountered in the writings of B. F. Skinner and in contemporary behavior analysis more generally, the latter being based on material drawn from the professional literature and from a questionnaire survey. Although the exercise is basically a descriptive one, we do conclude with some suggestions that may promote more effective action on those occasions when behavior analysts speak of “probability.”
This paper attempts to define Exploratory Data Analysis (EDA) more precisely than usual, and to produce the beginnings of a philosophy of this topical and somewhat novel branch of statistics. A data set is, roughly speaking, a collection of k-tuples for some k. In both descriptive statistics and in EDA, these k-tuples, or functions of them, are represented in a manner matched to human and computer abilities with a view to finding patterns that are not “kinkera”. A kinkus is a pattern that has a negligible probability of being even partly potentially explicable. A potentially explicable pattern is one for which there probably exists a hypothesis of adequate “explicativity”, which is another technical probabilistic concept. A pattern can be judged to be probably potentially explicable even if we cannot find an explanation. The theory of probability understood here is one of partially ordered (interval-valued), subjective (personal) probabilities. Among other topics relevant to a philosophy of EDA are the “reduction” of data; Francis Bacon’s philosophy of science; the automatic formulation of hypotheses; successive deepening of hypotheses; neurophysiology; and rationality of type II.
I. WHAT IS PROBABILITY? Style manuals advise us that the proper way to begin a piece of expository writing is to introduce and identify clearly the subject ...