In this paper I discuss probabilistic models of experimental intervention, and I show that such models elucidate the intuition that observations made under intervention are more informative than mere observations. Because of this success, it seems attractive to cast other problems addressed by the philosophy of experimentation in terms of such probabilistic models as well. However, a critical examination of the models reveals that some aspects of experimentation are covered up rather than resolved by probabilistic modelling. I end by drawing a number of general lessons on the use of formal methods in the philosophy of science.
This paper investigates the viability of the Bayesian model of belief change. Van Benthem (2003) has shown that a particular kind of information change typical for dynamic epistemic logic cannot be modelled by Bayesian conditioning. I argue that the problems described by van Benthem come about because the information change alters the semantics in which the change is supposed to be modelled by conditioning: it induces a shift in meanings. I then show that meaning shifts can be modelled in terms of conditioning by employing a semantics that makes these changes in meaning explicit, and that the appropriate probability kinematics can be described by Dempster’s rule. The new model thereby facilitates a better mutual understanding between probabilistic epistemology and dynamic epistemic logic.
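The abstract invokes Dempster’s rule for the appropriate probability kinematics. As a point of reference, here is a minimal sketch of Dempster’s rule of combination for two mass functions over a finite frame of possible worlds; the encoding (frozensets of worlds) and the weather example are illustrative choices, not taken from the paper.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions.

    Each mass function maps frozensets of worlds to masses summing to 1.
    Mass landing on the empty intersection (the conflict) is renormalised away.
    """
    combined = {}
    conflict = 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y
    if conflict >= 1.0:
        raise ValueError("total conflict: the two sources are incompatible")
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

# Two sources of evidence over the frame {'rain', 'sun'} (illustrative only):
m1 = {frozenset({'rain'}): 0.6, frozenset({'rain', 'sun'}): 0.4}
m2 = {frozenset({'rain'}): 0.3, frozenset({'sun'}): 0.5,
      frozenset({'rain', 'sun'}): 0.2}
m = dempster_combine(m1, m2)
```

Bayesian conditioning falls out as the special case where one mass function concentrates all its mass on a single set; the paper’s point is that the more general rule is needed once meaning shifts are made explicit in the semantics.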
An inductive logic is a system of inference that describes the relation between propositions on data, and propositions that extend beyond the data, such as predictions over future data, and general conclusions on all possible data. Statistics, on the other hand, is a mathematical discipline that describes procedures for deriving results about a population from sample data. These results include predictions on future samples, decisions on rejecting or accepting a hypothesis about the population, the determination of probability assignments over such hypotheses, the selection of a statistical model for studying the population, and so on. Both inductive logic and statistics are calculi for getting from the given data to propositions or results that transcend the data.
Van Fraassen's Judy Benjamin problem has generally been taken to show that not all rational changes of belief can be modelled in a probabilistic framework if the available update rules are restricted to Bayes's rule and Jeffrey's generalization thereof. But alternative rules based on distance functions between probability assignments that allegedly can handle the problem seem to have counterintuitive consequences. Taking our cue from a recent proposal by Bradley, we argue that Jeffrey's rule can solve the Judy Benjamin problem after all. Moreover, we show that the specific instance of Jeffrey's rule that solves the Judy Benjamin problem can be underpinned by a particular distance function. Finally, we extend the set of distance functions to ones that take into account the varying degrees to which propositions may be epistemically entrenched.
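For readers unfamiliar with Jeffrey's generalization of Bayes's rule, a minimal sketch may help: given a partition E_1, …, E_n whose cells receive new probabilities q_1, …, q_n, the updated probability of a world is q_i times its old probability conditional on the cell E_i containing it. The four-world example below is purely illustrative and not taken from the paper.

```python
def jeffrey_update(prior, partition, new_probs):
    """Jeffrey conditioning: P_new(w) = q_i * P(w | E_i) for w in E_i.

    prior: dict mapping worlds to probabilities (summing to 1).
    partition: list of sets of worlds, the evidence partition E_1..E_n.
    new_probs: list of new cell probabilities q_1..q_n (summing to 1).
    """
    posterior = {w: 0.0 for w in prior}
    for cell, q in zip(partition, new_probs):
        cell_mass = sum(prior[w] for w in cell)
        for w in cell:
            posterior[w] += q * prior[w] / cell_mass
    return posterior

# Four equiprobable worlds; uncertain evidence raises the first cell
# from probability 0.5 to 0.8:
prior = {'w1': 0.25, 'w2': 0.25, 'w3': 0.25, 'w4': 0.25}
partition = [{'w1', 'w2'}, {'w3', 'w4'}]
post = jeffrey_update(prior, partition, [0.8, 0.2])
```

The defining feature of the rule is rigidity: probabilities conditional on each cell of the partition are left unchanged, only the cell probabilities themselves shift.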
This article presents a generalization of the Condorcet Jury Theorem. All results to date assume a fixed value for the competence of jurors or, alternatively, a fixed probability distribution over the possible competences of jurors. In this article, we develop the idea that we can learn the competence of the jurors from the jury vote. We assume a uniform prior probability assignment over the competence parameter, and we adapt this assignment in the light of the jury vote. We then compute the posterior probability, conditional on the jury vote, of the hypothesis being voted on. We thereby retain the central results of Condorcet, but we also show that the posterior probability depends on the size of the jury as well as on the absolute margin of the majority.
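The dependence of the posterior on jury size as well as on absolute margin can be illustrated numerically. The sketch below assumes, for illustration, a 50/50 prior on the hypothesis and juror competence p uniform on [1/2, 1] (the classical Condorcet assumption that jurors do better than chance); the paper's own prior may be set up differently.

```python
def posterior_given_vote(votes_for, votes_against, grid=100000):
    """Posterior probability of the hypothesis given the jury vote,
    with competence p uniform on [0.5, 1], a 50/50 prior on the
    hypothesis, and independent votes. Computed by midpoint-rule
    numerical integration over the competence parameter.
    """
    h, t = votes_for, votes_against
    like_true = like_false = 0.0
    for i in range(grid):
        p = 0.5 + (i + 0.5) * 0.5 / grid  # midpoint of each subinterval
        like_true += p ** h * (1 - p) ** t   # likelihood if hypothesis true
        like_false += p ** t * (1 - p) ** h  # likelihood if hypothesis false
    return like_true / (like_true + like_false)

# Same absolute margin (2), different jury sizes:
small = posterior_given_vote(3, 1)   # jury of 4, split 3-1
large = posterior_given_vote(6, 4)   # jury of 10, split 6-4
```

With a fixed known competence the posterior would depend only on the margin; here the jury of 10 splitting 6–4 yields a lower posterior than the jury of 4 splitting 3–1, because the wider split is evidence of lower competence.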
Until now, antirealists have offered sketches of a theory of truth, at best. In this paper, we present a probabilist account of antirealist truth in some formal detail, and we assess its ability to deal with the problems that are standardly taken to beset antirealism.
In V. N. Huynh (ed.): Interval / Probabilistic Uncertainty and Non-Classical Logics, Advances in Soft Computing Series, Springer 2008, pp. 268-279. This paper proposes a common framework for various probabilistic logics. It consists of a set of uncertain premises with probabilities attached to them. This raises the question of the strength of a conclusion, but without imposing a particular semantics, no general solution is possible. The paper discusses several possible semantics, approaching the question from the perspective of probabilistic argumentation.
This paper presents the progicnet programme. It proposes a general framework for probabilistic logic that can guide inference based on both logical and probabilistic input. After an introduction to the framework as such, it is illustrated by means of a toy example from psychometrics. It is shown that the framework can accommodate a number of approaches to probabilistic reasoning: Bayesian statistical inference, evidential probability, probabilistic argumentation, and objective Bayesianism. The framework thus provides insight into the relations between these approaches, illustrates how the results of different approaches can be combined, and provides a basis for doing efficient inference in each of the approaches.
List and Pettit have stated an impossibility theorem about the aggregation of individual opinion states. Building on recent work on the lottery paradox, this paper offers a variation on that result. The present result places different constraints on the voting agenda and the domain of profiles, but it covers a larger class of voting rules, which need not satisfy the proposition-wise independence of votes.
This paper concerns exchangeable analogical predictions based on similarity relations between predicates, and deals with a restricted class of such relations. It describes a system of Carnapian λγ rules on underlying predicate families to model the analogical predictions for this restricted class. Instead of the usual axiomatic definition, the system is characterized with a Bayesian model that employs certain statistical hypotheses. Finally, the paper argues that the Bayesian model can be generalized to cover cases outside the restricted class of similarity relations.
This article argues that time-asymmetric processes in spacetime are enantiomorphs. Subsequently, the Kantian puzzle concerning enantiomorphs in space is reviewed to introduce a number of positions concerning enantiomorphy, and to arrive at a dilemma: one must either reject that orientations of enantiomorphs are determinate, or furnish space or objects with orientation. The discussion on space is then used to derive two problems in the debate on the direction of time. First, it is shown that certain kinds of reductionism about the direction of time are at variance with the claim that the orientation of enantiomorphic objects is intrinsic. Second, it is argued that reductive explanations of time-asymmetric processes presuppose that enantiomorphic processes do not have determinate orientation.
This paper studies the use of hypotheses schemes in generating inductive predictions. After discussing Carnap–Hintikka inductive logic, hypotheses schemes are defined and illustrated with two partitions. One partition results in the Carnapian continuum of inductive methods, the other results in predictions typical of hasty generalization. Following these examples I argue that choosing a partition comes down to making inductive assumptions on patterns in the data, and that by choosing appropriately any inductive assumption can be made. Further considerations on partitions make clear that they do not suggest any solution to the problem of induction. Hypotheses schemes provide the tools for making inductive assumptions, but they also reveal the need for such assumptions.
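The Carnapian continuum of inductive methods mentioned here has a simple closed form for predictive probabilities, which may help fix ideas; the partition-based derivation the paper discusses is not reproduced in this sketch.

```python
def carnap_prediction(counts, category, lam):
    """Carnap's lambda-continuum of inductive methods: the predictive
    probability that the next observation falls in `category`, given
    observed counts per category, is

        P(next = i) = (n_i + lam/k) / (n + lam),

    where k is the number of categories, n the total sample size, and
    lam >= 0 weights the a priori uniform assignment against the data.
    """
    k = len(counts)
    n = sum(counts.values())
    return (counts[category] + lam / k) / (n + lam)

# Two categories, five observations, three of them 'heads'.
# With lam = k = 2 this reduces to Laplace's rule of succession:
counts = {'heads': 3, 'tails': 2}
p = carnap_prediction(counts, 'heads', lam=2)  # (3 + 1) / (5 + 2)
```

The parameter λ fixes the inductive assumption: λ = 0 gives the straight rule (pure generalization from observed frequencies), while λ → ∞ ignores the data entirely, which is one way of seeing the paper's point that hypotheses schemes encode, rather than eliminate, inductive assumptions.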