This paper offers a new angle on the common idea that the process of science does not support epistemic diversity. Under minimal assumptions on the nature of journal editing, we prove that editorial procedures, even when impartial in themselves, disadvantage less prominent research programs. This purely statistical bias in article selection further skews existing differences in the success rate and hence attractiveness of research programs, and exacerbates the reputation difference between the programs. After a discussion of the modeling assumptions, the paper ends with a number of recommendations that may help promote scientific diversity through editorial decision making.
This paper investigates the viability of the Bayesian model of belief change. Van Benthem (2003) has shown that a particular kind of information change typical for dynamic epistemic logic cannot be modelled by Bayesian conditioning. I argue that the problems described by van Benthem come about because the information change alters the semantics in which the change is supposed to be modelled by conditioning: it induces a shift in meanings. I then show that meaning shifts can be modelled in terms of conditioning by employing a semantics that makes these changes in meaning explicit, and that the appropriate probability kinematics can be described by Dempster’s rule. The new model thereby facilitates a better mutual understanding between probabilistic epistemology and dynamic epistemic logic.
List and Pettit have stated an impossibility theorem about the aggregation of individual opinion states. Building on recent work on the lottery paradox, this paper offers a variation on that result. The present result places different constraints on the voting agenda and the domain of profiles, but it covers a larger class of voting rules, which need not satisfy the proposition-wise independence of votes.
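The kind of collective inconsistency behind such impossibility results can be illustrated with the classic discursive dilemma (a standard textbook example, not the paper's own theorem): proposition-wise majority voting on logically connected propositions can yield an inconsistent group judgment even when every individual is consistent.

```python
from itertools import product

# Three voters judge p, q, and the conjunction p & q.
# Each individual judgment set is logically consistent.
voters = [
    {"p": True,  "q": True,  "p&q": True},
    {"p": True,  "q": False, "p&q": False},
    {"p": False, "q": True,  "p&q": False},
]

def majority(prop):
    """Proposition-wise majority vote on a single proposition."""
    yes = sum(v[prop] for v in voters)
    return yes > len(voters) / 2

collective = {prop: majority(prop) for prop in ["p", "q", "p&q"]}

def consistent(judgment):
    """Check whether a judgment set on p, q, p&q is satisfiable."""
    return any(
        judgment["p"] == p and judgment["q"] == q and judgment["p&q"] == (p and q)
        for p, q in product([True, False], repeat=2)
    )

print(collective)                           # {'p': True, 'q': True, 'p&q': False}
print(all(consistent(v) for v in voters))   # True: every voter is consistent
print(consistent(collective))               # False: the majority judgment is not
```

The collective accepts p and q by a 2-to-1 margin but rejects their conjunction, which no consistent individual could do.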
The frequent occurrence of comorbidity has brought about an extensive theoretical debate in psychiatry. Why are the rates of psychiatric comorbidity so high and what are their implications for the ontological and epistemological status of comorbid psychiatric diseases? Current explanations focus either on classification choices or on causal ties between disorders. Based on empirical and philosophical arguments, we propose a conventionalist interpretation of psychiatric comorbidity instead. We argue that a conventionalist approach fits well with research and clinical practice and resolves two problems for the status of psychiatric diseases: the experimenter’s regress and arbitrariness.
This paper concerns exchangeable analogical predictions based on similarity relations between predicates, and deals with a restricted class of such relations. It describes a system of Carnapian λγ rules on underlying predicate families to model the analogical predictions for this restricted class. Instead of the usual axiomatic definition, the system is characterized with a Bayesian model that employs certain statistical hypotheses. Finally, the paper argues that the Bayesian model can be generalized to cover cases outside the restricted class of similarity relations.
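The predictive rule behind Carnapian λγ systems can be sketched in a few lines (my own minimal illustration; the paper's exchangeable analogical systems built on predicate families are considerably richer): the probability of seeing category i next mixes the observed relative frequency with a prior weight γ_i, with λ setting the balance.

```python
def lambda_gamma_prediction(counts, gammas, lam):
    """Carnapian lambda-gamma rule: P(next = i) = (n_i + lam * gamma_i) / (n + lam),
    where n_i is the observed count for category i and n the total sample size."""
    n = sum(counts)
    return [(counts[i] + lam * gammas[i]) / (n + lam) for i in range(len(counts))]

# Two categories with equal prior weights gamma = 1/2 and lambda = 2
# reproduce Laplace's rule of succession: (n_i + 1) / (n + 2).
probs = lambda_gamma_prediction([3, 1], [0.5, 0.5], 2.0)
print(probs)  # [(3+1)/6, (1+1)/6] = [0.666..., 0.333...]
```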
In this paper I discuss probabilistic models of experimental intervention, and I show that such models elucidate the intuition that observations during intervention are more informative than observations per se. Because of this success, it seems attractive to also cast other problems addressed by the philosophy of experimentation in terms of such probabilistic models. However, a critical examination of the models reveals that some of the aspects of experimentation are covered up rather than resolved by probabilistic modelling. I end by drawing a number of general lessons on the use of formal methods in the philosophy of science.
We present a conservative extension of a Bayesian account of confirmation that can deal with the problem of old evidence and new theories. So-called open-minded Bayesianism challenges the assumption—implicit in standard Bayesianism—that the correct empirical hypothesis is among the ones currently under consideration. It requires the inclusion of a catch-all hypothesis, which is characterized by means of sets of probability assignments. Upon the introduction of a new theory, the former catch-all is decomposed into a new empirical hypothesis and a new catch-all. As will be seen, this motivates a second update rule, besides Bayes’ rule, for updating probabilities in light of a new theory. This rule conserves probability ratios among the old hypotheses. This framework allows for old evidence to confirm a new hypothesis due to a shift in the theoretical context. The result is a version of Bayesianism that, in the words of Earman, “keep[s] an open mind, but not so open that your brain falls out”. (shrink)
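One way the ratio-conserving update can be sketched is as follows (a schematic reconstruction for a single probability assignment; the paper's framework works with sets of assignments, which this sketch ignores, and the masses given to the new theory and new catch-all are illustrative inputs): the catch-all is decomposed, and the old empirical hypotheses are rescaled by a common factor so that their mutual ratios are untouched.

```python
def introduce_theory(old_hyps, old_catchall, new_mass, new_catchall):
    """Redistribute probability when a new theory is carved out of the catch-all.
    old_hyps: dict of hypothesis -> probability; old_catchall: its probability.
    new_mass / new_catchall: masses for the new theory and the new catch-all
    (illustrative choices, not dictated by the framework). Old hypotheses are
    rescaled by one common factor, so their mutual ratios are conserved."""
    assert abs(sum(old_hyps.values()) + old_catchall - 1.0) < 1e-9
    scale = (1.0 - new_mass - new_catchall) / sum(old_hyps.values())
    new_hyps = {h: p * scale for h, p in old_hyps.items()}
    new_hyps["T_new"] = new_mass
    return new_hyps, new_catchall

old = {"H1": 0.4, "H2": 0.2}
hyps, catchall = introduce_theory(old, old_catchall=0.4, new_mass=0.25, new_catchall=0.15)
ratio = hyps["H1"] / hyps["H2"]
print(ratio)  # ~2.0: the ratio H1 : H2 survives the update
print(sum(hyps.values()) + catchall)  # ~1.0: probabilities still sum to one
```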
This article presents a generalization of the Condorcet Jury Theorem. All results to date assume a fixed value for the competence of jurors, or alternatively, a fixed probability distribution over the possible competences of jurors. In this article, we develop the idea that we can learn the competence of the jurors from the jury vote. We assume a uniform prior probability assignment over the competence parameter, and we adapt this assignment in the light of the jury vote. We then compute the posterior probability, conditional on the jury vote, of the hypothesis voted over. We thereby retain the central results of Condorcet, but we also show that the posterior probability depends on the size of the jury as well as on the absolute margin of the majority.
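The flavor of the result can be conveyed numerically (an illustrative reconstruction: I assume a uniform prior on competence restricted to [1/2, 1] and a fair prior on the hypothesis; the paper's own model may differ in its details). With a fixed, known competence the posterior depends only on the absolute margin of the majority; once competence is learned from the vote, jury size matters too.

```python
def posterior(k, n, steps=20_000):
    """Posterior probability of the hypothesis given k out of n votes for it,
    with a uniform prior over juror competence c in [1/2, 1] (an assumption
    made here for illustration) and a prior of 1/2 on the hypothesis.
    Likelihoods are integrated numerically by the midpoint rule."""
    lo, hi = 0.5, 1.0
    dc = (hi - lo) / steps
    like_true = like_false = 0.0
    for i in range(steps):
        c = lo + (i + 0.5) * dc
        like_true += c**k * (1 - c)**(n - k) * dc   # hypothesis true
        like_false += c**(n - k) * (1 - c)**k * dc  # hypothesis false
    return like_true / (like_true + like_false)

# Same absolute margin of 6, different jury sizes:
p_small = posterior(7, 8)    # margin 7 - 1 = 6
p_large = posterior(8, 10)   # margin 8 - 2 = 6
print(p_small, p_large)      # roughly 0.98 vs 0.97: same margin, different posteriors
```

Unlike in Condorcet's fixed-competence setting, the two posteriors differ, so the verdict's probability depends on jury size as well as margin.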
An inductive logic is a system of inference that describes the relation between propositions on data, and propositions that extend beyond the data, such as predictions over future data, and general conclusions on all possible data. Statistics, on the other hand, is a mathematical discipline that describes procedures for deriving results about a population from sample data. These results include predictions on future samples, decisions on rejecting or accepting a hypothesis about the population, the determination of probability assignments over such hypotheses, the selection of a statistical model for studying the population, and so on. Both inductive logic and statistics are calculi for getting from the given data to propositions or results that transcend the data.
Until now, antirealists have offered sketches of a theory of truth, at best. In this paper, we present a probabilist account of antirealist truth in some formal detail, and we assess its ability to deal with the problems that are standardly taken to beset antirealism.
This paper studies the use of hypotheses schemes in generating inductive predictions. After discussing Carnap–Hintikka inductive logic, hypotheses schemes are defined and illustrated with two partitions. One partition results in the Carnapian continuum of inductive methods, the other results in predictions typical for hasty generalization. Following these examples I argue that choosing a partition comes down to making inductive assumptions on patterns in the data, and that by choosing appropriately any inductive assumption can be made. Further considerations on partitions make clear that they do not suggest any solution to the problem of induction. Hypotheses schemes provide the tools for making inductive assumptions, but they also reveal the need for such assumptions.
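The two partitions can be mimicked in a toy Bayesian computation (a sketch under simplifying assumptions, not the paper's exact schemes): a partition into deterministic hypotheses ("all observations are 0", "all are 1") yields hasty generalization, while a fine partition of Bernoulli hypotheses with a uniform prior approximates Laplace's rule, a point on the Carnapian continuum.

```python
def predict_zero(observations, hypotheses, priors):
    """Bayesian predictive probability that the next binary observation is 0,
    given a partition of Bernoulli hypotheses (theta = chance of observing 0)."""
    posts = []
    for theta, prior in zip(hypotheses, priors):
        like = 1.0
        for x in observations:
            like *= theta if x == 0 else (1 - theta)
        posts.append(prior * like)
    total = sum(posts)
    return sum(p * theta for p, theta in zip(posts, hypotheses)) / total

obs = [0]  # a single observation of outcome 0

# Partition 1: two deterministic hypotheses -> hasty generalization.
hasty = predict_zero(obs, hypotheses=[1.0, 0.0], priors=[0.5, 0.5])

# Partition 2: a fine grid of Bernoulli hypotheses with a uniform prior
# -> approximately Laplace's rule, (0 + 1) / (1 + 2) = 2/3.
grid = [(i + 0.5) / 1000 for i in range(1000)]
laplace = predict_zero(obs, hypotheses=grid, priors=[1 / 1000] * 1000)

print(hasty)    # 1.0: one observation settles all future predictions
print(laplace)  # ~0.6667
```

The same data and the same updating rule give radically different predictions, which is exactly the sense in which the choice of partition encodes an inductive assumption.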
This paper develops a probabilistic model of belief change under interpretation shifts, in the context of a problem case from dynamic epistemic logic. Van Benthem has shown that a particular kind of belief change, typical for dynamic epistemic logic, cannot be modelled by standard Bayesian conditioning. I argue that the problems described by van Benthem come about because the belief change alters the semantics in which the change is supposed to be modelled: the new information induces a shift in the interpretation of the sentences. In this paper I show that interpretation shifts can be modelled in terms of updating by conditioning. The model derives from the knowledge structures developed by Fagin et al., and hinges on a distinction between the propositional and informational content of sentences. Finally, I show that Dempster–Shafer theory provides the appropriate probability kinematics for the model.
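Dempster's rule of combination, the kinematics invoked here, can be sketched generically (a textbook implementation, not tied to the paper's specific model): masses on intersecting focal sets are multiplied, conflicting mass is discarded, and the result is renormalised.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions.
    Masses are dicts mapping frozensets (focal elements) to weights summing to 1."""
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        c = a & b
        if c:
            combined[c] = combined.get(c, 0.0) + wa * wb
        else:
            conflict += wa * wb  # mass landing on the empty set is discarded
    return {c: w / (1.0 - conflict) for c, w in combined.items()}

# Zadeh's classic example: two nearly contradictory sources end up
# agreeing completely on the option both considered unlikely.
m1 = {frozenset({"a"}): 0.99, frozenset({"b"}): 0.01}
m2 = {frozenset({"c"}): 0.99, frozenset({"b"}): 0.01}
print(dempster_combine(m1, m2))  # all mass (~1.0) on frozenset({'b'})
```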
We consider the use of interventions for resolving a problem of unidentified statistical models. The leading examples are from latent variable modelling, an influential statistical tool in the social sciences. We first explain the problem of statistical identifiability and contrast it with the identifiability of causal models. We then draw a parallel between the latent variable models and Bayesian networks with hidden nodes. This allows us to clarify the use of interventions for dealing with unidentified statistical models. We end by discussing the philosophical and methodological import of our result.
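The identifiability problem can be illustrated with the simplest latent variable model (a standard illustration, not the paper's own example): in a one-factor model, the loadings and the factor variance enter the observed covariances only through their products, so distinct parameter settings are observationally equivalent.

```python
def implied_covariance(loadings, factor_var, error_vars):
    """Observed covariance matrix implied by a one-factor model
    x_i = loading_i * f + e_i, with Var(f) = factor_var and Var(e_i) = error_vars[i]."""
    k = len(loadings)
    return [
        [
            loadings[i] * loadings[j] * factor_var + (error_vars[i] if i == j else 0.0)
            for j in range(k)
        ]
        for i in range(k)
    ]

# Two distinct parameterisations, same observed covariances: doubling the
# loadings while shrinking the factor variance by a factor of four.
sigma1 = implied_covariance([1.0, 0.8, 0.6], factor_var=4.0, error_vars=[1.0, 1.0, 1.0])
sigma2 = implied_covariance([2.0, 1.6, 1.2], factor_var=1.0, error_vars=[1.0, 1.0, 1.0])
print(sigma1 == sigma2)  # True: the data cannot distinguish the two settings
```

No amount of observational data on the x_i can separate these two parameter settings, which is where interventions come in.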
This paper proposes a common framework for various probabilistic logics. It consists of a set of uncertain premises with probabilities attached to them. This raises the question of the strength of a conclusion, but without imposing a particular semantics, no general solution is possible. The paper discusses several possible semantics by looking at the question from the perspective of probabilistic argumentation.
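The question of the strength of a conclusion can be made concrete in the simplest case (a brute-force sketch of my own; the paper surveys far more general semantics): given point probabilities for the premises A and A → B, the probability of B is only constrained to an interval, found here by searching over probability distributions on the four truth-value assignments.

```python
from itertools import product

def entailed_interval(prob_a, prob_a_implies_b, step=0.01, tol=1e-9):
    """Brute-force the interval of P(B) over all distributions on the four
    worlds (A, B) in {T, F}^2 satisfying P(A) = prob_a and
    P(A -> B) = prob_a_implies_b (material implication)."""
    lo, hi = 1.0, 0.0
    n = round(1 / step)
    for i, j, k in product(range(n + 1), repeat=3):
        p_tt, p_tf, p_ft = i * step, j * step, k * step
        p_ff = 1.0 - p_tt - p_tf - p_ft
        if p_ff < -tol:
            continue
        if abs(p_tt + p_tf - prob_a) > tol:                   # P(A)
            continue
        if abs(p_tt + p_ft + p_ff - prob_a_implies_b) > tol:  # P(not-A or B)
            continue
        p_b = p_tt + p_ft
        lo, hi = min(lo, p_b), max(hi, p_b)
    return lo, hi

lo, hi = entailed_interval(0.9, 0.8)
print(round(lo, 6), round(hi, 6))  # 0.7 0.8: P(B) lies in [0.7, 0.8]
```

Even with point-valued premises, the conclusion only gets an interval; which number inside it counts as the conclusion's strength is exactly what a semantics must settle.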
Jones & Love (J&L) suggest that Bayesian approaches to the explanation of human behavior should be constrained by mechanistic theories. We argue that their proposal misconstrues the relation between process models, such as the Bayesian model, and mechanisms. While mechanistic theories can answer specific issues that arise from the study of processes, one cannot expect them to provide constraints in general.
This article investigates a problem for statistical model evaluation, in particular for curve fitting: by employing a different family of curves we can fit any scatter plot almost perfectly at apparently minor cost in terms of model complexity. The problem is resolved by an appeal to prior probabilities. This leads to some general lessons about how to approach model evaluation.
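A toy version of the problem and its resolution (my own illustration, with a hypothetical per-parameter penalty standing in for a genuine prior over models): an interpolating cubic fits four noisy points from a straight line perfectly, but once models are scored with a complexity-penalised score, the line wins.

```python
def line_fit(xs, ys):
    """Least-squares straight line; returns its predictions at xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum(
        (x - mx) ** 2 for x in xs
    )
    intercept = my - slope * mx
    return [slope * x + intercept for x in xs]

def lagrange_fit(xs, ys):
    """Interpolating polynomial (degree n-1) through all points, evaluated at xs."""
    def value(x):
        total = 0.0
        for i, (xi, yi) in enumerate(zip(xs, ys)):
            basis = 1.0
            for j, xj in enumerate(xs):
                if j != i:
                    basis *= (x - xj) / (xi - xj)
            total += yi * basis
        return total
    return [value(x) for x in xs]

def rss(ys, preds):
    return sum((y - p) ** 2 for y, p in zip(ys, preds))

xs, ys = [0.0, 1.0, 2.0, 3.0], [0.1, 1.0, 2.1, 2.9]  # noisy samples from y = x

rss_line = rss(ys, line_fit(xs, ys))       # small but nonzero
rss_cubic = rss(ys, lagrange_fit(xs, ys))  # essentially zero: a perfect fit

# Score each model by fit minus a penalty per parameter -- a crude stand-in
# for a prior that assigns lower probability to more complex model families.
penalty = 0.1
score_line = -rss_line - penalty * 2    # slope + intercept
score_cubic = -rss_cubic - penalty * 4  # four polynomial coefficients
print(score_line > score_cubic)  # True: the prior-style penalty breaks the tie
```

On goodness of fit alone the interpolating family always wins; only the prior-style penalty restores the verdict for the simpler model.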
Over the past few decades the probabilistic model of rational belief has enjoyed increasing interest from researchers in epistemology and the philosophy of science. Of course, such probabilistic models were used for much longer in economics, in game theory, and in other disciplines concerned with decision making. Moreover, Carnap and co-workers used probability theory to explicate philosophical notions of confirmation and induction, thereby targeting epistemic rather than decision-theoretic aspects of rationality. However, following Carnap’s early applications, philosophy has more recently seen an increased popularity of probabilistic models in other areas concerned with the philosophical analysis of belief: there are models targeting coherence, informativeness, simplicity, and so on. In brief, the probabilistic model of belief comprises a language, detailing the propositions about which an agent is supposed to have beliefs, and a function over the language that expresses beliefs.
This paper presents the progicnet programme. It proposes a general framework for probabilistic logic that can guide inference based on both logical and probabilistic input. After an introduction to the framework as such, it is illustrated by means of a toy example from psychometrics. It is shown that the framework can accommodate a number of approaches to probabilistic reasoning: Bayesian statistical inference, evidential probability, probabilistic argumentation, and objective Bayesianism. The framework thus provides insight into the relations between these approaches, it illustrates how the results of different approaches can be combined, and it provides a basis for doing efficient inference in each of the approaches.
This article argues that time-asymmetric processes in spacetime are enantiomorphs. Subsequently, the Kantian puzzle concerning enantiomorphs in space is reviewed to introduce a number of positions concerning enantiomorphy, and to arrive at a dilemma: one must either reject that orientations of enantiomorphs are determinate, or furnish space or objects with orientation. The discussion on space is then used to derive two problems in the debate on the direction of time. First, it is shown that certain kinds of reductionism about the direction of time are at variance with the claim that orientation of enantiomorphic objects is intrinsic. Second, it is argued that reductive explanations of time-asymmetric processes presuppose that enantiomorphic processes do not have determinate orientation.
This article comments on the article of Thorn and Schurz in this volume and focuses on what we call the problem of parasitic experts. We argue that both meta-induction and crowd wisdom can be understood as pertaining to absolute reliability rather than comparative optimality, and we suggest that the involvement of reliability will provide a handle on this problem.