In this paper, we address the problem of truth approximation through theory change, asking whether revising our theories by newly acquired data leads us closer to the truth about a given domain. More particularly, we focus on “nomic conjunctive theories”, i.e., theories expressed as conjunctions of logically independent statements concerning the physical or, more generally, nomic possibilities and impossibilities of the domain under inquiry. We define both a comparative and a quantitative notion of the verisimilitude of such theories, and identify suitable conditions concerning the (partial) correctness of acquired data, under which revising our theories by data leads us closer to “the nomic truth”, construed as the target of scientific inquiry. We conclude by indicating some further developments, generalizations, and open issues arising from our results.
The Linda paradox is a key topic in current debates on the rationality of human reasoning and its limitations. We present a novel analysis of this paradox, based on the notion of verisimilitude as studied in the philosophy of science. The comparison with an alternative analysis based on probabilistic confirmation suggests how to overcome some problems of our account by introducing an adequately defined notion of verisimilitudinarian confirmation.
This is the introductory essay to the Italian translation of Matt Ridley's "The origins of virtue", surveying the game-theoretic and evolutionary approaches to the emergence and evolution of cooperation and altruism.
Confirmation of a hypothesis by evidence can be measured by one of the incremental measures of confirmation known so far. As we show, incremental measures can be formally defined as the measures of confirmation satisfying a certain small set of basic conditions. Moreover, several kinds of incremental measure may be characterized on the basis of appropriate structural properties. In particular, we focus on the so-called Matthew properties: we introduce a family of six Matthew properties including the reverse Matthew effect; we further prove that incremental measures endowed with the reverse Matthew effect are possible; finally, we briefly consider the problem of the plausibility of Matthew properties.
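The best-known incremental measures all take confirmation to be an increase of probability from P(H) to P(H | E). The following sketch gives three standard examples from the confirmation literature (offered as illustration, not as the paper's own list of measures):

```python
# Three standard incremental measures of confirmation. Each is positive
# exactly when E raises the probability of H, i.e., P(H | E) > P(H).

import math

def difference(p_h, p_h_given_e):
    # Difference measure: d(H, E) = P(H | E) - P(H).
    return p_h_given_e - p_h

def log_ratio(p_h, p_h_given_e):
    # Log-ratio measure: r(H, E) = log [ P(H | E) / P(H) ].
    return math.log(p_h_given_e / p_h)

def log_likelihood_ratio(p_h, p_h_given_e):
    # Log-likelihood-ratio measure, log [ P(E | H) / P(E | not-H) ],
    # rewritten via Bayes' theorem as log (posterior odds / prior odds).
    odds_posterior = p_h_given_e / (1 - p_h_given_e)
    odds_prior = p_h / (1 - p_h)
    return math.log(odds_posterior / odds_prior)

# Illustrative numbers: evidence raises P(H) from 0.3 to 0.6.
print(difference(0.3, 0.6) > 0,
      log_ratio(0.3, 0.6) > 0,
      log_likelihood_ratio(0.3, 0.6) > 0)  # True True True
```

All three agree on whether E confirms H; the structural properties studied in the paper (such as the Matthew properties) are what distinguish them.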
Theory change is a central concern in contemporary epistemology and philosophy of science. In this paper, we investigate the relationships between two ongoing research programs providing formal treatments of theory change: the (post-Popperian) approach to verisimilitude and the AGM theory of belief change. We show that appropriately construed accounts emerging from those two lines of epistemological research do yield convergences relative to a specified kind of theories, here labeled “conjunctive”. In this domain, a set of plausible conditions is identified which demonstrably captures the verisimilitudinarian effectiveness of AGM belief change, i.e., its effectiveness in tracking truth approximation. We conclude by indicating some further developments and open issues arising from our results.
We provide a 'verisimilitudinarian' analysis of the well-known Linda paradox or conjunction fallacy, i.e., the fact that most people judge the conjunctive statement "Linda is a bank teller and is active in the feminist movement" (B & F) as more probable than the isolated statement "Linda is a bank teller" (B), contrary to an uncontroversial principle of probability theory. The basic idea is that experimental participants may judge B & F a better hypothesis about Linda as compared to B because they evaluate B & F as more verisimilar than B. In fact, the hypothesis "feminist bank teller", while less likely to be true than "bank teller", may well be a better approximation to the truth about Linda.
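The probability principle at stake can be made explicit with a toy model. In the sketch below, the joint probabilities are purely illustrative numbers (not data from the original experiments); the point is that under any joint distribution whatsoever, the conjunction is at most as probable as either conjunct:

```python
# Toy illustration of the conjunction rule: P(B & F) <= P(B).
# The probabilities below are arbitrary illustrative values.

# A joint distribution over two binary propositions:
# B = "Linda is a bank teller", F = "Linda is active in the feminist movement".
joint = {
    (True, True): 0.05,   # B & F
    (True, False): 0.02,  # B & not-F
    (False, True): 0.60,  # not-B & F
    (False, False): 0.33, # not-B & not-F
}

# P(B) marginalizes over F, so it includes P(B & F) as one of its summands.
p_B = sum(p for (b, f), p in joint.items() if b)
p_B_and_F = joint[(True, True)]

assert p_B_and_F <= p_B  # holds for every joint distribution
print(round(p_B, 2), round(p_B_and_F, 2))  # 0.07 0.05
```

The verisimilitudinarian point is that this inequality governs probability, not closeness to the truth: B & F can be less probable than B and still more verisimilar.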
Bayesian epistemology postulates a probabilistic analysis of many sorts of ordinary and scientific reasoning. Huber has provided a novel criticism of Bayesianism, whose core argument involves a challenging issue: confirmation by uncertain evidence. In this paper, we argue that under a properly defined Bayesian account of confirmation by uncertain evidence, Huber's criticism fails. By contrast, our discussion will highlight what we take as some new and appealing features of Bayesian confirmation theory.
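The standard Bayesian treatment of uncertain evidence is Jeffrey conditionalization, on which the posterior of a hypothesis H is a mixture weighted by the new, less-than-certain probability of the evidence E. A minimal sketch (the numbers are illustrative assumptions, not figures from Huber's discussion or the paper):

```python
# Jeffrey conditionalization: updating on uncertain evidence E.
# P_new(H) = P(H | E) * q + P(H | not-E) * (1 - q),
# where q is the new probability assigned to E after the uncertain learning.

def jeffrey_update(p_h_given_e, p_h_given_not_e, q):
    """Posterior of H after the probability of E shifts to q."""
    return p_h_given_e * q + p_h_given_not_e * (1 - q)

# Illustrative numbers: H is well supported by E, and E now has probability 0.8.
posterior = jeffrey_update(p_h_given_e=0.9, p_h_given_not_e=0.2, q=0.8)
print(round(posterior, 2))  # 0.76
```

When q = 1 the rule reduces to strict conditionalization, P_new(H) = P(H | E), so certain evidence is recovered as a limiting case.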
In this paper it is argued that qualitative theories (Q-theories) can be used to describe the statistical structure of cross classified populations and that the notion of verisimilitude provides an appropriate tool for measuring the statistical adequacy of Q-theories. First of all, a short outline of the post-Popperian approaches to verisimilitude and of the related verisimilitudinarian non-falsificationist methodologies (VNF-methodologies) is given. Secondly, the notion of Q-theory is explicated, and the qualitative verisimilitude of Q-theories is defined. Afterwards, appropriate measures for the statistical verisimilitude of Q-theories are introduced, so as to obtain a clear formulation of the intuitive idea that the statistical truth about cross classified populations can be approached by falsified Q-theories. Finally, it is argued that some basic intuitions underlying VNF-methodologies are shared by the so-called prediction logic, developed by the statisticians and social scientists David K. Hildebrand, James D. Laing and Howard Rosenthal.
In this paper I consider a number of metaphilosophical problems concerning the relations between logic and philosophy of science, as they appear from the neo-classical perspective on philosophy of science outlined by Theo Kuipers in ICR and SiS. More specifically, I focus on two pairs of issues: (A) the (dis)similarities between the goals and methods of logic and those of philosophy of science, w.r.t. (1) the role of theorems within the two disciplines; (2) the falsifiability of their theoretical claims; and (B) the interactions between logic and philosophy of science, w.r.t. (3) the possibility of applying logic in philosophy of science; (4) the possibility that the two disciplines are sources of challenging problems for each other.
Theo A. F. Kuipers, "The Threefold Evaluation of Theories": a synopsis of From Instrumentalism to Constructive Realism: On Some Relations between Confirmation, Empirical Progress, and Truth Approximation (2000).
Finland is internationally known as one of the leading centers of twentieth century analytic philosophy. This volume offers for the first time an overall survey of the Finnish analytic school. The rise of this trend is illustrated by original articles of Edward Westermarck, Eino Kaila, Georg Henrik von Wright, and Jaakko Hintikka. Contributions of Finnish philosophers are then systematically discussed in the fields of logic, philosophy of language, philosophy of science, history of philosophy, ethics and social philosophy. Metaphilosophical reflections on the nature of philosophy are highlighted by the Finnish dialogue between analytic philosophy, phenomenology, pragmatism, and critical theory.
An important problem in inductive probability theory is the design of exchangeable analogical methods, i.e., of exchangeable inductive methods that take into account certain considerations of analogy by similarity for predictive inferences. Here a precise reformulation of the problem of predictive analogy is given and a new family of exchangeable analogical methods is introduced. Firstly, it is proved that the exchangeable analogical method introduced by Skyrms (1993) does not satisfy the best-known general principles of predictive analogy. Secondly, Skyrms's approach (the use of particular hyper-Carnapian methods, i.e., mixtures of Carnapian inductive methods) is adopted in the design of a new family of exchangeable analogical methods. Lastly, it is proved that such methods satisfy an interesting general principle of predictive analogy.
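For readers unfamiliar with the machinery, Carnapian inductive methods and their mixtures can be sketched as follows. The λ values, weights, and observation counts below are illustrative assumptions, and the fixed-weight mixture is a simplification: in hyper-Carnapian methods proper, the mixture weights are updated by conditionalizing on the observed data. This is not Skyrms's specific construction.

```python
# Carnap's lambda-continuum of inductive methods: the predictive probability
# that the next observation falls in category j, given n_j observations of j
# out of n total over k categories, is (n_j + lam/k) / (n + lam).

def carnap_predict(n_j, n, k, lam):
    """Carnapian predictive probability for category j."""
    return (n_j + lam / k) / (n + lam)

def mixture_predict(n_j, n, k, weighted_lams):
    """A mixture of Carnapian methods: a weighted average over lambdas.
    (Fixed weights for simplicity; hyper-Carnapian methods update them.)"""
    return sum(w * carnap_predict(n_j, n, k, lam) for lam, w in weighted_lams)

# Illustrative example: 4 categories, 6 of 10 observations in category j.
p_single = carnap_predict(n_j=6, n=10, k=4, lam=2.0)
p_mixed = mixture_predict(n_j=6, n=10, k=4,
                          weighted_lams=[(1.0, 0.5), (4.0, 0.5)])
print(round(p_single, 3), round(p_mixed, 3))  # 0.542 0.534
```

Small λ makes the method track observed frequencies closely; large λ keeps it near the uniform prior 1/k. Mixing over λ is what opens the door to the analogy effects studied in the paper.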
According to the Bayesian view, scientific hypotheses must be appraised in terms of their posterior probabilities relative to the available experimental data. Such posterior probabilities are derived from the prior probabilities of the hypotheses by applying Bayes' theorem. One of the most important problems arising within the Bayesian approach to scientific methodology is the choice of prior probabilities. Here this problem is considered in detail w.r.t. two applications of the Bayesian approach: (1) the theory of inductive probabilities (TIP) developed by Rudolf Carnap and other epistemologists and (2) the analysis of the multinomial inferences provided by Bayesian statistics (BS).
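The update rule referred to is, in its simplest form for a set of competing hypotheses:

```latex
% Bayes' theorem: the posterior probability of hypothesis H_i given data E,
% derived from the prior P(H_i) and the likelihoods P(E | H_j).
P(H_i \mid E) = \frac{P(E \mid H_i)\, P(H_i)}{\sum_j P(E \mid H_j)\, P(H_j)}
```

The choice-of-priors problem discussed in the paper concerns the terms P(H_i): the theorem fixes how priors are turned into posteriors, but not where the priors come from.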
The problem of distance from the truth, and more generally distance between hypotheses, is considered here with respect to the case of quantitative hypotheses concerning the value of a given scientific quantity. Our main goal consists in the explication of the concept of distance D(I, x) between an interval hypothesis I and a point hypothesis x. In particular, we attempt to give an axiomatic foundation of this notion on the basis of a small number of adequacy conditions.
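One natural candidate explication, offered here only as an illustrative sketch (the paper derives its measure from its own adequacy conditions), takes the distance to be the minimal distance from the point value to the interval:

```latex
% A simple distance between an interval hypothesis I = [a, b] and a
% point hypothesis x: zero if x lies in I, otherwise the distance to
% the nearest endpoint. (Illustrative only, not the axiomatized measure.)
D(I, x) = \min_{y \in [a, b]} \lvert y - x \rvert =
\begin{cases}
0     & \text{if } a \le x \le b, \\
a - x & \text{if } x < a, \\
x - b & \text{if } x > b.
\end{cases}
```

Adequacy conditions of the kind the paper appeals to would then settle, for instance, whether D should also penalize the width of I, which this simple minimum distance ignores.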