Theory change is a central concern in contemporary epistemology and philosophy of science. In this paper, we investigate the relationships between two ongoing research programs providing formal treatments of theory change: the (post-Popperian) approach to verisimilitude and the AGM theory of belief change. We show that appropriately construed accounts emerging from these two lines of epistemological research do yield convergences relative to a specific kind of theories, here labeled “conjunctive”. In this domain, we identify a set of plausible conditions which demonstrably capture the verisimilitudinarian effectiveness of AGM belief change, i.e., its effectiveness in tracking truth approximation. We conclude by indicating some further developments and open issues arising from our results.
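As a concrete illustration of the setting just described, here is a minimal sketch, assuming a toy propositional language in which a “conjunctive” theory is a consistent set of literals; the contrast measure of verisimilitude and the literal-by-literal revision rule below are illustrative stand-ins, not the paper's own definitions.

```python
# A toy illustration (not the paper's formal apparatus) of how revising a
# conjunctive theory by true inputs can increase its verisimilitude.
# Assumptions: a conjunctive theory is a consistent set of literals
# (atoms "p1", "p2", ... or their negations "-p1", ...); the truth is the
# complete set of true literals of the language.

def verisimilitude(theory, truth):
    """Contrast measure: (true claims - false claims) / size of the truth."""
    return (len(theory & truth) - len(theory - truth)) / len(truth)

def revise(theory, literal):
    """Literal-by-literal revision: retract the input's negation, then add the input."""
    atom = literal.lstrip("-")
    negation = atom if literal.startswith("-") else "-" + atom
    return (theory - {negation}) | {literal}

truth = {"p1", "p2", "-p3"}           # the (unknown) complete truth
theory = {"-p1", "p2"}                # a partly mistaken conjunctive theory

print(verisimilitude(theory, truth))  # -> 0.0 (one hit, one miss)
theory = revise(theory, "p1")         # revising by a true input...
print(verisimilitude(theory, truth))  # -> 0.666... (two hits, no misses)
```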
Popper’s original definition of truthlikeness relied on a central insight: that truthlikeness combines truth and information, in the sense that a proposition is closer to the truth the more true consequences and the fewer false consequences it entails. As intuitively compelling as this definition may be, it is untenable, as was proved long ago; still, one can arguably rely on Popper’s intuition to provide an adequate account of truthlikeness. To this aim, we mobilize some classical work on partial entailment in defining a new measure of truthlikeness which satisfies a number of desiderata. The resulting account has some interesting and surprising connections with other accounts on the market, thus shedding new light on current attempts at systematizing different approaches to verisimilitude.
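For reference, Popper's comparative definition and the reason for its failure can be stated compactly (the notation below is ours):

```latex
% Let $A_T$ and $A_F$ be the sets of true and false consequences of theory $A$.
% Popper's proposal: $A$ is at least as close to the truth as $B$ iff
\[
  B_T \subseteq A_T \quad \text{and} \quad A_F \subseteq B_F ,
\]
% with strict closeness when at least one inclusion is proper. The classic
% Tichy-Miller result shows that, on this definition, no false theory can be
% strictly closer to the truth than any other false theory, which is the
% untenability mentioned above.
```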
We provide a 'verisimilitudinarian' analysis of the well-known Linda paradox or conjunction fallacy, i.e., the fact that most people judge the conjunctive statement "Linda is a bank teller and is active in the feminist movement" (B & F) as more probable than the isolated statement "Linda is a bank teller" (B), contrary to an uncontroversial principle of probability theory. The basic idea is that experimental participants may judge B & F a better hypothesis about Linda than B because they evaluate B & F as more verisimilar than B. In fact, the hypothesis "feminist bank teller", while less likely to be true than "bank teller", may well be a better approximation to the truth about Linda.
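The "uncontroversial principle" in question is the conjunction rule; one line of probability algebra makes it explicit:

```latex
% Since B & F entails B, for any probability function P:
\[
  P(B \wedge F) \;=\; P(B) \, P(F \mid B) \;\le\; P(B),
\]
% so ranking B & F above B is incoherent whatever values P assigns,
% even though B & F may still be the more verisimilar hypothesis.
```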
Confirmation of a hypothesis by evidence can be measured by any of the incremental measures of confirmation proposed so far. As we show, incremental measures can be formally defined as the measures of confirmation satisfying a certain small set of basic conditions. Moreover, several kinds of incremental measure may be characterized on the basis of appropriate structural properties. In particular, we focus on the so-called Matthew properties: we introduce a family of six Matthew properties including the reverse Matthew effect; we further prove that incremental measures endowed with the reverse Matthew effect are possible; finally, we briefly consider the problem of the plausibility of Matthew properties.
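To fix ideas, here is a sketch of three standard incremental measures (textbook examples, not the paper's full classification); all are positive exactly when the evidence raises the probability of the hypothesis, which is the hallmark of incrementality:

```python
# Three standard incremental measures of confirmation. Each is positive iff
# E raises the probability of H, zero iff E is probabilistically irrelevant
# to H, and negative iff E lowers it.
from math import log

def difference(p_h, p_h_given_e):
    """The difference measure: d(H, E) = P(H|E) - P(H)."""
    return p_h_given_e - p_h

def log_ratio(p_h, p_h_given_e):
    """The (log) ratio measure: r(H, E) = log[P(H|E) / P(H)]."""
    return log(p_h_given_e / p_h)

def log_likelihood(p_e_given_h, p_e_given_not_h):
    """The log-likelihood measure: l(H, E) = log[P(E|H) / P(E|not-H)]."""
    return log(p_e_given_h / p_e_given_not_h)

# The measures agree in sign but may rank pairs (H, E) differently; such
# ranking disagreements are what structural properties like the Matthew
# effects are about.
print(difference(0.3, 0.6), log_ratio(0.3, 0.6))  # -> 0.3, ~0.693
```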
According to the Bayesian view, scientific hypotheses must be appraised in terms of their posterior probabilities relative to the available experimental data. Such posterior probabilities are derived from the prior probabilities of the hypotheses by applying Bayes' theorem. One of the most important problems arising within the Bayesian approach to scientific methodology is the choice of prior probabilities. Here this problem is considered in detail w.r.t. two applications of the Bayesian approach: (1) the theory of inductive probabilities (TIP) developed by Rudolf Carnap and other epistemologists and (2) the analysis of the multinomial inferences provided by Bayesian statistics (BS). ...
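For completeness, the derivation mentioned above is just an application of Bayes' theorem:

```latex
% For hypothesis H and evidence E with P(E) > 0:
\[
  P(H \mid E) \;=\; \frac{P(E \mid H) \, P(H)}{P(E)} ,
\]
% so the posterior P(H|E) is fixed by the likelihood P(E|H) and the prior
% P(H); this is why the choice of priors is the critical problem for TIP
% and BS alike.
```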
In this paper, we address the problem of truth approximation through theory change, asking whether revising our theories by newly acquired data leads us closer to the truth about a given domain. More particularly, we focus on “nomic conjunctive theories”, i.e., theories expressed as conjunctions of logically independent statements concerning the physical or, more generally, nomic possibilities and impossibilities of the domain under inquiry. We define both a comparative and a quantitative notion of the verisimilitude of such theories, and identify suitable conditions concerning the (partial) correctness of acquired data, under which revising our theories by data leads us closer to “the nomic truth”, construed as the target of scientific inquiry. We conclude by indicating some further developments, generalizations, and open issues arising from our results.
We explore the grammar of Bayesian confirmation by focusing on some likelihood principles, including the Weak Law of Likelihood. We show that none of the likelihood principles proposed so far is satisfied by all incremental measures of confirmation, and we argue that some of these measures indeed obey new, prima facie strange, antilikelihood principles. To prove this, we introduce a new measure that violates the Weak Law of Likelihood while satisfying a strong antilikelihood condition. We conclude by hinting at some relevant links between the likelihood principles considered here and other properties of Bayesian confirmation recently explored in the literature.
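For readers who want the principle spelled out: in one usual formulation (our statement, for a generic confirmation measure c), the Weak Law of Likelihood says:

```latex
% For hypotheses H1, H2 and evidence E:
\[
  P(E \mid H_1) > P(E \mid H_2)
  \ \text{ and } \
  P(E \mid \lnot H_1) \le P(E \mid \lnot H_2)
  \;\Longrightarrow\;
  c(H_1, E) > c(H_2, E).
\]
% An antilikelihood principle, in the sense above, licenses the reverse
% ranking under suitable likelihood conditions.
```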
An important problem in inductive probability theory is the design of exchangeable analogical methods, i.e., of exchangeable inductive methods that take into account certain considerations of analogy by similarity for predictive inferences. Here a precise reformulation of the problem of predictive analogy is given and a new family of exchangeable analogical methods is introduced. Firstly, it is proved that the exchangeable analogical method introduced by Skyrms (1993) does not satisfy the best known general principles of predictive analogy. Secondly, Skyrms's approach, consisting in the use of particular hyper-Carnapian methods, i.e., mixtures of Carnapian inductive methods, is adopted in the design of a new family of exchangeable analogical methods. Lastly, it is proved that such methods satisfy an interesting general principle of predictive analogy.
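As background, the Carnapian inductive methods mentioned above form the well-known lambda-continuum; a minimal sketch follows (standard formulation, with an illustrative parameter choice):

```python
# Carnap's lambda-continuum of inductive methods: with k possible types of
# observation, n observations so far, n_j of them of type j, the predictive
# probability of type j on the next trial is (n_j + lam/k) / (n + lam).
def carnap_predictive(n_j, n, k, lam):
    return (n_j + lam / k) / (n + lam)

# lam -> 0 approaches the "straight rule" (pure observed frequencies), while
# large lam stays close to the uniform prior 1/k. Hyper-Carnapian methods in
# Skyrms's sense are mixtures of such methods with different lam values.
print(carnap_predictive(n_j=3, n=10, k=2, lam=2.0))  # -> (3 + 1) / 12 = 0.333...
```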
In this paper it is argued that qualitative theories (Q-theories) can be used to describe the statistical structure of cross-classified populations and that the notion of verisimilitude provides an appropriate tool for measuring the statistical adequacy of Q-theories. First of all, a short outline of the post-Popperian approaches to verisimilitude and of the related verisimilitudinarian non-falsificationist methodologies (VNF-methodologies) is given. Secondly, the notion of Q-theory is explicated, and the qualitative verisimilitude of Q-theories is defined. Afterwards, appropriate measures for the statistical verisimilitude of Q-theories are introduced, so as to obtain a clear formulation of the intuitive idea that the statistical truth about cross-classified populations can be approached by falsified Q-theories. Finally, it is argued that some basic intuitions underlying VNF-methodologies are shared by the so-called prediction logic developed by the statisticians and social scientists David K. Hildebrand, James D. Laing and Howard Rosenthal.
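The central quantity of prediction logic, stated here in simplified form from the standard presentation (a paraphrase to be checked against the original), compares observed and chance-expected prediction errors:

```python
# Hildebrand, Laing and Rosenthal's "del" measure, in simplified form: the
# observed proportion of cases falling in a prediction's error cells is
# compared with the proportion expected under statistical independence.
def del_measure(observed_error_rate, expected_error_rate):
    """del = 1 - observed/expected; 1 = errorless prediction, 0 = chance level."""
    return 1 - observed_error_rate / expected_error_rate

print(del_measure(0.05, 0.25))  # -> 0.8: far fewer errors than chance predicts
```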
Starting with Popper, philosophers and logicians have proposed different accounts of verisimilitude or truthlikeness. One way of classifying such accounts is to distinguish between “conjunctive” and “disjunctive” ones. In this paper, we focus on our own “basic feature” approach to verisimilitude, which naturally belongs to the conjunctive family. We start by surveying the landscape of conjunctive accounts; then, we introduce two new measures of verisimilitude and discuss their properties; finally, we conclude by hinting at some surprising relations between our conjunctive approach and a disjunctive account of verisimilitude widely discussed in the literature.
Theo A. F. Kuipers, “The Threefold Evaluation of Theories”: a synopsis of From Instrumentalism to Constructive Realism. On Some Relations between Confirmation, Empirical Progress, and Truth Approximation (2000).
The problem of distance from the truth, and more generally distance between hypotheses, is considered here with respect to the case of quantitative hypotheses concerning the value of a given scientific quantity. Our main goal consists in the explication of the concept of distance D(I, p) between an interval hypothesis I and a point hypothesis p. In particular, we attempt to give an axiomatic foundation of this notion on the basis of a small number of adequacy conditions.
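By way of illustration only (the paper proceeds axiomatically, and this particular formula is our own candidate, not necessarily the one the adequacy conditions single out): one natural definition sets the distance to zero when the point falls inside the interval, and to the distance from the nearest endpoint otherwise.

```python
# A candidate distance between an interval hypothesis I = [a, b] and a point
# hypothesis p (an illustration, not the paper's axiomatically derived notion).
def distance(a, b, p):
    if a <= p <= b:
        return 0.0                      # p is inside I: perfect agreement
    return min(abs(p - a), abs(p - b))  # otherwise: distance to nearest endpoint

print(distance(2.0, 5.0, 3.5))  # -> 0.0
print(distance(2.0, 5.0, 7.0))  # -> 2.0
```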
Mathematical game theory – developed starting from the publication of The Theory of Games and Economic Behavior, by John von Neumann and Oskar Morgenstern – aims to outline an ideal model of the behaviour of rational agents involved in some interaction with other rational agents. For this reason, game theory immediately attracted the attention of philosophers dealing with practical rationality and, since the fifties, has been applied to the analysis of several issues concerning ethics and the philosophy of politics. Here we will focus on one of the most interesting applications of game theory to ethical-political inquiry, i.e., the game-theoretic analysis of some problems related to the evolution of moral norms. Firstly, we will provide a short outline of the development of game theory, which has led to the formulation of a plurality of different game theories. It will be shown that such theories can be classified into two main groups: rationalistic game theories – in two different versions, classical and epistemic – and evolutionary game theories. Moreover, some basic elements of classical game theory will be introduced and the key ideas of epistemic and evolutionary game theories will be illustrated. Afterwards, the main approaches developed within the ethical-political applications of game theory will be briefly described. Finally, some results obtained in the last twenty years by the researchers who have analysed the evolution of moral norms with the conceptual tools of evolutionary and epistemic game theories will be examined.
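A minimal sketch of the evolutionary machinery referred to above, assuming a generic two-strategy cooperation game (the payoff matrix is our illustrative choice, not taken from the text); replicator dynamics lets strategies that earn above-average payoffs grow in frequency:

```python
# One Euler step of the replicator dynamics dx/dt = x * (f_C - f_bar) for a
# two-strategy game, where x is the population share of cooperators.
def replicator_step(x, payoff, dt=0.01):
    f_c = x * payoff[0][0] + (1 - x) * payoff[0][1]  # cooperator's expected payoff
    f_d = x * payoff[1][0] + (1 - x) * payoff[1][1]  # defector's expected payoff
    f_bar = x * f_c + (1 - x) * f_d                  # population average payoff
    return x + dt * x * (f_c - f_bar)

payoff = [[3, 0],   # C meets C, C meets D (a stag-hunt-like game)
          [2, 1]]   # D meets C, D meets D
x = 0.6             # initial share of cooperators, above the 0.5 threshold
for _ in range(2000):
    x = replicator_step(x, payoff)
print(round(x, 3))  # -> approaches 1.0: cooperation spreads through the population
```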
This is the introductory essay to the Italian translation of Matt Ridley's The Origins of Virtue, surveying the game-theoretic and evolutionary approaches to the emergence and evolution of cooperation and altruism.
The Linda paradox is a key topic in current debates on the rationality of human reasoning and its limitations. We present a novel analysis of this paradox, based on the notion of verisimilitude as studied in the philosophy of science. The comparison with an alternative analysis based on probabilistic confirmation suggests how to overcome some problems of our account by introducing an adequately defined notion of verisimilitudinarian confirmation.
Bayesian epistemology postulates a probabilistic analysis of many sorts of ordinary and scientific reasoning. Huber has provided a novel criticism of Bayesianism, whose core argument involves a challenging issue: confirmation by uncertain evidence. In this paper, we argue that under a properly defined Bayesian account of confirmation by uncertain evidence, Huber's criticism fails. By contrast, our discussion will highlight what we take to be some new and appealing features of Bayesian confirmation theory.
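The standard rule for updating on uncertain evidence, which any such Bayesian account must start from, is Jeffrey conditionalization (stated here in its textbook two-cell form):

```python
# Jeffrey conditionalization: if learning shifts the probability of E to q
# (0 < q < 1) while leaving the conditional probabilities of H fixed, then:
def jeffrey_update(p_h_given_e, p_h_given_not_e, q):
    """P_new(H) = P(H|E) * q + P(H|not-E) * (1 - q)."""
    return p_h_given_e * q + p_h_given_not_e * (1 - q)

# q = 1 recovers strict conditionalization: P_new(H) = P(H|E).
print(jeffrey_update(0.8, 0.2, 0.7))  # -> 0.62
```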
Finland is internationally known as one of the leading centers of twentieth century analytic philosophy. This volume offers for the first time an overall survey of the Finnish analytic school. The rise of this trend is illustrated by original articles of Edward Westermarck, Eino Kaila, Georg Henrik von Wright, and Jaakko Hintikka. Contributions of Finnish philosophers are then systematically discussed in the fields of logic, philosophy of language, philosophy of science, history of philosophy, ethics and social philosophy. Metaphilosophical reflections on the nature of philosophy are highlighted by the Finnish dialogue between analytic philosophy, phenomenology, pragmatism, and critical theory.
This book is the first of two volumes devoted to the work of Theo Kuipers, a leading Dutch philosopher of science. Philosophers and scientists from all over the world, thirty-seven in all, comment on Kuipers' philosophy, and each of their commentaries is followed by a reply from Kuipers. The present volume focuses on Kuipers' views on confirmation, empirical progress, and truth approximation, as laid down in his From Instrumentalism to Constructive Realism. In this book, Kuipers offered a synthesis of Carnap's and Hempel's confirmation theory on the one hand, and Popper's theory of truth approximation on the other. The key element of this synthesis is a sophisticated methodology, which enables the evaluation of theories in terms of their problems and successes, and which also fits well with the claim that one theory is closer to the truth than another. Ilkka Niiniluoto, Patrick Maher, John Welch, Gerhard Schurz, Igor Douven, Bert Hamminga, David Miller, Johan van Benthem, Sjoerd Zwart, Thomas Mormann, Jesús Zamora Bonilla, Isabella Burger & Johannes Heidema, Joke Meheus, Hans Mooij, and Diderik Batens comment on these ideas of Kuipers, and many present their own account. The present book also contains a synopsis of From Instrumentalism to Constructive Realism. It can be read independently of the second volume of Essays in Debate with Theo Kuipers, which is devoted to Kuipers' Structures in Science.
Should philosophers speak about AIDS? In this contribution, which is intended to open a forum for the readers of ‘Etica e politica’, it is argued that not only philosophers but all responsible members of the community have a duty to discuss the procedures which rule the funding of AIDS research. The underlying idea is that scientific progress in this field is best guaranteed by supplying adequate support to competing, seriously proposed research programmes.
This book is the second of two volumes devoted to the work of Theo Kuipers, a leading Dutch philosopher of science. Philosophers and scientists from all over the world, thirty-seven in all, comment on Kuipers’ philosophy, and each of their commentaries is followed by a reply from Kuipers. The present volume is devoted to Kuipers’ neo-classical philosophy of science, as laid down in his Structures in Science. Kuipers defends a dialectical interaction between science and philosophy in that he views philosophy of science as a meta-science which formulates cognitive structures that provide heuristic patterns for actual scientific research, including design research. In addition, Kuipers pays considerable attention to the computational approaches to philosophy of science as well as to the ethics of doing research. Thomas Nickles, David Atkinson, Jean-Paul van Bendegem, Maarten Franssen, Anne Ruth Mackor, Arno Wouters, Erik Weber & Helena de Preester, Eric Scerri, Adam Grobler & Andrzej Wisniewski, Alexander van den Bosch, Gerard Vreeswijk, Jaap Kamps, Paul Thagard, Emma Ruttkamp, Robert Causey, and Henk Zandvoort comment on these ideas of Kuipers, and many present their own account. The present book also contains a synopsis of Structures in Science. It can be read independently of the first volume of Essays in Debate with Theo Kuipers, which is devoted to Kuipers’ From Instrumentalism to Constructive Realism.
In a recent book, Evolution of the Social Contract, Brian Skyrms shows how evolutionary game theory can be used to explain how the implicit social contract we live by might have evolved. In this paper, after describing the main lines of Skyrms’s approach, we will examine some problems arising from it, on the basis of a comparison with von Hayek’s evolutionary view. Finally, we will make some remarks on the possible relevance of the outcomes achieved by Skyrms for the studies on ‘bounded rationality’.
The basic problem of a theory of truth approximation is defining when a theory is “close to the truth” about some relevant domain. Existing accounts of truthlikeness or verisimilitude address this problem, but are usually limited to the problem of approaching a “deterministic” truth by means of deterministic theories. A general theory of truth approximation, however, should arguably cover also cases where either the relevant theories, or “the truth”, or both, are “probabilistic” in nature. As a step forward in this direction, we first present a general characterization of both deterministic and probabilistic truth approximation; then, we introduce a new account of verisimilitude which provides a simple formal framework to deal with this issue in a unified way. The connections of our account with some other proposals in the literature are also briefly discussed.
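To make the probabilistic case concrete, here is a toy illustration (our example, with total variation distance as a stand-in for the paper's own measure): identify a probabilistic theory with a distribution over the states of the domain and measure its closeness to the probabilistic truth by a distance between distributions.

```python
# Total variation distance between two distributions over the same finite
# set of states; smaller distance = closer to the probabilistic truth.
def total_variation(p, q):
    return 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

truth   = [0.7, 0.2, 0.1]   # the "true" distribution over three states
theory1 = [0.6, 0.3, 0.1]
theory2 = [0.1, 0.2, 0.7]

print(total_variation(theory1, truth))  # -> 0.1: theory1 is closer to the truth
print(total_variation(theory2, truth))  # -> 0.6
```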
In this paper I consider a number of metaphilosophical problems concerning the relations between logic and philosophy of science, as they appear from the neo-classical perspective on philosophy of science outlined by Theo Kuipers in ICR and SiS. More specifically, I focus on two pairs of issues: (A) the (dis)similarities between the goals and methods of logic and those of philosophy of science, w.r.t. (1) the role of theorems within the two disciplines; (2) the falsifiability of their theoretical claims; and (B) the interactions between logic and philosophy of science, w.r.t. (3) the possibility of applying logic in philosophy of science; (4) the possibility that the two disciplines are sources of challenging problems for each other.