
Citations of:

Optimum Inductive Methods: A Study in Inductive Probability, Bayesian Statistics, and Verisimilitude

Dordrecht, Netherlands: Kluwer Academic Publishers (1993)

  • Revising Beliefs Towards the Truth. Ilkka Niiniluoto - 2011 - Erkenntnis 75 (2):165-181.
    Belief revision (BR) and truthlikeness (TL) emerged independently as two research programmes in formal methodology in the 1970s. A natural way of connecting BR and TL is to ask under what conditions the revision of a belief system by new input information leads the system towards the truth. It turns out that, for the AGM model of belief revision, the only safe case is the expansion of true beliefs by true input, but this is not very interesting or realistic as (...)
  • Inductive Logic. James Hawthorne - 2011 - The Stanford Encyclopedia of Philosophy.
    Sections 1 through 3 present all of the main ideas behind the probabilistic logic of evidential support. For most readers these three sections will suffice to provide an adequate understanding of the subject. Those readers who want to know more about how the logic applies when the implications of hypotheses about evidence claims (called likelihoods) are vague or imprecise may, after reading sections 1-3, skip to section 6. Sections 4 and 5 are for the more advanced reader who wants a (...)
  • New Semantics for Bayesian Inference: The Interpretive Problem and Its Solutions. Olav Benjamin Vassend - 2019 - Philosophy of Science 86 (4):696-718.
    Scientists often study hypotheses that they know to be false. This creates an interpretive problem for Bayesians because the probability assigned to a hypothesis is typically interpreted as the probability that the hypothesis is true. I argue that solving the interpretive problem requires coming up with a new semantics for Bayesian inference. I present and contrast two new semantic frameworks, and I argue that both of them support the claim that there is pervasive pragmatic encroachment on whether a given Bayesian (...)
  • A Verisimilitude Framework for Inductive Inference, with an Application to Phylogenetics. Olav B. Vassend - 2018 - British Journal for the Philosophy of Science 71 (4):1359-1383.
    Bayesianism and likelihoodism are two of the most important frameworks philosophers of science use to analyse scientific methodology. However, both frameworks face a serious objection: much scientific inquiry takes place in highly idealized frameworks where all the hypotheses are known to be false. Yet, both Bayesianism and likelihoodism seem to be based on the assumption that the goal of scientific inquiry is always truth rather than closeness to the truth. Here, I argue in favour of a verisimilitude framework for inductive (...)
  • Analogical Predictions for Explicit Similarity. Jan Willem Romeijn - 2006 - Erkenntnis 64 (2):253-280.
    This paper concerns exchangeable analogical predictions based on similarity relations between predicates, and deals with a restricted class of such relations. It describes a system of Carnapian λγ rules on underlying predicate families to model the analogical predictions for this restricted class. Instead of the usual axiomatic definition, the system is characterized with a Bayesian model that employs certain statistical hypotheses. Finally the paper argues that the Bayesian model can be generalized to cover cases outside the restricted class of similarity (...)
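The Carnapian λγ rules mentioned in the abstract above have a simple closed form. The sketch below is a minimal illustration, assuming the standard formulation in which the predictive probability of category i, after n observations of which n_i fall in category i, is (n_i + λγ_i)/(n + λ), with prior weights γ_i summing to one; the categories, counts, and parameter values are made up for illustration and are not taken from Romeijn's paper.

```python
def carnap_lambda_gamma(counts, gammas, lam):
    """Predictive probabilities under a Carnapian lambda-gamma rule.

    counts[i] : observed number of instances of category i
    gammas[i] : prior weight of category i (the gammas sum to 1)
    lam       : lambda parameter weighing prior influence against the evidence
    """
    n = sum(counts)
    return [(counts[i] + lam * gammas[i]) / (n + lam) for i in range(len(counts))]

# Illustrative: three categories, uniform prior weights, ten observations.
counts = [6, 3, 1]
gammas = [1 / 3, 1 / 3, 1 / 3]
for lam in (0.5, 2.0, 10.0):
    probs = carnap_lambda_gamma(counts, gammas, lam)
    print(lam, [round(p, 3) for p in probs])
```

Small λ keeps the predictions close to the observed relative frequencies, while large λ keeps them close to the prior weights γ_i.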
  • Approaching probabilistic laws. Ilkka Niiniluoto - 2021 - Synthese 199 (3-4):10499-10519.
    In the general problem of verisimilitude, we try to define the distance of a statement from a target, which is an informative truth about some domain of investigation. For example, the target can be a state description, a structure description, or a constituent of a first-order language. In the problem of legisimilitude, the target is a deterministic or universal law, which can be expressed by a nomic constituent or a quantitative function involving the operators of physical necessity and possibility. The (...)
  • Feminist Philosophy of Science. Lynn Hankinson Nelson - 2002 - In Peter Machamer & Michael Silberstein (eds.), The Blackwell Guide to the Philosophy of Science. Oxford, UK: Blackwell. pp. 312–331.
    This chapter contains sections titled: Highlights of Past Literature; Current Work; Future Work.
  • Probabilities for multiple properties: The models of Hesse and Carnap and Kemeny. [REVIEW] Patrick Maher - 2001 - Erkenntnis 55 (2):183-215.
    In 1959 Carnap published a probability model that was meant to allow for reasoning by analogy involving two independent properties. Maher (2000) derived a generalized version of this model axiomatically and defended the model's adequacy. It is thus natural to now consider how the model might be extended to the case of more than two properties. A simple extension was published by Hesse (1964); this paper argues that it is inadequate. A more sophisticated one was developed jointly by Carnap and Kemeny in the early 1950s but never (...)
  • Probabilities for two properties. Patrick Maher - 2000 - Erkenntnis 52 (1):63-91.
    Let R(X, B) denote the class of probability functions that are defined on algebra X and that represent rationally permissible degrees of certainty for a person whose total relevant background evidence is B. This paper is concerned with characterizing R(X, B) for the case in which X is an algebra of propositions involving two properties and B is empty. It proposes necessary conditions for a probability function to be in R(X, B), some of which involve the notion of statistical dependence. The (...)
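Some of the conditions in the abstract above involve the notion of statistical dependence between the two properties. As a generic reminder of that notion only (not Maher's own conditions), the sketch below checks whether two properties F and G are probabilistically independent under an assumed joint distribution; the numbers are purely illustrative.

```python
# Joint distribution over two properties F and G; the numbers are illustrative,
# not taken from Maher's paper.
joint = {
    (True, True): 0.30,    # P(F & G)
    (True, False): 0.20,   # P(F & not-G)
    (False, True): 0.10,   # P(not-F & G)
    (False, False): 0.40,  # P(not-F & not-G)
}

p_f = joint[(True, True)] + joint[(True, False)]
p_g = joint[(True, True)] + joint[(False, True)]

# F and G are statistically independent iff P(F & G) = P(F) * P(G).
independent = abs(joint[(True, True)] - p_f * p_g) < 1e-9
print(f"P(F) = {p_f}, P(G) = {p_g}, P(F & G) = {joint[(True, True)]}")
print("independent" if independent else "dependent")
```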
  • From Instrumentalism to Constructive Realism: On Some Relations Between Confirmation, Empirical Progress, and Truth Approximation. Theodorus Antonius Franciscus Kuipers - 2000 - Dordrecht, Netherlands: Springer.
    Surprisingly, modified versions of the confirmation theory (Carnap and Hempel) and truth approximation theory (Popper) turn out to be smoothly synthesizable. The glue between the two appears to be the instrumentalist methodology, rather than that of the falsificationist. The instrumentalist methodology, used in the separate, comparative evaluation of theories in terms of their successes and problems (hence, even if already falsified), provides in theory and practice the straight road to short-term empirical progress in science (à la Laudan). It is (...)
  • Comparative versus quantitative truthlikeness definitions: Reply to Thomas Mormann. [REVIEW] Theo A. F. Kuipers - 1997 - Erkenntnis 47 (2):187-192.
  • Approaching probabilistic and deterministic nomic truths in an inductive probabilistic way. Theo A. F. Kuipers - 2021 - Synthese 199 (3-4):8001-8028.
    Theories of truth approximation in terms of truthlikeness almost always deal with approaching deterministic truths, either actual or nomic. This paper deals first with approaching a probabilistic nomic truth, viz. a true probability distribution. It assumes a multinomial probabilistic context, hence with a lawlike true, but usually unknown, probability distribution. We will first show that this true multinomial distribution can be approached by Carnapian inductive probabilities. Next we will deal with the corresponding deterministic nomic truth, that is, the set of (...)
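The claim in the abstract above, that Carnapian inductive probabilities approach a true multinomial distribution, can be pictured with a short simulation. This is a sketch under illustrative assumptions (a made-up true distribution, a uniform partition of prior weight, and an arbitrary λ), not Kuipers's own construction.

```python
import random

# Illustrative assumptions: a made-up true multinomial distribution over three
# categories and an arbitrary choice of the Carnapian lambda parameter.
true_dist = [0.5, 0.3, 0.2]
lam = 4.0
k = len(true_dist)

random.seed(0)
counts = [0] * k
for n in range(1, 2001):
    i = random.choices(range(k), weights=true_dist)[0]  # sample one observation
    counts[i] += 1
    if n in (10, 100, 2000):
        # Carnapian predictive probabilities with uniform weights: (n_i + lam/k) / (n + lam)
        carnap = [(counts[j] + lam / k) / (n + lam) for j in range(k)]
        print(n, [round(p, 3) for p in carnap])
```

As the sample grows, the predictive probabilities track the observed relative frequencies, which in turn converge on the true distribution.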
  • Probabilistic Logics and Probabilistic Networks. Rolf Haenni, Jan-Willem Romeijn, Gregory Wheeler & Jon Williamson - 2010 - Dordrecht, Netherlands: Synthese Library. Edited by Gregory Wheeler, Rolf Haenni, Jan-Willem Romeijn & Jon Williamson.
    Additionally, the text shows how to develop computationally feasible methods to mesh with this framework.
  • Resurrecting logical probability. James Franklin - 2001 - Erkenntnis 55 (2):277-305.
    The logical interpretation of probability, or "objective Bayesianism'' – the theory that (some) probabilities are strictly logical degrees of partial implication – is defended. The main argument against it is that it requires the assignment of prior probabilities, and that any attempt to determine them by symmetry via a "principle of insufficient reason" inevitably leads to paradox. Three replies are advanced: that priors are imprecise or of little weight, so that disagreement about them does not matter, within limits; that it (...)
  • Analogy and exchangeability in predictive inferences. Roberto Festa - 1996 - Erkenntnis 45 (2-3):229-252.
    An important problem in inductive probability theory is the design of exchangeable analogical methods, i.e., of exchangeable inductive methods that take into account certain considerations of analogy by similarity for predictive inferences. Here a precise reformulation of the problem of predictive analogy is given and a new family of exchangeable analogical methods is introduced. Firstly, it is proved that the exchangeable analogical method introduced by Skyrms (1993) does not satisfy the best known general principles of predictive analogy. Secondly, Skyrms's approach — (...)
  • Generalized Information Theory Meets Human Cognition: Introducing a Unified Framework to Model Uncertainty and Information Search. Vincenzo Crupi, Jonathan D. Nelson, Björn Meder, Gustavo Cevolani & Katya Tentori - 2018 - Cognitive Science 42 (5):1410-1456.
    Searching for information is critical in many situations. In medicine, for instance, careful choice of a diagnostic test can help narrow down the range of plausible diseases that the patient might have. In a probabilistic framework, test selection is often modeled by assuming that people's goal is to reduce uncertainty about possible states of the world. In cognitive science, psychology, and medical decision making, Shannon entropy is the most prominent and most widely used model to formalize probabilistic uncertainty and the (...)
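The probabilistic picture of test selection in the abstract above can be illustrated with Shannon entropy, the baseline measure the paper starts from (its contribution is a generalized family of entropy measures, which this sketch does not reproduce). The hypothesis priors and test likelihoods below are made-up numbers.

```python
from math import log2

def entropy(dist):
    """Shannon entropy (in bits) of a probability distribution."""
    return -sum(p * log2(p) for p in dist if p > 0)

def expected_posterior_entropy(prior, likelihoods):
    """Expected Shannon entropy over hypotheses after observing the test result.

    prior[h]          : prior probability of hypothesis h
    likelihoods[h][o] : probability of test outcome o given hypothesis h
    """
    n_outcomes = len(likelihoods[0])
    total = 0.0
    for o in range(n_outcomes):
        p_o = sum(prior[h] * likelihoods[h][o] for h in range(len(prior)))
        if p_o == 0:
            continue
        posterior = [prior[h] * likelihoods[h][o] / p_o for h in range(len(prior))]
        total += p_o * entropy(posterior)
    return total

# Illustrative numbers: two candidate diseases and one binary diagnostic test.
prior = [0.7, 0.3]
likelihoods = [[0.9, 0.1],   # P(test outcome | disease 1)
               [0.2, 0.8]]   # P(test outcome | disease 2)
gain = entropy(prior) - expected_posterior_entropy(prior, likelihoods)
print(f"expected reduction in uncertainty: {gain:.3f} bits")
```

A test is then preferred to the extent that it yields a larger expected reduction in uncertainty about the hypotheses.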
  • Approaching deterministic and probabilistic truth: a unified account. Gustavo Cevolani & Roberto Festa - 2021 - Synthese 199 (3-4):11465-11489.
    The basic problem of a theory of truth approximation is defining when a theory is “close to the truth” about some relevant domain. Existing accounts of truthlikeness or verisimilitude address this problem, but are usually limited to the problem of approaching a “deterministic” truth by means of deterministic theories. A general theory of truth approximation, however, should arguably cover also cases where either the relevant theories, or “the truth”, or both, are “probabilistic” in nature. As a step forward in this (...)
  • Probabilistic truth approximation and fixed points. David Atkinson & Jeanne Peijnenburg - 2020 - Synthese 199 (1-2):4195-4216.
    We use the method of fixed points to describe a form of probabilistic truth approximation which we illustrate by means of three examples. We then contrast this form of probabilistic truth approximation with another, more familiar kind, where no fixed points are used. In probabilistic truth approximation with fixed points the events are dependent on one another, but in the second kind they are independent. The first form exhibits a phenomenon that we call ‘fading origins’, the second one is subject (...)
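The fixed-point method invoked in the abstract above can be illustrated with a toy probability recursion in which each probability depends on the next one in a chain; this is a generic sketch with made-up conditional probabilities, not the paper's own examples.

```python
# Toy recursion: p = beta + (alpha - beta) * p_next, where alpha and beta are
# illustrative conditional probabilities, not taken from the paper.
alpha, beta = 0.9, 0.05

def iterate(p_start, steps):
    p = p_start
    for _ in range(steps):
        p = beta + (alpha - beta) * p
    return p

# Analytic fixed point: solve p = beta + (alpha - beta) * p for p.
fixed_point = beta / (1 - (alpha - beta))

for p0 in (0.0, 0.5, 1.0):
    print(p0, round(iterate(p0, 50), 6), round(fixed_point, 6))
```

Whatever starting value is fed in, the iteration settles on the same fixed point, which is one way to picture the "fading origins" phenomenon mentioned in the abstract.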
  • Interpretations of probability. Alan Hájek - 2007 - Stanford Encyclopedia of Philosophy.
  • Statistics as Inductive Inference. Jan-Willem Romeijn - unknown
    An inductive logic is a system of inference that describes the relation between propositions on data, and propositions that extend beyond the data, such as predictions over future data, and general conclusions on all possible data. Statistics, on the other hand, is a mathematical discipline that describes procedures for deriving results about a population from sample data. These results include predictions on future samples, decisions on rejecting or accepting a hypothesis about the population, the determination of probability assignments over such (...)
     
  • Philosophy as conceptual engineering: Inductive logic in Rudolf Carnap's scientific philosophy. Christopher F. French - 2015 - Dissertation, University of British Columbia
    My dissertation explores the ways in which Rudolf Carnap sought to make philosophy scientific by further developing recent interpretive efforts to explain Carnap’s mature philosophical work as a form of engineering. It does this by looking in detail at his philosophical practice in his most sustained mature project, his work on pure and applied inductive logic. I, first, specify the sort of engineering Carnap is engaged in as involving an engineering design problem and then draw out the complications of design (...)
  • Approaching the truth via belief change in propositional languages. Gustavo Cevolani & Francesco Calandra - 2010 - In M. Suàrez, M. Dorato & M. Rèdei (eds.), EPSA Epistemology and Methodology of Science: Launch of the European Philosophy of Science Association. Springer. pp. 47-62.
    Starting in the 1960s, theory change became a main concern of philosophy of science. Two of the best known formal accounts of theory change are the post-Popperian theories of verisimilitude (PPV for short) and the AGM theory of belief change (AGM for short). In this paper, we will investigate the conceptual relations between PPV and AGM and, in particular, we will ask whether the AGM rules for theory change are effective means for approaching the truth, (...)