Results for 'Frequentist approach'

991 found
  1. Why Frequentists and Bayesians Need Each Other.Jon Williamson - 2013 - Erkenntnis 78 (2):293-318.
    The orthodox view in statistics has it that frequentism and Bayesianism are diametrically opposed—two totally incompatible takes on the problem of statistical inference. This paper argues to the contrary that the two approaches are complementary and need to mesh if probabilistic reasoning is to be carried out correctly.
    10 citations
  2. Bayesian versus frequentist clinical trials.David Teira - 2011 - In Gifford Fred (ed.), Philosophy of Medicine. Amsterdam: Elsevier. pp. 255-297.
    I will open the first part of this paper by trying to elucidate the frequentist foundations of RCTs. I will then present a number of methodological objections against the viability of these inferential principles in the conduct of actual clinical trials. In the following section, I will explore the main ethical issues in frequentist trials, namely those related to randomisation and the use of stopping rules. In the final section of the first part, I will analyse why RCTs (...)
    4 citations
  3. Bayesians Versus Frequentists: A Philosophical Debate on Statistical Reasoning.Jordi Vallverdú - 2016 - Berlin, Heidelberg: Springer.
    This book analyzes the origins of statistical thinking as well as its related philosophical questions, such as causality, determinism or chance. Bayesian and frequentist approaches are subjected to a historical, cognitive and epistemological analysis, making it possible to not only compare the two competing theories, but to also find a potential solution. The work pursues a naturalistic approach, proceeding from the existence of numerosity in natural environments to the existence of contemporary formulas and methodologies to heuristic pragmatism, a (...)
  4. A Battle in the Statistics Wars: a simulation-based comparison of Bayesian, Frequentist and Williamsonian methodologies.Mantas Radzvilas, William Peden & Francesco De Pretis - 2021 - Synthese 199 (5-6):13689-13748.
    The debates between Bayesian, frequentist, and other methodologies of statistics have tended to focus on conceptual justifications, sociological arguments, or mathematical proofs of their long run properties. Both Bayesian statistics and frequentist (“classical”) statistics have strong cases on these grounds. In this article, we instead approach the debates in the “Statistics Wars” from a largely unexplored angle: simulations of different methodologies’ performance in the short to medium run. We conducted a large number of simulations using a straightforward (...)
    2 citations
  5. Testing Simulation Models Using Frequentist Statistics.Andrew P. Robinson - 2019 - In Claus Beisbart & Nicole J. Saam (eds.), Computer Simulation Validation: Fundamental Concepts, Methodological Frameworks, and Philosophical Perspectives. Springer Verlag. pp. 465-496.
    One approach to validating simulation models is to formally compare model outputs with independent data. We consider such model validation from the point of view of Frequentist statistics. A range of estimates and tests of goodness of fit have been advanced. We review these approaches, and demonstrate that some of the tests suffer from difficulties in interpretation because they rely on the null hypothesis that the model is similar to the observations. This reliance creates two unpleasant possibilities, namely, (...)
  6. Prior Information in Frequentist Research Designs: The Case of Neyman’s Sampling Theory.Adam P. Kubiak & Paweł Kawalec - 2022 - Journal for General Philosophy of Science / Zeitschrift für Allgemeine Wissenschaftstheorie 53 (4):381-402.
    We analyse the issue of using prior information in frequentist statistical inference. For that purpose, we scrutinise different kinds of sampling designs in Jerzy Neyman’s theory to reveal a variety of ways to explicitly and objectively engage with prior information. Further, we turn to the debate on sampling paradigms (design-based vs. model-based approaches) to argue that Neyman’s theory supports an argument for the intermediate approach in the frequentism vs. Bayesianism debate. We also demonstrate that Neyman’s theory, by allowing (...)
  7. Error and inference: an outsider stand on a frequentist philosophy.Christian P. Robert - 2013 - Theory and Decision 74 (3):447-461.
    This paper is an extended review of the book Error and Inference, edited by Deborah Mayo and Aris Spanos, about their frequentist and philosophical perspective on testing of hypothesis and on the criticisms of alternatives like the Bayesian approach.
  8. Gambling-Specific Cognitions Are Not Associated With Either Abstract or Probabilistic Reasoning: A Dual Frequentist-Bayesian Analysis of Individuals With and Without Gambling Disorder.Ismael Muela, Juan F. Navas & José C. Perales - 2021 - Frontiers in Psychology 11.
    Background: Distorted gambling-related cognitions are tightly related to gambling problems, and are one of the main targets of treatment for disordered gambling, but their etiology remains uncertain. Although folk wisdom and some theoretical approaches have linked them to lower domain-general reasoning abilities, evidence regarding that relationship remains unconvincing. Method: In the present cross-sectional study, the relationship between probabilistic/abstract reasoning, as measured by the Berlin Numeracy Test, and the Matrices Test, respectively, and the five dimensions of the Gambling-Related Cognitions Scale, was tested in a (...)
  9. John Venn's Hypothetical Infinite Frequentism and Logic.Lukas M. Verburgt - 2014 - History and Philosophy of Logic 35 (3):248-271.
    The goal of this paper is to provide a detailed reading of John Venn's Logic of Chance as a work of logic or, more specifically, as a specific portion of the general system of so-called ‘material’ logic developed in his Principles of Empirical or Inductive Logic and to discuss it against the background of his Boolean-inspired views on the connection between logic and mathematics. It is by means of this situating of Venn 1866 [The Logic of Chance. An Essay on (...)
    1 citation
  10. A Bayesian Approach to Absent Evidence Reasoning.Christopher Lee Stephens - 2011 - Informal Logic 31 (1):56-65.
    Under what conditions is the failure to have evidence that p evidence that p is false? Absent evidence reasoning is common in many sciences, including astronomy, archeology, biology and medicine. An often-repeated epistemological motto is that “the absence of evidence is not evidence of absence.” Analysis of absent evidence reasoning usually takes place in a deductive or frequentist hypothesis-testing framework. Instead, I develop a Bayesian analysis (...)
    6 citations
  11. Environmental genotoxicity evaluation: Bayesian approach for a mixture statistical model.Julio Michael Stern, Angela Maria de Souza Bueno, Carlos Alberto de Braganca Pereira & Maria Nazareth Rabello-Gay - 2002 - Stochastic Environmental Research and Risk Assessment 16:267–278.
    The data analyzed in this paper are part of the results described in Bueno et al. (2000). Three cytogenetics endpoints were analyzed in three populations of a species of wild rodent – Akodon montensis – living in an industrial, an agricultural, and a preservation area at the Itajaí Valley, State of Santa Catarina, Brazil. The polychromatic/normochromatic ratio, the mitotic index, and the frequency of micronucleated polychromatic erythrocytes were used in an attempt to establish a genotoxic profile of each area. It (...)
  12. Causation in Personal Injury Law: The Case for a Probabilistic Approach. [REVIEW] Chris Miller - 2014 - Topoi 33 (2):1-12.
    This paper makes the case for a wider acceptance of a probabilistic approach to causation in negligence. This acceptance would help to remove much of the incoherence which has come to afflict the English law of personal injury law. This incoherence can also be found in other common law jurisdictions (notably those of the United States, Canada and Australia). Concentrating upon recent UK case law, the argument opposes the contention that ‘naked statistics’ can play no role in establishing causation. (...)
  13. Dov M. Gabbay and John Woods.Formal Approaches To Practical - 2002 - In Dov M. Gabbay (ed.), Handbook of the Logic of Argument and Inference: The Turn Towards the Practical. Elsevier.
  14. Costs Law Expertise.Dgt Costs Lawyers Approachable Efficient Progressive - forthcoming - Ethos: Journal of the Society for Psychological Anthropology.
  15. Nigel Howard.A Piagetian Approach To Decision - 1978 - In A. Hooker, J. J. Leach & E. F. McClennen (eds.), Foundations and Applications of Decision Theory. D. Reidel. pp. 205.
  16. Quotation. By Israel Scheffler. Following Goodman in treating inscriptions framed by quotes as concrete general rather than abstract. [REVIEW] An Inscriptional Approach To Indirect - 1997 - In Catherine Z. Elgin (ed.), Nelson Goodman's Theory of Symbols and its Applications. Garland. pp. 237.
  17. The evaluation of measurement uncertainties and its epistemological ramifications.Nadine de Courtenay & Fabien Grégis - 2017 - Studies in History and Philosophy of Science Part A 65:21-32.
    The way metrologists conceive of measurement has undergone a major shift in the last two decades. This shift can in great part be traced to a change in the statistical methods used to deal with the expression of measurement results, and, more particularly, with the calculation of measurement uncertainties. Indeed, as we show, the incapacity of the frequentist approach to the calculus of uncertainty to deal with systematic errors has prompted the replacement of the customary frequentist methods (...)
    7 citations
  18. On the correct interpretation of p values and the importance of random variables.Guillaume Rochefort-Maranda - 2016 - Synthese 193 (6):1777-1793.
    The p value is the probability under the null hypothesis of obtaining an experimental result that is at least as extreme as the one that we have actually obtained. That probability plays a crucial role in frequentist statistical inferences. But if we take the word ‘extreme’ to mean ‘improbable’, then we can show that this type of inference can be very problematic. In this paper, I argue that it is a mistake to make such an interpretation. Under minimal assumptions (...)
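The p-value definition quoted in entry 18 can be made concrete with a minimal sketch of an exact binomial test. The function name and the numbers below are illustrative assumptions, not drawn from the paper; the code reads 'at least as extreme' as 'no more probable than the observed outcome', which is the very interpretation the entry examines.

```python
import math

def two_sided_p_value(successes: int, n: int, p0: float = 0.5) -> float:
    """Exact binomial p-value: the probability, under the null hypothesis
    H0: p = p0, of any outcome no more probable than the observed one."""
    pmf = [math.comb(n, k) * p0**k * (1 - p0)**(n - k) for k in range(n + 1)]
    observed = pmf[successes]
    # Sum the probabilities of all outcomes at least as 'extreme' (improbable).
    return min(1.0, sum(p for p in pmf if p <= observed + 1e-12))

# 60 heads in 100 tosses of a coin presumed fair under the null:
print(two_sided_p_value(60, 100))
```

For 60 heads in 100 tosses this gives roughly 0.057: small, but above the conventional 0.05 threshold.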
  19. Marion Hourdequin and David B. Wong.A Relational Approach To - 2005 - Journal of Chinese Philosophy 32:19-33.
    1 citation
  20. Inhalt: Werner Gephart.Oder: Warum Daniel Witte: Recht Als Kultur, I. Allgemeine, Property its Contemporary Narratives of Legal History Gerhard Dilcher: Historische Sozialwissenschaft als Mittel zur Bewaltigung der ModerneMax Weber und Otto von Gierke im Vergleich Sam Whimster: Max Weber'S. "Roman Agrarian Society": Jurisprudence & His Search for "Universalism" Marta Bucholc: Max Weber'S. Sociology of Law in Poland: A. Case of A. Missing Perspective Dieter Engels: Max Weber Und Die Entwicklung des Parlamentarischen Minderheitsrechts I. V. Das Recht Und Die Gesellsc Civilization Philipp Stoellger: Max Weber Und Das Recht des Protestantismus Spuren des Protestantismus in Webers Rechtssoziologie I. I. I. Rezeptions- Und Wirkungsgeschichte Hubert Treiber: Zur Abhangigkeit des Rechtsbegriffs Vom Erkenntnisinteresse Uta Gerhardt: Unvermerkte Nahe Zur Rechtssoziologie Talcott Parsons' Und Max Webers Masahiro Noguchi: A. Weberian Approach to Japanese Legal Culture Without the "Sociology of Law": Takeyoshi Kawashima - 2017 - In Werner Gephart & Daniel Witte (eds.), Recht als Kultur?: Beiträge zu Max Webers Soziologie des Rechts. Frankfurt am Main: Vittorio Klosterman.
     
  21. Rogene A. Buchholz. Ethics & Governance. Rethinking Business Ethics: A Pragmatic Approach. Sandra B. Rosenthal - 2000 - The Ruffin Series in Business Ethics 2000.
  22. Why do we need to employ Bayesian statistics and how can we employ it in studies of moral education?: With practical guidelines to use JASP for educators and researchers.Hyemin Han - 2018 - Journal of Moral Education 47 (4):519-537.
    ABSTRACTIn this article, we discuss the benefits of Bayesian statistics and how to utilize them in studies of moral education. To demonstrate concrete examples of the applications of Bayesian statistics to studies of moral education, we reanalyzed two data sets previously collected: one small data set collected from a moral educational intervention experiment, and one big data set from a large-scale Defining Issues Test-2 survey. The results suggest that Bayesian analysis of data sets collected from moral educational studies can provide (...)
    8 citations
  23. Improving Bayesian statistics understanding in the age of Big Data with the bayesvl R package.Quan-Hoang Vuong, Viet-Phuong La, Minh-Hoang Nguyen, Manh-Toan Ho, Manh-Tung Ho & Peter Mantello - 2020 - Software Impacts 4 (1):100016.
    The exponential growth of social data both in volume and complexity has increasingly exposed many of the shortcomings of the conventional frequentist approach to statistics. The scientific community has called for careful usage of the approach and its inference. Meanwhile, the alternative method, Bayesian statistics, still faces considerable barriers toward a more widespread application. The bayesvl R package is an open program, designed for implementing Bayesian modeling and analysis using the Stan language’s no-U-turn (NUTS) sampler. The package (...)
    3 citations
  24. Are ecology and evolutionary biology “soft” sciences?Massimo Pigliucci - 2002 - Annales Zoologici Fennici 39:87-98.
    Research in ecology and evolutionary biology (evo-eco) often tries to emulate the “hard” sciences such as physics and chemistry, but to many of its practitioners feels more like the “soft” sciences of psychology and sociology. I argue that this schizophrenic attitude is the result of a lack of appreciation of the full consequences of the peculiarity of the evo-eco sciences as lying in between a-historical disciplines such as physics and completely historical ones like paleontology. Furthermore, evo-eco researchers have gotten stuck (...)
    4 citations
  25. Statistical Inference and the Replication Crisis.Lincoln J. Colling & Dénes Szűcs - 2018 - Review of Philosophy and Psychology 12 (1):121-147.
    The replication crisis has prompted many to call for statistical reform within the psychological sciences. Here we examine issues within Frequentist statistics that may have led to the replication crisis, and we examine the alternative—Bayesian statistics—that many have suggested as a replacement. The Frequentist approach and the Bayesian approach offer radically different perspectives on evidence and inference with the Frequentist approach prioritising error control and the Bayesian approach offering a formal method for quantifying (...)
    1 citation
  26. Statistical Data and Mathematical Propositions.Cory Juhl - 2015 - Pacific Philosophical Quarterly 96 (1):100-115.
    Statistical tests of the primality of some numbers look similar to statistical tests of many nonmathematical, clearly empirical propositions. Yet interpretations of probability prima facie appear to preclude the possibility of statistical tests of mathematical propositions. For example, it is hard to understand how the statement that n is prime could have a frequentist probability other than 0 or 1. On the other hand, subjectivist approaches appear to be saddled with ‘coherence’ constraints on rational probabilities that require rational agents (...)
  27. Population Epistemology: Information Flow in Evolutionary Processes.William F. Harms - 1996 - Dissertation, University of California, Irvine
    Evolutionary theory offers the possibility of building an epistemology that requires neither a theory of truth nor a definition of knowledge, thus bypassing some of the more notable difficulties with standard approaches to epistemology. Following a critique of one of the most popular approaches to thinking about cultural evolution I argue for a frequentist approach to evolutionary epistemology, and that cultural transmission should be understood as coordinated phenotypic variability within groups of closely related organisms. I construct a formal (...)
     
    1 citation
  28. What is probability and why does it matter.Zvonimir Šikić - 2014 - European Journal of Analytic Philosophy 10 (1):21-43.
    The idea that probability is a degree of rational belief seemed too vague for a foundation of a mathematical theory. It was certainly not obvious that degrees of rational belief had to be governed by the probability axioms as used by Laplace and other pre-statistical probabilists. The axioms seemed arbitrary in their interpretation. To eliminate the arbitrariness, the statisticians of the early 20th century drastically restricted the possible applications of the probability theory, by insisting that probabilities had to be (...)
  29. We are All Bayesian, Everyone is Not a Bayesian.Mattia Andreoletti & Andrea Oldofredi - 2019 - Topoi 38 (2):477-485.
    Medical research makes intensive use of statistics in order to support its claims. In this paper we make explicit an epistemological tension between the conduct of clinical trials and their interpretation: statistical evidence is sometimes discarded on the basis of an underlined Bayesian reasoning. We suggest that acknowledging the potentiality of Bayesian statistics might contribute to clarify and improve comprehension of medical research. Nevertheless, despite Bayesianism may provide a better account for scientific inference with respect to the standard frequentist (...)
  30. Revisiting Haavelmo's structural econometrics: bridging the gap between theory and data.Aris Spanos - 2015 - Journal of Economic Methodology 22 (2):171-196.
    The objective of the paper is threefold. First, to argue that some of Haavelmo's methodological ideas and insights have been neglected because they are largely at odds with the traditional perspective that views empirical modeling in economics as an exercise in curve-fitting. Second, to make a case that this neglect has contributed to the unreliability of empirical evidence in economics that is largely due to statistical misspecification. The latter affects the reliability of inference by inducing discrepancies between the actual and (...)
  31. On the definition of objective probabilities by empirical similarity.Itzhak Gilboa, Offer Lieberman & David Schmeidler - 2010 - Synthese 172 (1):79 - 95.
    We suggest defining objective probabilities by similarity-weighted empirical frequencies, where more similar cases get a higher weight in the computation of frequencies. This formula is justified intuitively and axiomatically, but raises the question, which similarity function should be used? We propose to estimate the similarity function from the data, and thus obtain objective probabilities. We compare this definition to others, and attempt to delineate the scope of situations in which objective probabilities can be used.
    1 citation
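The similarity-weighted empirical frequencies of entry 31 can be sketched in a few lines. The similarity function and data below are hypothetical stand-ins, since the paper's point is precisely that the similarity function should itself be estimated from the data:

```python
def similarity_weighted_frequency(cases, outcomes, target, similarity):
    """Probability estimate for the target case: the empirical frequency of
    the outcome, with each past case weighted by its similarity to the target."""
    weights = [similarity(c, target) for c in cases]
    return sum(w for w, o in zip(weights, outcomes) if o) / sum(weights)

# Hypothetical one-dimensional cases; similarity decays with distance.
sim = lambda a, b: 1.0 / (1.0 + abs(a - b))
cases = [1.0, 2.0, 3.0, 10.0]
outcomes = [True, True, False, False]   # whether the event occurred in each case
print(similarity_weighted_frequency(cases, outcomes, 2.0, sim))
```

With equal weights this would be the ordinary relative frequency 0.5; weighting by closeness to the target case 2.0 pulls the estimate toward the outcomes of the nearby cases.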
  32. Bayes and health care research.Peter Allmark - 2004 - Medicine, Health Care and Philosophy 7 (3):321-332.
    Bayes’ rule shows how one might rationally change one’s beliefs in the light of evidence. It is the foundation of a statistical method called Bayesianism. In health care research, Bayesianism has its advocates but the dominant statistical method is frequentism. There are at least two important philosophical differences between these methods. First, Bayesianism takes a subjectivist view of probability (i.e. that probability scores are statements of subjective belief, not objective fact) whilst frequentism takes an objectivist view. Second, Bayesianism is explicitly (...)
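Bayes' rule, the foundation that entry 32 refers to, can be illustrated with a minimal worked example; the prevalence and test characteristics are invented for illustration, not taken from the paper:

```python
def bayes_update(prior: float, likelihood: float, evidence_rate: float) -> float:
    """Posterior P(H|E) = P(E|H) * P(H) / P(E)."""
    return likelihood * prior / evidence_rate

# Hypothetical numbers: a condition with 1% prevalence, a test with
# 90% sensitivity and a 9% false-positive rate.
prior = 0.01
p_e = 0.9 * 0.01 + 0.09 * 0.99          # total probability of a positive test
posterior = bayes_update(prior, 0.9, p_e)
print(round(posterior, 3))
```

With a 1% prior, even a positive result from a fairly accurate test leaves the posterior near 9%; this is the kind of belief revision in the light of evidence that the rule formalizes.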
  33. Wahrscheinlichkeitsrechnung im Spannungsfeld von Maß- und Häufigkeitstheorie—Leben und Werk des “Deutschen” Mathematikers Erhard Tornier (1894–1982). [REVIEW] Thomas Hochkirchen - 1998 - NTM Zeitschrift für Geschichte der Wissenschaften, Technik und Medizin 6 (1):22-41.
    The axiomatization of probability theory by the National Socialist German mathematician E. Tornier will be embedded between measure-theoretic and frequentist approaches to the problem and compared with Tornier's ideological claims. New details of his biography will be given.
  34. Significance Testing in Theory and Practice.Daniel Greco - 2011 - British Journal for the Philosophy of Science 62 (3):607-637.
    Frequentism and Bayesianism represent very different approaches to hypothesis testing, and this presents a skeptical challenge for Bayesians. Given that most empirical research uses frequentist methods, why (if at all) should we rely on it? While it is well known that there are conditions under which Bayesian and frequentist methods agree, without some reason to think these conditions are typically met, the Bayesian hasn’t shown why we are usually safe in relying on results reported by significance testers. In (...)
    5 citations
  35. Why I Am Not a Likelihoodist.Greg Gandenberger - 2016 - Philosophers' Imprint 16.
    Frequentist statistical methods continue to predominate in many areas of science despite prominent calls for "statistical reform." They do so in part because their main rivals, Bayesian methods, appeal to prior probability distributions that arguably lack an objective justification in typical cases. Some methodologists find a third approach called likelihoodism attractive because it avoids important objections to frequentism without appealing to prior probabilities. However, likelihoodist methods do not provide guidance for belief or action, but only assessments of data (...)
    1 citation
  36. Cross-Situational Learning: An Experimental Study of Word-Learning Mechanisms.Kenny Smith, Andrew D. M. Smith & Richard A. Blythe - 2011 - Cognitive Science 35 (3):480-498.
    Cross-situational learning is a mechanism for learning the meaning of words across multiple exposures, despite exposure-by-exposure uncertainty as to the word's true meaning. We present experimental evidence showing that humans learn words effectively using cross-situational learning, even at high levels of referential uncertainty. Both overall success rates and the time taken to learn words are affected by the degree of referential uncertainty, with greater referential uncertainty leading to less reliable, slower learning. Words are also learned less successfully and more slowly (...)
    28 citations
  37. Cultural evolution in Vietnam’s early 20th century: a Bayesian networks analysis of Hanoi Franco-Chinese house designs.Quan-Hoang Vuong, Quang-Khiem Bui, Viet-Phuong La, Thu-Trang Vuong, Manh-Toan Ho, Hong-Kong T. Nguyen, Hong-Ngoc Nguyen, Kien-Cuong P. Nghiem & Manh-Tung Ho - 2019 - Social Sciences and Humanities Open 1 (1):100001.
    The study of cultural evolution has taken on an increasingly interdisciplinary and diverse approach in explicating phenomena of cultural transmission and adoptions. Inspired by this computational movement, this study uses Bayesian networks analysis, combining both the frequentist and the Hamiltonian Markov chain Monte Carlo (MCMC) approach, to investigate the highly representative elements in the cultural evolution of a Vietnamese city’s architecture in the early 20th century. With a focus on the façade design of 68 old houses in (...)
    10 citations
  38. The Big Data razor.Ezequiel López-Rubio - 2020 - European Journal for Philosophy of Science 10 (2):1-20.
    Classic conceptions of model simplicity for machine learning are mainly based on the analysis of the structure of the model. Bayesian, Frequentist, information theoretic and expressive power concepts are the best known of them, which are reviewed in this work, along with their underlying assumptions and weaknesses. These approaches were developed before the advent of the Big Data deluge, which has overturned the importance of structural simplicity. The computational simplicity concept is presented, and it is argued that it is (...)
    1 citation
  39. The Paradox of Deterministic Probabilities.Valia Allori - 2022 - Inquiry: An Interdisciplinary Journal of Philosophy.
    This paper aims to investigate the so-called paradox of deterministic probabilities: in a deterministic world, all probabilities should be subjective; however, they also seem to play important explanatory and predictive roles which suggest they are objective. The problem is then to understand what these deterministic probabilities are. Recent proposed solutions of this paradox are the Mentaculus vision, the range account of probability, and a version of frequentism based on typicality. All these approaches aim at defining deterministic objective probabilities as to (...)
  40. The ethics of randomised controlled trials: A matter of statistical belief? [REVIEW] Jane L. Hutton - 1996 - Health Care Analysis 4 (2):95-102.
    This paper outlines the approaches of two apparently competing schools of statistics. The criticisms made by supporters of Bayesian statistics about conventional Frequentist statistics are explained, and the Bayesian claim that their method enables research into new treatments without the need for clinical trials is examined in detail. Several further important issues are considered, including: the use of historical controls and data routinely collected on patients; balance in randomised trials; the possibility of giving information to patients; patient choice and (...)
    2 citations
  41. The base rate fallacy reconsidered: Descriptive, normative, and methodological challenges.Jonathan J. Koehler - 1996 - Behavioral and Brain Sciences 19 (1):1-17.
    We have been oversold on the base rate fallacy in probabilistic judgment from an empirical, normative, and methodological standpoint. At the empirical level, a thorough examination of the base rate literature (including the famous lawyer–engineer problem) does not support the conventional wisdom that people routinely ignore base rates. Quite the contrary, the literature shows that base rates are almost always used and that their degree of use depends on task structure and representation. Specifically, base rates play a relatively larger role (...)
    84 citations
  42. The curve fitting problem: A bayesian rejoinder.Prasanta S. Bandyopadhyay & Robert J. Boik - 1999 - Philosophy of Science 66 (3):402.
    In the curve fitting problem two conflicting desiderata, simplicity and goodness-of-fit, pull in opposite directions. To solve this problem, two proposals, the first one based on Bayes's theorem criterion (BTC) and the second one advocated by Forster and Sober based on Akaike's Information Criterion (AIC) are discussed. We show that AIC, which is frequentist in spirit, is logically equivalent to BTC, provided that a suitable choice of priors is made. We evaluate the charges against Bayesianism and contend that AIC (...)
    11 citations
  43. Ethics and statistical methodology in clinical trials.C. R. Palmer - 1993 - Journal of Medical Ethics 19 (4):219-222.
    Statisticians in medicine can disagree on appropriate methodology applicable to the design and analysis of clinical trials. So-called Bayesians and frequentists both claim ethical superiority. This paper, by defining and then linking together various dichotomies, argues there is a place for both statistical camps. The choice between them depends on the phase of clinical trial, disease prevalence and severity, but supremely on the ethics underlying the particular trial. There is always a tension present between physicians primarily obligated to their (...)
    4 citations
  44. Is There a Free Lunch in Inference?Jeffrey N. Rouder, Richard D. Morey, Josine Verhagen, Jordan M. Province & Eric-Jan Wagenmakers - 2016 - Topics in Cognitive Science 8 (3):520-547.
    The field of psychology, including cognitive science, is vexed by a crisis of confidence. Although the causes and solutions are varied, we focus here on a common logical problem in inference. The default mode of inference is significance testing, which has a free lunch property where researchers need not make detailed assumptions about the alternative to test the null hypothesis. We present the argument that there is no free lunch; that is, valid testing requires that researchers test the null against (...)
    6 citations
  45. The quantitative-qualitative distinction and the Null hypothesis significance testing procedure.Nimal Ratnesar & Jim Mackenzie - 2006 - Journal of Philosophy of Education 40 (4):501–509.
    Conventional discussions of research methodology contrast two approaches, the quantitative and the qualitative, presented as collectively exhaustive. But if the qualitative is taken as the understanding of lifeworlds, the two approaches between them cover only a tiny fraction of research methodologies; and the quantitative, taken as the routine application of frequentist statistics to controlled experiments by way of the Null Hypothesis Significance Testing Procedure, is seriously flawed. It is contrary to the advice both of Fisher and of Neyman and Pearson, (...)
    3 citations
  47.
    The role of pragmatic rules in the conjunction fallacy.Giuseppe Mosconi & Laura Macchi - 2001 - Mind and Society 2 (1):31-57.
    We here report the findings of our investigation into the validity of the conjunction fallacy (Tversky & Kahneman, 1983), bearing in mind the role of conversational rules. Our first experiment showed that subjects found a logically correct answer unacceptable when it implied a violation of the conversational rules. We argue that tautological questions, such as those which concern the relationship of inclusion between a class and its sub-class, violate conversational rules because they are not informative. In this sense, it is (...)
    5 citations
  48.
    Reichenbach’s Transcendental Probability.Fedde Benedictus & Dennis Dieks - 2015 - Erkenntnis 80 (1):15-38.
    The aim of this article is twofold. First, we shall review and analyse the neo-Kantian justification for the application of probabilistic concepts in science that was defended by Hans Reichenbach early in his career, notably in his dissertation of 1916. At first sight this Kantian approach seems to contrast sharply with Reichenbach’s later logical-positivist, frequentist viewpoint. But, and this is our second goal, we shall attempt to show that there is an underlying continuity in Reichenbach’s thought: typical (...)
    2 citations
  49.
    Mathematical statistics and metastatistical analysis.Andrés Rivadulla - 1991 - Erkenntnis 34 (2):211 - 236.
    This paper deals with meta-statistical questions concerning frequentist statistics. In Sections 2 to 4 I analyse the dispute between Fisher and Neyman on the so-called logic of statistical inference, a polemic concomitant with the development of mathematical statistics. My conclusion is that, whenever mathematical statistics makes it possible to draw inferences, it uses only deductive reasoning. Therefore I reject Fisher's inductive approach to statistical estimation theory and adhere to Neyman's deductive one. On the (...)
  50. Testing a precise null hypothesis: the case of Lindley’s paradox.Jan Sprenger - 2013 - Philosophy of Science 80 (5):733-744.
    The interpretation of tests of a point null hypothesis against an unspecified alternative is a classical and yet unresolved issue in statistical methodology. This paper approaches the problem from the perspective of Lindley's Paradox: the divergence of Bayesian and frequentist inference in hypothesis tests with large sample size. I contend that the standard approaches in both frameworks fail to resolve the paradox. As an alternative, I suggest the Bayesian Reference Criterion: it targets the predictive performance of the null hypothesis (...)
    10 citations
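Lindley's paradox, as described in the abstract above, can be reproduced numerically. The setup below is a standard textbook version, not the paper's own analysis: normally distributed data with known variance, a point null H0: theta = 0, and a normal prior theta ~ N(0, tau2) under the alternative, where the prior scale tau2 = 1 is an illustrative assumption. At a fixed test statistic z = 1.96, which a frequentist test reports as p ≈ 0.05 regardless of sample size, the Bayes factor in favour of the null grows without bound as n increases:

```python
import math

def bf01(z, n, tau2=1.0, sigma2=1.0):
    """Bayes factor for H0: theta = 0 vs H1: theta ~ N(0, tau2),
    given the usual statistic z = xbar / sqrt(sigma2 / n)."""
    s2 = sigma2 / n                       # sampling variance of the mean
    xbar = z * math.sqrt(s2)              # observed mean implied by z

    def normal_pdf(x, var):
        return math.exp(-x * x / (2 * var)) / math.sqrt(2 * math.pi * var)

    # marginal density of xbar: N(0, s2) under H0, N(0, s2 + tau2) under H1
    return normal_pdf(xbar, s2) / normal_pdf(xbar, s2 + tau2)

# Hold z = 1.96 (two-sided p ≈ 0.05) fixed and let the sample size grow:
for n in (10, 100, 10_000, 1_000_000):
    print(n, round(bf01(1.96, n), 1))
```

For small n the Bayes factor slightly favours the alternative, but for large n the same "significant" z-value yields overwhelming evidence for the null: exactly the frequentist-Bayesian divergence the paper takes as its starting point.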
1 — 50 / 991