First, I argue that scientific progress is possible in the absence of increasing verisimilitude in science’s theories. Second, I argue that increasing theoretical verisimilitude is not the central, or primary, dimension of scientific progress. Third, I defend my previous argument that unjustified changes in scientific belief may be progressive. Fourth, I illustrate how false beliefs can promote scientific progress in ways that cannot be explicated by appeal to verisimilitude.
Popper’s Critical Rationalism presents Popper’s views on science, knowledge, and inquiry, and examines the significance and tenability of these in light of recent developments in philosophy of science, philosophy of probability, and epistemology. It develops a novel philosophical position on science, which employs key insights from Popper while rejecting other elements of his philosophy.
First, I answer the controversial question ‘What is scientific realism?’ with extensive reference to the varied accounts of the position in the literature. Second, I provide an overview of the key developments in the debate concerning scientific realism over the past decade. Third, I provide a summary of the other contributions to this special issue.
This paper challenges a recent argument of Bird’s, which involves imagining that René Blondlot’s belief in N-rays was true, in favour of the view that scientific progress should be understood in terms of knowledge rather than truth. By considering several variants of Bird’s thought-experiment, it shows that the semantic account of progress cannot be so easily vanquished. A key possibility is that justification is only instrumental in, and not partly constitutive of, progress.
Popper repeatedly emphasised the significance of a critical attitude, and a related critical method, for scientists. Kuhn, however, thought that unquestioning adherence to the theories of the day is proper; at least for ‘normal scientists’. In short, the former thought that dominant theories should be attacked, whereas the latter thought that they should be developed and defended (for the vast majority of the time). Both seem to have missed a trick, however, due to their apparent insistence that each individual scientist should fulfil similar functions (at any given point in time). The trick is to consider science at the group level; and doing so shows how puzzle solving and ‘offensive’ critical activity can simultaneously have a legitimate place in science. This analysis shifts the focus of the debate. The crucial question becomes ‘How should the balance between functions be struck?’.
Stanford’s argument against scientific realism focuses on theories, just as many earlier arguments from inconceivability have. However, there are possible arguments against scientific realism involving unconceived (or inconceivable) entities of different types: observations, models, predictions, explanations, methods, instruments, experiments, and values. This paper charts such arguments. In combination, they present the strongest challenge yet to scientific realism.
This paper challenges Bird’s view that scientific progress should be understood in terms of knowledge, by arguing that unjustified scientific beliefs (and/or changes in belief) may nevertheless be progressive. It also argues that false beliefs may promote progress.
Roughly, instrumentalism is the view that science is primarily, and should primarily be, an instrument for furthering our practical ends. It has fallen out of favour because historically influential variants of the view, such as logical positivism, suffered from serious defects. In this book, however, Darrell P. Rowbottom develops a new form of instrumentalism, which is more sophisticated and resilient than its predecessors. This position—‘cognitive instrumentalism’—involves three core theses. First, science makes theoretical progress primarily when it furnishes us with more predictive power or understanding concerning observable things. Second, scientific discourse concerning unobservable things should only be taken literally in so far as it involves observable properties or analogies with observable things. Third, scientific claims about unobservable things are probably neither approximately true nor liable to change in such a way as to increase in truthlikeness. There are examples from science throughout the book, and Rowbottom demonstrates at length how cognitive instrumentalism fits with the development of late nineteenth- and early twentieth-century chemistry and physics, and especially atomic theory. Drawing upon this history, Rowbottom also argues that there is a kind of understanding, empirical understanding, which we can achieve without having true, or even approximately true, representations of unobservable things. In closing the book, he sets forth his view on how the distinction between the observable and unobservable may be drawn, and compares cognitive instrumentalism with key contemporary alternatives such as structural realism, constructive empiricism, and semirealism. Overall, this book offers a strong defence of instrumentalism that will be of interest to scholars and students working on the debate about realism in philosophy of science.
This paper argues that talk of ‘the aim of science’ should be avoided in the philosophy of science, with special reference to the way that van Fraassen sets up the difference between scientific realism and constructive empiricism. It also argues that talking instead of ‘what counts as success in science as such’ is unsatisfactory. The paper concludes by showing what this talk may be profitably replaced with, namely specific claims concerning science that fall into the following categories: descriptive, evaluative, normative, and definitional. There are two key advantages to this proposal. First, realism and its competitors may be understood to consist of highly nuanced variants. Second, scientific realism and its competitors may be understood as something other than ‘all or nothing’ theses about science. More particularly, one may accept that there are general claims concerning science in some of the identified categories, but deny that there are such claims in the others.
This paper compares and contrasts the concept of a stance with that of a paradigm qua disciplinary matrix, in an attempt to illuminate both notions. First, it considers to what extent it is appropriate to draw an analogy between stances and disciplinary matrices. It suggests that despite first appearances, a disciplinary matrix is not simply a stance writ large. Second, it examines how we might reinterpret disciplinary matrices in terms of stances, and shows how doing so can provide us with a better insight into non-revolutionary science. Finally, it identifies two directions for future research: “Can the rationality of scientific revolutions be understood in terms of the dynamic between stances and paradigms?” and “Do stances help us to understand incommensurability between disciplinary matrices?”.
When a doctor tells you there’s a one percent chance that an operation will result in your death, or a scientist claims that his theory is probably true, what exactly does that mean? Understanding probability is clearly very important, if we are to make good theoretical and practical choices. In this engaging and highly accessible introduction to the philosophy of probability, Darrell Rowbottom takes the reader on a journey through all the major interpretations of probability, with reference to real-world situations. In lucid prose, he explores the many fallacies of probabilistic reasoning, such as the ‘gambler’s fallacy’ and the ‘inverse fallacy’, and shows how we can avoid falling into these traps by using the interpretations presented. He also illustrates the relevance of the interpretation of probability across disciplinary boundaries, by examining which interpretations of probability are appropriate in diverse areas such as quantum mechanics, game theory, and genetics. Using entertaining dialogues to draw out the key issues at stake, this unique book will appeal to students and scholars across philosophy, the social sciences, and the natural sciences.
We have three goals in this paper. First, we outline an ontology of stance, and explain the role that modes of engagement and styles of reasoning play in the characterization of a stance. Second, we argue that we do enjoy a degree of control over the modes of engagement and styles of reasoning we adopt. Third, we contend that maximizing one’s prospects for change also maximizes one’s rationality.
When do we agree? The answer might once have seemed simple and obvious; we agree that p when we each believe that p. But from a formal epistemological perspective, where degrees of belief are more fundamental than beliefs, this answer is unsatisfactory. On the one hand, there is reason to suppose that it is false; degrees of belief about p might differ when beliefs simpliciter on p do not. On the other hand, even if it is true, it is too vague; for what it is to believe simpliciter ought to be explained in terms of degrees of belief. This paper presents several possible notions of agreement, and corresponding notions of disagreement. It indicates how the findings are fruitful for the epistemology of disagreement, with special reference to the notion of epistemic peerhood.
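To illustrate the gap between the two notions: on a Lockean picture, belief simpliciter is credence at or above some threshold, so agents can agree in belief while disagreeing in credence. A minimal formulation, purely illustrative (the threshold $t$ and the labels below are not the paper’s own):

\[ \text{strict agreement on } p:\ cr_1(p) = cr_2(p); \qquad \text{doxastic agreement on } p:\ cr_1(p) \ge t \text{ and } cr_2(p) \ge t. \]

With $t = 0.9$, agents with $cr_1(p) = 0.91$ and $cr_2(p) = 0.99$ agree doxastically but not strictly.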
Both Popper and van Fraassen have used evolutionary analogies to defend their views on the aim of science, although these are diametrically opposed. By employing Price's equation in an illustrative capacity, this paper considers which view is better supported. It shows that even if our observations and experimental results are reliable, an evolutionary analogy fails to demonstrate why conjecture and refutation should result in: (1) the isolation of true theories; (2) successive generations of theories of increasing truth-likeness; (3) empirically adequate theories; or (4) successive generations of theories of increasing proximity to empirical adequacy. Furthermore, it illustrates that appeals to induction do not appear to help. It concludes that an evolutionary analogy is only sufficient to defend the notion that the aim of science is to isolate a particular class of false theories, namely those that are empirically inadequate.
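For reference, Price’s equation in its standard form (the textbook statement, not a formulation specific to the paper) decomposes the change in the population mean of a trait $z$ across a generation into a selection term and a transmission term:

\[ \Delta \bar{z} = \frac{\mathrm{Cov}(w_i, z_i)}{\bar{w}} + \frac{\mathrm{E}(w_i \, \Delta z_i)}{\bar{w}}, \]

where $w_i$ is the fitness of entity $i$ and $\bar{w}$ is mean fitness. Read analogically, with theories as the entities and survival of testing as fitness, the equation tracks whatever property covaries with fitness; nothing in it guarantees that this property is truth, truthlikeness, or proximity to empirical adequacy, rather than the mere avoidance of empirical inadequacy.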
Schwitzgebel (2001) — henceforth 'S' — offers three examples in order to convince us that there are situations in which individuals are neither accurately describable as believing that p nor as failing to so believe, but are rather in 'in-between states of belief'. He then argues that there are no 'Bayesian' or representational strategies for explicating these, and proposes a dispositional account. I do not have any fundamental objection to the idea that there might be 'in-between states of belief'. What I shall argue, rather, is that: (I) S does not provide a convincing argument that there really are such states; (II) S does not show, as he claims, that 'in-between states of belief' could not be accounted for in terms of degrees of belief; (III) S’s dispositional account of 'in-between states of belief' is more problematic than the 'degree of belief' alternative.
In a recent article in the British Journal for the Philosophy of Science, Heesen and Bright argue that prepublication peer review should be abolished and replaced with postpublication peer review (provided the matter is judged purely on epistemic grounds). In this article, I show that there are three problems with their argument. First, it fails to consider the epistemic cost of implementing the change to postpublication peer review. Second, it fails to consider some potential epistemic benefits of prepublication peer review, which involve avoiding bias. Third, it fails to consider some potential epistemic disadvantages of postpublication peer review, which stem from the greater number of papers that would be published under that system.
Van Fraassen has recently argued that empiricism can be construed as a stance, involving commitments, attitudes, values, and goals, in addition to beliefs and opinions. But this characterisation emerges from his recognition that to be an empiricist cannot be to believe, or decide to commit to belief in, a foundational proposition, without removing any basis for a non-dogmatic empiricist critique of other philosophical approaches, such as materialism. However, noticeable by its absence in van Fraassen's discussions is any mention of Bartley's ‘pancritical rationalism’, for Bartley offers a cohesive argument that genuine dogmatism lies precisely in the act of commitment to an idea. The consequence of denying this, he thinks, is an opening of the floodgates to irrationalism: if to rely on reasoned argument in decision-making is fundamentally an act of faith, then there is a tu quoque – “I simply have a different faith” – that may be employed by those who wish to shield their views from criticism. This raises the following question: why should it be any less dogmatic to adopt particular commitments, attitudes, values, and goals, rather than a particular belief or opinion, come what may? And if Bartley is right that there is only one non-dogmatic attitude – the critical attitude – then why might this not be adopted by an empiricist, a materialist, a metaphysician, or anyone else?
We argue that the inference from dispositional essentialism about a property (in the broadest sense) to the metaphysical necessity of laws involving it is invalid. Let strict dispositional essentialism be any view according to which any given property’s dispositional character is precisely the same across all possible worlds. Clearly, any version of strict dispositional essentialism rules out worlds with different laws involving that property. Permissive dispositional essentialism is committed to a property’s identity being tied to its dispositional profile or causal role, yet is compatible with moderate interworld variation in a property’s dispositional profile. We provide such a model of dispositional essentialism about a property and metaphysical contingency of the laws involving it.
This paper develops a new version of instrumentalism, in light of progress in the realism debate in recent decades, and thereby defends the view that instrumentalism remains a viable philosophical position on science. The key idea is that talk of unobservable objects should be taken literally only when those objects are assigned properties (or described in terms of analogies involving things) with which we are experientially (or otherwise) acquainted. This is derivative from the instrumentalist tradition in so far as the distinction between unobservable and observable is taken to have significance with respect to meaning.
How are we to understand the use of probability in corroboration functions? Popper says logically, but does not show we could have access to, or even calculate, probability values in a logical sense. This makes the logical interpretation untenable, as Ramsey and van Fraassen have argued. If corroboration functions only make sense when the probabilities employed therein are subjective, however, then what counts as impressive evidence for a theory might be a matter of convention, or even whim. So isn’t so-called ‘corroboration’ just a matter of psychology? In this paper, I argue that we can go some way towards addressing this objection by adopting an intersubjective interpretation, of the form advocated by Gillies, with respect to corroboration. I show why intersubjective probabilities are preferable to subjective ones when it comes to decision making in science: why group decisions are liable to be superior to individual ones, given a number of plausible conditions. I then argue that intersubjective corroboration is preferable to intersubjective confirmation of a Bayesian variety, because there is greater opportunity for principled agreement concerning the factors involved in the former.
This paper, which is based on recent empirical research at the University of Leeds, the University of Edinburgh, and the University of Bristol, presents two difficulties which arise when condensed matter physicists interact with molecular biologists: (1) the former use models which appear to be too coarse-grained, approximate and/or idealized to serve a useful scientific purpose to the latter; and (2) the latter have a rather narrower view of what counts as an experiment, particularly when it comes to computer simulations, than the former. It argues that these findings are related; that computer simulations are considered to be undeserving of experimental status, by molecular biologists, precisely because of the idealizations and approximations that they involve. The complexity of biological systems is a key factor. The paper concludes by critically examining whether the new research programme of ‘systems biology’ offers a genuine alternative to the modelling strategies used by physicists. It argues that it does not.
In Making Sense of Life, Keller emphasizes several differences between biology and physics. Her analysis focuses on significant ways in which modelling practices in some areas of biology, especially developmental biology, differ from those of the physical sciences. She suggests that natural models and modelling by homology play a central role in the former but not the latter. In this paper, I focus instead on those practices that are importantly similar, from the point of view of epistemology and cognitive science. I argue that concrete and abstract models are significant in both disciplines, that there are shared selection criteria for models in physics and biology, e.g. familiarity, and that modelling often occurs in a similar fashion.
It is a common view that the axioms of probability can be derived from the following assumptions: (1) probabilities reflect degrees of belief; (2) degrees of belief can be measured as betting quotients; and (3) a rational agent must select betting quotients that are coherent. In this paper, I argue that a consideration of reasonable betting behaviour, with respect to the alleged derivation of the first axiom of probability, suggests that (2) and (3) are incorrect. In particular, I show how a rational agent might assign a ‘probability’ of zero to an event which she is sure will occur.
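For context, the general Dutch book mechanism that such derivations rely on (a textbook sketch, not the paper’s own argument): an agent posts betting quotient $q$ on event $E$, and a bookie may then choose the stake $S$, of either sign; the agent’s net payoff is

\[ S(\mathbf{1}_E - q). \]

If, for example, $q < 0$, then $\mathbf{1}_E - q > 0$ however things turn out, so a bookie who picks $S < 0$ guarantees the agent a loss. Derivations of the axioms proceed case by case in this fashion, and they deliver constraints on probabilities only if betting quotients measure degrees of belief and rationality requires coherence.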
I argue that so-called 'background knowledge' in confirmation theory has little, if anything, to do with 'knowledge' in the sense of mainstream epistemology. I argue that it is better construed as 'background information', which need not be believed in, justified, or true.
People often act in ways that appear incompatible with their sincere assertions. But how might we explain such cases? On the shifting view, subjects’ degrees of belief may be highly sensitive to changes in context. This paper articulates and refines this view, after defending it against recent criticisms. It details two mechanisms by which degrees of belief may shift.
In his Bayesian Nets and Causality, Jon Williamson presents an ‘Objective Bayesian’ interpretation of probability, which he endeavours to distance from the logical interpretation yet associate with the subjective interpretation. In doing so, he suggests that the logical interpretation suffers from severe epistemological problems that do not affect his alternative. In this paper, I present a challenge to his analysis. First, I closely examine the relationship between the logical and ‘Objective Bayesian’ views, and show how, and why, they are highly similar. Second, I argue that the logical interpretation is not manifestly inferior, at least for the reasons that Williamson offers. I suggest that the key difference between the logical and ‘Objective Bayesian’ views is in the domain of the philosophy of logic; and that the genuine disagreement appears to be over Platonism versus nominalism.
In this article, I present some new group level interpretations of probability, and champion one in particular: a consensus-based variant where group degrees of belief are construed as agreed upon betting quotients rather than shared personal degrees of belief. One notable feature of the account is that it allows us to treat consensus between experts on some matter as being on the union of their relevant background information. In the course of the discussion, I also introduce a novel distinction between intersubjective and interobjective interpretations of probability.
This article shows that Popper’s measure of corroboration is inapplicable if, as Popper argued, the logical probability of synthetic universal statements is zero relative to any evidence that we might possess. It goes on to show that Popper’s definition of degree of testability, in terms of degree of logical content, suffers from a similar problem.
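To see where $p(h|b) = 0$ bites, consider one standard formulation of Popper’s measure (from the appendices to The Logic of Scientific Discovery, relativized to background $b$; Popper gave several variants):

\[ C(h, e, b) = \frac{p(e \mid hb) - p(e \mid b)}{p(e \mid hb) - p(eh \mid b) + p(e \mid b)}. \]

On the ratio definition, $p(e \mid hb) = p(eh \mid b)/p(h \mid b)$; so if $p(h \mid b) = 0$ for a synthetic universal $h$, the conditional probability, and with it $C$, is undefined.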
This paper investigates whether there is a discrepancy between stated and actual aims in biomechanical research, particularly with respect to hypothesis testing. We present an analysis of one hundred papers recently published in The Journal of Experimental Biology and the Journal of Biomechanics, and examine the prevalence of papers which have hypothesis testing as a stated aim, contain hypothesis testing claims that appear to be purely presentational, and have exploration as a stated aim. We found that whereas no papers had exploration as a stated aim, 58 per cent of papers had hypothesis testing as a stated aim. We had strong suspicions, at the bare minimum, that presentational hypotheses were present in 31 per cent of the papers in this latter group.
This paper presents a new 'discontinuous' view of Popper's theory of corroboration, where theories cease to have corroboration values when new severe tests are devised which have not yet been performed, on the basis of a passage from The Logic of Scientific Discovery. Through subsequent analysis and discussion, a novel problem for Popper's account of corroboration, which holds also for the standard view, emerges. This is the problem of the Big Test: that the severest test of any hypothesis is actually to perform all possible tests. But this means that Popper's demand for 'the severest tests' amounts simply to a demand for 'all possible tests'. The paper closes by considering how this bears on accommodation vs. prediction, with respect to corroboration.
This paper shows that Bertrand's proposed 'solutions' to his own question, which generates his chord paradox, are inapplicable. It uses a simple analogy with cake cutting. The problem is that none of Bertrand's solutions considers all possible cuts. This is no solace for the defenders of the principle of indifference, however, because it emerges that the paradox is harder to solve than previously anticipated.
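For readers new to the paradox: the three classical ‘solutions’ each sample chords of a unit circle in a different way, and each yields a different probability that a chord is longer than the side ($\sqrt{3}$) of the inscribed equilateral triangle, roughly 1/3, 1/2, and 1/4 respectively. A minimal simulation sketch in Python (illustrative only; it does not reproduce the paper’s cake-cutting analogy):

    import numpy as np

    rng = np.random.default_rng(0)
    N = 1_000_000
    THRESHOLD = np.sqrt(3)  # side of the equilateral triangle inscribed in the unit circle

    # Method 1: random endpoints -- two angles chosen uniformly on the circle.
    a, b = rng.uniform(0, 2 * np.pi, (2, N))
    len1 = 2 * np.abs(np.sin((a - b) / 2))

    # Method 2: random radial point -- the chord's distance from the centre is
    # uniform on [0, 1], the chord lying perpendicular to a radius at that distance.
    d = rng.uniform(0, 1, N)
    len2 = 2 * np.sqrt(1 - d ** 2)

    # Method 3: random midpoint -- the chord's midpoint is uniform over the disc.
    r = np.sqrt(rng.uniform(0, 1, N))  # radial distance with density 2r
    len3 = 2 * np.sqrt(1 - r ** 2)

    for i, lengths in enumerate((len1, len2, len3), start=1):
        print(f"Method {i}: P(chord > side) is approximately {np.mean(lengths > THRESHOLD):.3f}")

Each method is internally consistent; the trouble is that ‘a chord chosen at random’ does not fix which sampling procedure is meant, and, as the paper argues, none of these procedures considers all possible cuts.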
We offer an overview of some ways of examining the connections between stance and rationality, by surveying recent work on four central topics: the very idea of a stance, the relations between stances and voluntarism, the metaphysics and epistemology that emerge once stances are brought to center stage, and the role that emotions and phenomenology play in the empirical stance.
An important suggestion of objective Bayesians is that the maximum entropy principle can replace a principle which is known to get into paradoxical difficulties: the principle of indifference. No one has previously determined whether the maximum entropy principle is better able to solve Bertrand’s chord paradox than the principle of indifference. In this paper I show that it is not. Additionally, the course of the analysis brings to light a new paradox, a revenge paradox of the chords, that is unique to the maximum entropy principle.
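The background difficulty is easy to state (this is the standard observation; the revenge paradox is a further result not reproduced here). With no constraints beyond the support, maximizing the differential entropy

\[ h[p] = -\int p(x) \ln p(x) \, dx \]

over densities on $[a, b]$ returns the uniform density $p(x) = 1/(b - a)$. But ‘uniform’ is relative to a parametrization: a distribution uniform over endpoint angle, over radial distance, or over midpoint position induces a different distribution over chord length in each case. So the maximum entropy principle confronts the same choice that defeats the principle of indifference.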
A major problem posed by cases of self-deception concerns the inconsistent behavior of the self-deceived subject (SDS). How can this be accounted for, in terms of propositional attitudes and other mental states? In this paper, we argue that key problems with two recent putative solutions, due to Mele and Archer, are avoided by “the shifting view” that has been advanced elsewhere in order to explain cases where professed beliefs conflict with actions. We show that self-deceived agents may possess highly unstable degrees of belief concerning the matters about which they are self-deceived.
This paper is a supplement to, and provides a proof of principle of, ‘Kuhn vs. Popper on Criticism and Dogmatism in Science: A Resolution at the Group Level’. It illustrates how calculations may be performed in order to determine how the balance between different functions in science—such as imaginative, critical, and dogmatic—should be struck, with respect to confirmation (or corroboration) functions and rules of scientific method.
This paper argues that Duhem’s thesis does not decisively refute a corroboration-based account of scientific methodology (or ‘falsificationism’), but instead that auxiliary hypotheses are themselves subject to measurements of corroboration, which can be used to inform practice. It argues that, in this respect, a corroboration-based account is the equal of the popular Bayesian alternative, which has received far more attention in recent years.
This chapter presents and criticizes the two dominant accounts of thought experiments in science, due to James Robert Brown and John Norton; the mechanical thought experiment of Simon Stevin is used as an exemplar. The chapter argues that scientific thought experiments are strongly analogous to their ‘real’, actual physical, counterparts. In each kind of experiment, theoretical context affects which arguments are generated and/or thought to be sustainable on the basis of the states of affairs involved. The difference is whether the states of affairs are hypothetical and/or counterfactual rather than actual. This view is consistent with empiricism concerning scientific thought experiments. On such empiricism, the arguments that it is possible to pump from thought experiments have premises grounded in experience, rather than in an additional faculty.
Whether educational research should employ the ‘scientific method’ has been a recurring issue in its history. Hence, textbooks on research methods continue to perpetuate the idea that research students ought to choose between competing camps: ‘positivist’ or ‘interpretivist’. In reference to one of the most widely referred to educational research methods textbooks on the market—namely Research Methods in Education by Cohen, Manion, and Morrison—this paper demonstrates the misconception of science in operation and the perversely false dichotomy that has become enshrined in educational research. It then advocates a new approach, and suggests that the fixation with ‘science’ versus ‘non-science’ is counterproductive, when what is actually required for good inquiry is a critical approach to knowledge claims.
This paper provides a rationale for advocating pancritical rationalism. First, it argues that the advocate of critical rationalism may accept (but not be internally justified in accepting) that there is ‘justification’ in an externalist sense, specifically that certain procedures can track truth, and suggest that this recognition should inform practice; that one should try to determine which sources and methods are appropriate for various aspects of inquiry, and to what extent they are. Second, it argues that if there is external justification, then a critical rationalist is better off than a dogmatist from an evolutionary perspective.