This paper asks (a) how new scientific objects of research are conceptualized at a point in time when little is known about them, and (b) how those conceptualizations, in turn, figure in the process of investigating the phenomena in question. Contrasting my approach with existing notions of concepts and situating it in relation to existing discussions about the epistemology of experimentation, I propose to think of concepts as research tools. I elaborate on the conception of a tool that informs my account. Narrowing my focus to phenomena in cognitive neuropsychology, I then illustrate my thesis with the example of the concept of implicit memory. This account is based on an original reconstruction of the nature and function of operationism in psychology.
This paper provides an interpretation of Hans-Jörg Rheinberger’s notions of epistemic things and historical epistemology. I argue that Rheinberger’s approach articulates a unique contribution to current debates about integrated HPS, and I propose some modifications and extensions of this contribution. Drawing on examples from memory research, I show that Rheinberger is right to highlight a particular feature of many objects of empirical research (“epistemic things”)—especially in the contexts of exploratory experimentation—namely our lack of knowledge about them. I argue that this analysis needs to be supplemented with an account of what scientists do know, and in particular, how they are able to attribute rudimentary empirical contours to objects of research. These contours are closely connected to paradigmatic research designs, which in turn are tied to basic methodological rules for the exploration of the purported phenomena. I suggest that we engage with such rules in order to develop our own normative (epistemological) categories, and I tie this proposal to the idea of a methodological naturalism in philosophy of science.
Several major breakthroughs in the history of physics have been prompted not by new empirical data but by thought experiments. James Robert Brown and John Norton have developed accounts of how thought experiments can yield such advances. Brown argues that knowledge gained via thought experiments demands a Platonic explanation; thought experiments for Brown are a window into the Platonic realm of the laws of nature. Norton argues that thought experiments are just cleverly disguised inductive or deductive arguments, so no new account of their epistemology is needed. In this paper, I argue that although we do not need to invoke any Platonic insight to explain thought experimentation, Norton’s eliminativist account fails to capture the unique epistemological importance of thought experiments qua thought experiments. I then present my own account, according to which thought experiments are a particular type of inductive inference that is uniquely suited to generate new breakthroughs.
Philosophers of science turned to historical case studies in part in response to Thomas Kuhn's insistence that such studies can transform the philosophy of science. In this issue Joseph Pitt argues that the power of case studies to instruct us about scientific methodology and epistemology depends on prior philosophical commitments, without which case studies are not philosophically useful. Here I reply to Pitt, demonstrating that case studies, properly deployed, illustrate styles of scientific work and modes of argumentation that are not well handled by currently standard philosophical analyses. I illustrate these claims with exemplary findings from case studies dealing with exploratory experimentation and with interdisciplinary cooperation across sciences to yield multiple independent means of access to theoretical entities. The latter cases provide examples of ways that scientists support claims about theoretical entities that are not available in work performed within a single discipline. They also illustrate means of correcting systematic biases that stem from the commitments of each discipline taken separately. These findings illustrate the transformative power of case study methods, allow us to escape from the horns of Pitt's "dilemma of case studies," and vindicate some of the post-Kuhn uses to which case studies have been put.
In his new book, Knowledge: The Philosophical Quest in History, Steve Fuller returns to core themes of his program of social epistemology that he first outlined in his 1988 book, Social Epistemology. He develops a new, unorthodox theology and philosophy building upon his testimony in Kitzmiller v. Dover Area School District in defense of intelligent design, leading to a call for maximal human experimentation. Beginning from the theological premise rooted in the Abrahamic religious tradition that we are created in the image of God, Fuller argues that the spark of the divine within us distinguishes us from animals. I argue that Fuller’s recent work takes us away from key insights of his original work. In contrast, I advocate for a program of social epistemology rooted in evolutionary science rather than intelligent design, emphasize a precautionary and ecological approach rather than a proactionary approach that favors risky human experimentation, and attend to our material and sociological embeddedness rather than a transhumanist repudiation of the body.
The importance given by historian and philosopher of science Georges Canguilhem to the role of practice, techniques, and experimentation in concept-formation was largely overlooked by commentators. After placing Canguilhem’s contributions within the larger history of historical epistemology in France, and clarifying his views regarding this expression, I re-evaluate the relation between concepts and experimental practices in Canguilhem’s philosophy of science. Drawing on his early writings on the relations between science and technology in the 1930s, on the Essai sur quelques problèmes concernant le normal et le pathologique, and on La formation du concept de réflexe aux XVIIe et XVIIIe siècles, I argue that the formation and rectification of concepts in Canguilhem’s sense are intrinsically bound with the experimental, material, technical, and cultural contexts in which concepts are operationalized.
Prof. Miščević has long been an ardent defender of the use of thought experiments in philosophy, foremost metaphysics, epistemology, and philosophy of mind. Recently he has, in his typically sophisticated manner, extended his general account of philosophical thought-experimenting to the domain of normative politics. Not only can the history of political philosophy be better understood and appreciated, according to Miščević, when seen as a more or less continuous, yet covert, practice of thought-experimenting, but the very progress of the discipline may crucially depend on finding the right balance between the constraints of (biological, psychological, economic, political, and so on) reality and political-moral ideals when we set out to design our basic political notions and institutions. I have much less confidence in this project than Prof. Miščević does. As a subspecies of moral thought experiments, political thought experiments share all of their problems and exhibit some of their own. In the paper, I present and discuss two types of evidence that threaten to undermine political philosophers’ trust in thought experiments and the ethical/political intuitions elicited by them: (i) the dismal past record of thought-experimentation in moral and political philosophy; and (ii) the variety, prevalence, and stubbornness of bias in ordinary social/political judgment.
The paper compares John Dewey's pragmatism and cultural-historical activity theory as epistemologies and theories of transformative material activity. For both theories, the concept of activity, the prototype of which is work, constitutes a basis for understanding the nature of knowledge and reality. This concept also implies for both theories a methodological approach to studying human behavior in which social experimentation and intervention play a central role. They also suggest that reflection and thought, mediated by language and semiotic artifacts, serve the reorientation of activity and are vital in the development of new, alternative ways of action. That is why Deweyan pragmatism and activity theory supply means of understanding organizational behavior and change in human activities better than concepts of practice based on rule-following, routines, or embodied skills.
This paper argues that whereas philosophical discussions of first-person methods often turn on the veridicality of first-person reports, more attention should be paid to the experimental circumstances under which the reports are generated, and to the purposes of designing such experiments. After pointing to the ‘constructedness’ of first-person reports in the science of perception, I raise questions about the criteria by which to judge whether the reports illuminate something about the nature of perception. I illustrate this point with a historical debate between Gestalt psychologists and atomists, both of whom used first-person methods to investigate perception.
Why should anybody care about theoretical simplicity? It is pretty clear that simpler theories don't stand a better chance of being true, just because they are simpler than their competitors. Of course, simpler theories are easier to use in technological applications, and they are more tractable. But that is something engineers should be concerned about. Why should the theoretical scientist be interested in simple theories? The principal virtue of simple theories is their facilitation of scientific understanding in virtue of their greater explanatory power. Simple theories are more unified, and they allow important kinds of reasoning about the world. If a theory yields a unified but structure-rich picture of the world, and thereby a high degree of understanding, we can design relevant experiments, form rational expectations, and in general are in a better position to gather relevant data than when we confront the world without any understanding whatsoever. Simple theories are therefore, in virtue of increasing our understanding, epistemically advantageous. That's why the theoretical scientist should be interested in simple theories. Of course, since the choice of simple theories does not guarantee getting closer to the truth, the claim that such a choice is epistemically advantageous presupposes that we draw a distinction between the explanatory power of theories and their accuracy. This distinction has not received sufficient attention in the existing literature, and that's why it was so difficult to say exactly what the virtue of simple theories is. Recognizing that explanatory power and accuracy are orthogonal aspects of scientific theories allows us to assign simplicity the role of facilitating understanding and thereby guiding controlled experimentation.
This paper reviews current debates in social epistemology about the relations between knowledge and consensus. These relations are philosophically interesting on their own, but also have practical consequences, as consensus takes an increasingly significant role in informing public decision making. The paper addresses the following questions. When is a consensus attributable to an epistemic community? Under what conditions may we legitimately infer that a consensual view is knowledge-based or otherwise epistemically justified? Should consensus be the aim of scientific inquiry, and if so, what kind of consensus? How should dissent be handled? It is argued that legitimately inferring that a theory is correct from the fact that there is a scientific consensus on it requires taking into consideration both cognitive properties of the theory and social properties of the consensus. The last section of the paper reviews computational models of consensus formation.
In "Truth and Method" Hans-Georg Gadamer revealed hermeneutics as one of the foundational epistemological elements of history, in contrast to scientific method, which, with empiricism, constitutes the natural sciences’ epistemology. This important step solved a number of long-standing arguments over the ontology of history, which had become increasingly bitter in the twentieth century. But perhaps Gadamer’s most important contribution was that he annulled history’s supposed inferiority to the natural sciences by showing that the knowledge it offers, though different in nature from that of science, is of equal import. By showing history’s complete independence from the natural sciences, the former was furnished with a new-found importance and thrust onto an equal footing with the latter—even in a distinctly scientific age such as ours. This essay intends to show that the idea of history’s discrete ontology from science was prefigured almost a century earlier by Benedetto Croce. Croce and Gadamer show compelling points of contact in their philosophies, notwithstanding that they did not confer equal consequence on what may be identified as Gadamer’s principal substantiation of history’s epistemology—hermeneutics. Of course this essay does not aspire to be exhaustive: the thought of both philosophers is far too dense. Nevertheless, the main points of contact shall be outlined, and, though concise, this essay seeks to point out the striking similarities of these two cardinal philosophers of history.
The default view in the epistemology of forgetting is that human memory would be epistemically better if we were not so susceptible to forgetting—that forgetting is in general a cognitive vice. In this paper, I argue for the opposed view: normal human forgetting—the pattern of forgetting characteristic of cognitively normal adult human beings—approximates a virtue located at the mean between the opposed cognitive vices of forgetting too much and remembering too much. I argue, first, that, for any finite cognizer, a certain pattern of forgetting is necessary if her memory is to perform its function well. I argue, second, that, by eliminating clutter from her memory store, this pattern of forgetting improves the overall shape of the subject’s total doxastic state. I conclude by reviewing work in psychology which suggests that normal human forgetting approximates this virtuous pattern of forgetting.
The question of rationality and of its role in human agency has been at the core of pragmatist concerns since the beginning of this movement. While Peirce framed the horizon of a new understanding of human reason through the idea of inquiry as aiming at belief-fixation and James stressed the individualistic drives that move individuals to action, it is in Dewey’s writing that we find the deepest understanding of the naturalistic and normative traits of rationality considered as the qualifying attribute of human agency. Recent developments in moral and political philosophy as well as in general pragmatist scholarship have shown a renewal of interest in the role of human reason in agency, both with respect to control of conduct (decisions about how to act) and with respect to normative attitudes (considerations of what is good and right). In this article I will examine some features of Dewey’s epistemology which are particularly promising for the elaboration of a theory of practical rationality based on pragmatist sources. In particular, I will focus on Dewey’s notion of “judgment of practice” in order to frame a distinctively Deweyan approach to practical rationality. In order to point out the specificity of Dewey’s epistemological framework, I will refer to it as an “epistemology of practice”. The aim of this article is to clarify the epistemological meaning of the concepts of articulation and transformation that Dewey places at the heart of his theory of inquiry. Part of my argument consists in showing that through these notions Dewey aimed at broadening the conception of rationality, bringing it beyond the reach of the standard notions of analysis and synthesis and of induction, deduction, and abduction. Once the specificity of Dewey’s conception of rationality has been demonstrated, I will proceed to show some of its implications for the explanation of the rationality of human agency with reference to practical reasoning and value assessment. I will then conclude the article by drawing some implications of Dewey’s theory of judgment for a broader epistemology based upon the acknowledgment of the primacy of practice.
Background: The experimental method to acquire knowledge about the efficacy and efficiency of medical procedures is well established in evidence-based medicine. A method to attain evidence about the significance of diseases and interventions from the patients' perspectives, taking into account their right to self-determination about their lives and bodies, has however not been sufficiently characterized. Design: Identification of a method to acquire evidence about the clinical significance of disease and therapeutic options from the patients' perspectives. Arguments: Communication between patient and physician is analyzed as the method to attain evidence about what is at stake for individual patients in disease and therapy. It is the method that enables physicians to directly take into account patients' disease experiences and their aims regarding treatments. These patients' perspectives in turn determine the clinical significance of diagnoses and therapeutic options, if patient autonomy is taken seriously. Conclusions: A full account of evidence-based medicine needs to include experimentation and communication between physician and patient as equally important methods to attain the evidence necessary to practice patient-oriented medicine. The communicative method is especially important for primary physicians as they direct patients within the medical system to have their medical problems most effectively and efficiently addressed.
An epistemology of art has seemed problematic mainly because of arguments claiming that an essential element of a theory of knowledge, truth, has no place in aesthetic contexts. For, if it is objectively true that something is beautiful, it seems to follow that the predicate “is beautiful” expresses a property – a view asserted by Plato but denied by Hume and Kant. But then, if the belief that something is beautiful is not objectively true, we cannot be said to know that something is beautiful and the path to an epistemology of art is effectively blocked. The article places the existence of aesthetic properties in the proper context; presents a logically correct argument for the existence of such properties; identifies strategies for responding to this argument; explains why objections by Hume, Kant, and several other philosophers fail; and sketches a realization account of beauty influenced by Hogarth.
Sometimes we learn through the use of imagination. The epistemology of imagination asks how this is possible. One barrier to progress on this question has been a lack of agreement on how to characterize imagination; for example, is imagination a mental state, ability, character trait, or cognitive process? This paper argues that we should characterize imagination as a cognitive ability, exercises of which are cognitive processes. Following dual process theories of cognition developed in cognitive science, the set of imaginative processes is then divided into two kinds: one that is unconscious, uncontrolled, and effortless, and another that is conscious, controlled, and effortful. This paper outlines the different epistemological strengths and weaknesses of the two kinds of imaginative process, and argues that a dual process model of imagination helpfully resolves or clarifies issues in the epistemology of imagination and the closely related epistemology of thought experiments.
Social epistemology has paid little attention to oral historiography as a source of expert insight into the credibility of testimony. One extant suggestion, however, is that oral historians treat testimony with a default trust reflecting a standing warrant for accepting testimony. The view that there is such a standing warrant is sometimes known as the Acceptance Principle for Testimony (APT). I argue that the practices of oral historians do not, all in all, count in support of APT. Experts have commonly described oral traditions as oriented towards political, cultural, and entertainment ends, and not only—or not even—towards an accurate depiction of past events. Even when accuracy is the emphasis, many historians of oral tradition do not trust such testimony as APT would suggest; the importance of gathering supporting evidence is a consistent emphasis. Yet oral historiography, both of traditions and more generally, does hold out lessons for the epistemology of testimony, implicating a wider range of social…
This paper is about the putative theoretical virtue of strength, as it might be used in abductive arguments to the correct logic in the epistemology of logic. It argues for three theses. The first is that the well-defined property of logical strength is neither a virtue nor a vice, so that logically weaker theories are not—all other things being equal—worse or better theories than logically stronger ones. The second thesis is that logical strength does not entail the looser characteristic of scientific strength, and the third is that many modern logics are on a par—or can be made to be on a par—with respect to scientific strength.
We present a family of counter-examples to David Christensen's Independence Criterion, which is central to the epistemology of disagreement. Roughly, independence requires that, when you assess whether to revise your credence in P upon discovering that someone disagrees with you, you shouldn't rely on the reasoning that led you to your initial credence in P. To do so would beg the question against your interlocutor. Our counter-examples involve questions where, in the course of your reasoning, you almost fall for an easy-to-miss trick. We argue that you can use the step in your reasoning where you caught the trick as evidence that someone of your general competence level likely fell for it. Our cases show that it's permissible to use your reasoning about disputed matters to disregard an interlocutor's disagreement, so long as that reasoning is embedded in the right sort of explanation of why she finds the disputed conclusion plausible, even though it's false.
Philip Kitcher has advanced an epistemology of science that purports to be naturalistic. For Kitcher, this entails that his epistemology of science must explain the correctness of belief-regulating norms while endorsing a realist notion of truth. This paper concerns whether or not Kitcher's epistemology of science is naturalistic on these terms. I find that it is not but that by supplementing the account we can secure its naturalistic standing.
If our experiences are cognitively penetrable, they can be influenced by our antecedent expectations, beliefs, or other cognitive states. Theorists such as Churchland, Fodor, Macpherson, and Siegel have debated whether and how our cognitive states might influence our perceptual experiences, as well as how any such influences might affect the ability of our experiences to justify our beliefs about the external world. This article surveys views about the nature of cognitive penetration, the epistemological consequences of denying cognitive penetration, and the epistemological consequences of affirming cognitive penetration.
Recent work on analyticity distinguishes two kinds, metaphysical and epistemic. This paper argues that the distinction allows for a new view in the philosophy of logic according to which the claims of logic are metaphysically analytic and have distinctive modal profiles, even though their epistemology is holist and in many ways rather Quinean. It is argued that such a view combines some of the more attractive aspects of the Carnapian and Quinean approaches to logic, whilst avoiding some famous problems.
Karl Popper's methodology highlights our scientific ignorance: hence the need to institutionalize open-mindedness through controlled experiments that may falsify our fallible theories about the world. In his endorsement of "piecemeal social engineering," Popper assumes that the social-democratic state and its citizens are capable of detecting social problems, and of assessing the results of policies aimed at solving them, through a process of experimentation analogous to that of natural science. But we are not only scientifically but politically ignorant: ignorant of the facts that underpin political debate, which are brought to our attention by theories that, as Max Weber emphasized, can be tested only through counterfactual thought experiments. Public-opinion and political-psychology research suggest that human beings are far too unaware, illogical, and doctrinaire to conduct the rigorous theorizing that would be necessary to make piecemeal social engineering work. F.A. Hayek realized that the public could not engage, specifically, in piecemeal economic regulation but failed to draw the conclusion that this was due to a specific type of political ignorance: ignorance of economic theory.
Peter Graham has recently given a dilemma purportedly showing the compatibility of libertarianism about free will and the anti-skeptical epistemology of testimony. In the first part of this paper I criticize his dilemma: the first horn either involves a false premise or makes the dilemma invalid. The second horn relies without argument on an implausible assumption about testimonial knowledge, and even if granted, nothing on this horn shows libertarianism does not entail skepticism about testimonial justification. I then argue for the incompatibility of (i) a view entailed by Open Theism, viz., that there are no true counterfactuals of freedom, (ii) a popular form of process reliabilism about justification and knowledge, and (iii) a weak anti-skepticism about testimonial justification and knowledge. I conclude that there is a costly tension between certain views about testimony and about free will.
The subject of this paper is the epistemology of identity: a general theory of knowledge, evidence and justification for the claim that one thing is identical to another. Although identity figures significantly in our epistemic lives, this is a topic that, to the best of my knowledge, has gone entirely unexplored. Initial attempts to integrate such an epistemology into existing theories of evidence—many of which are tailor-made for contingent propositions—are confounded by the necessity of identity. I defend a restricted form of skepticism according to which the only knowable identity claims are trivial.
A central theme in the Christian contemplative tradition is that knowing God is much more like ‘unknowing’ than it is like possessing rationally acceptable beliefs. Knowledge of God is expressed, in this tradition, in metaphors of woundedness, darkness, silence, suffering, and desire. Philosophers of religion, on the other hand, tend to explore the possibilities of knowing God in terms of rational acceptability, epistemic rights, cognitive responsibility, and propositional belief. These languages seem to point to very different accounts of how it is that we come to know God, and a very different range of critical concepts by which the truth of such knowledge can be assessed. In this paper, I begin to explore what might be at stake in these different languages of knowing God, drawing particularly on Alvin Plantinga’s epistemology of Christian belief. I will argue that his is a distorted account of the epistemology of Christian belief, and that this has implications for his project of demonstrating the rational acceptability of Christian faith for the 21st century.
Mathematical models are a well-established tool in most natural sciences. Although models have been neglected by the philosophy of science for a long time, their epistemological status as a link between theory and reality is now fairly well understood. However, regarding the epistemological status of mathematical models in the social sciences, there still exists considerable unclarity. In my paper I argue that this results from specific challenges that mathematical models, and especially computer simulations, face in the social sciences. The most important difference between the social sciences and the natural sciences with respect to modeling is that powerful and well-confirmed background theories (like Newtonian mechanics, quantum mechanics, or the theory of relativity in physics) do not exist in the social sciences. Therefore, an epistemology of models that takes physics as its role model may not be appropriate for the social sciences. I discuss the challenges that modeling faces in the social sciences and point out their epistemological consequences. The most important consequences are that greater emphasis must be placed on empirical validation than on theoretical validation and that the relevance of purely theoretical simulations is strongly limited.
We ordinarily take it as obvious that we acquire knowledge of our world on the basis of sensory perception, and that such knowledge plays a central cognitive and practical role in our lives. Upon reflection, however, it is far from obvious what perception involves and how exactly it contributes to our knowledge. Indeed, skeptical arguments have led some to question whether we have any knowledge, or even rational or justified belief, regarding the world outside our minds. Investigating the nature and scope of our perceptual knowledge and perceptually justified belief, A Critical Introduction to the Epistemology of Perception provides an accessible and engaging introduction to a flourishing area of philosophy. Before introducing and evaluating the main theories in the epistemology of perception, Ali Hasan sets the stage with a discussion of skepticism, realism, and idealism in early modern philosophy, theories of perceptual experience (sense-datum theory, adverbialism, intentionalism, and metaphysical disjunctivism), and central controversies in general epistemology. Hasan then surveys the main theories in the contemporary debate, including coherentism, abductivism, phenomenal conservatism or dogmatism, reliabilism, and epistemological disjunctivism, presenting the motivations and primary objections to each. Hasan also shows how to avoid confusing metaphysical issues with epistemological ones, and identifies interesting connections between the epistemology and metaphysics of perception. For students in epistemology or the philosophy of perception looking to better understand the central questions, concepts, and debates shaping contemporary epistemology, A Critical Introduction to the Epistemology of Perception is essential reading.
We have some justified beliefs about modal matters. A modal epistemology should explain what’s involved in our having that justification. Given that we’re realists about modality, how should we expect that explanation to go? In the first part of this essay, I suggest an answer to this question based on an analogy with games. Then, I outline a modal epistemology that fits with that answer. According to a theory-based epistemology of modality, you justifiably believe that p if you justifiably believe a theory that says that p and you believe p on the basis of that theory.
The paper approaches the topic of what a general philosophy of science could mean today from the perspective of a historical epistemology. Consequently, in a first step, the paper looks at the notion of generality in the sciences, and how it evolved over time, using the example of the life sciences. In the second part of the paper, the urgency of a general philosophy of science is located in the history of philosophy of science. Two attempts at the beginning of the twentieth century are particularly highlighted: that of Karl Popper and that of Martin Heidegger. Both of them concentrate, albeit in widely different form, on the phenomenon of research as an open-ended process. This trend is even more pronounced in Gaston Bachelard’s version of a historical epistemology, whose work is taken as a point of reference for a general historical epistemology of research. The paper concludes with a plea to look, with Georges Canguilhem, at the history of the sciences as a laboratory for epistemology.
This position paper advocates combining formal epistemology and the new paradigm psychology of reasoning in the study of conditionals and reasoning with uncertainty. The new paradigm psychology of reasoning is characterized by its use of probability theory as a rationality framework, in place of the classical logic used by more traditional approaches to the psychology of reasoning. This paper presents a new interdisciplinary research program which involves both formal and experimental work. To illustrate the program, the paper discusses recent work on the paradoxes of the material conditional, nonmonotonic reasoning, and Adams’ Thesis. It also identifies the issue of updating on conditionals as an area which seems to call for a combined formal and empirical approach.
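The gap between the material conditional and conditional probability that drives these paradoxes can be illustrated numerically. The probabilities below are an invented toy case, not figures from the paper; a minimal Python sketch:

```python
# Toy illustration: the material conditional "if A then B" (i.e. not-A or B)
# can be highly probable even when the conditional probability P(B|A) is low.
# Adams' Thesis instead ties the conditional's probability to P(B|A).

p_A = 0.1          # probability of the antecedent (invented)
p_A_and_B = 0.01   # probability of antecedent and consequent together (invented)

p_B_given_A = p_A_and_B / p_A        # conditional probability P(B|A)
p_material = 1 - (p_A - p_A_and_B)   # P(not-A or B) = 1 - P(A and not-B)

print(round(p_B_given_A, 2))  # 0.1
print(round(p_material, 2))   # 0.91
```

Here "if A then B" comes out highly probable (0.91) merely because A is improbable, while P(B|A) is low (0.1); this kind of divergence is what motivates treating P(B|A), rather than the probability of the material conditional, as the probability of the conditional.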
Source: Volume 10, Issue 1, pp. 98-115. What has come to be known as the ‘linguistic turn’ in historical theory over the past forty years or so has finished what the two World Wars began in demolishing the confidence that the historical discipline possessed at the turn of the twentieth century. This confidence was most memorably expressed in Lord Acton’s claim that one day we would possess ‘ultimate history’. Today most historians are probably more inclined to subscribe to Pieter Geyl’s view that history is ‘an argument without end’. Yet the jettisoning of a teleological goal for historical accounts does not mean that we must also part with the idea of progress; we just need a new definition of it. In this article I argue that we should adopt an evolutionary epistemology of history which sees progress as something pushed from behind, rather than aiming at an undefined point in the future; but this is not the only advantage an evolutionary epistemology can offer us. I go on to outline two further aspects of evolutionary epistemology which may benefit historical theorists.
In this paper I outline my conception of the epistemology of science, by reference to my published papers, showing how the ideas presented there fit together. In particular I discuss the aim of science, scientific progress, the nature of scientific evidence, the failings of empiricism, inference to the best (or only) explanation, and Kuhnian psychology of discovery. Throughout, I emphasize the significance of the concept of scientific knowledge.
This paper assesses the comparative reliability of two belief-revision rules relevant to the epistemology of disagreement, the Equal Weight and Stay the Course rules. I use two measures of reliability for probabilistic belief-revision rules, calibration and Brier scoring, to give a precise account of epistemic peerhood and epistemic reliability. On the calibration measure of reliability, epistemic peerhood is easy to come by, and employing the Equal Weight rule generally renders you less reliable than Staying the Course. On the Brier-score measure of reliability, epistemic peerhood is much more difficult to come by, but employing the Equal Weight rule always renders you more reliable than Staying the Course. I conclude with some normative lessons we can draw from these formal results.
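The Brier score the abstract relies on is simply the squared error between a credence and the proposition's truth value, and the Equal Weight rule is standardly glossed as splitting the difference with one's peer. A minimal sketch with invented numbers (illustrative only, not the paper's cases):

```python
# Brier score: squared error between a credence and the truth value
# (0 or 1); lower is better, 0 is perfect.
def brier(credence, truth):
    return (credence - truth) ** 2

# Equal Weight rule, on the standard gloss: split the difference
# between your credence and your peer's.
def equal_weight(mine, peers):
    return (mine + peers) / 2

# Invented single case: the proposition turns out true (truth = 1).
mine, peers = 0.8, 0.4
print(round(brier(mine, 1), 2))                       # Stay the Course: 0.04
print(round(brier(equal_weight(mine, peers), 1), 2))  # Equal Weight: 0.16
```

In this one case Staying the Course scores better; the paper's claims concern how the rules compare in expectation under the two reliability measures, which a single-shot example cannot settle.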
This paper applies a virtue epistemology approach to using the Internet, so as to improve our information-seeking behaviours. Virtue epistemology focusses on the cognitive character of agents and is less concerned with the nature of truth and epistemic justification than traditional analytic epistemology. Due to this focus on cognitive character and agency, it is a fruitful but underexplored approach to using the Internet in an epistemically desirable way. Thus, the central question in this paper is: how can we use the Internet in an epistemically virtuous way? Using the work of Jason Baehr, it starts by outlining nine intellectual or epistemic virtues: curiosity, intellectual autonomy, intellectual humility, attentiveness, intellectual carefulness, intellectual thoroughness, open-mindedness, intellectual courage and intellectual tenacity. It then explores how we should deploy these virtues and avoid the corresponding vices when interacting with the Internet, particularly search engines. Whilst an epistemically virtuous use of the Internet will not guarantee that one will acquire true beliefs, understanding or even knowledge, it will strongly improve one’s information-seeking behaviours. The paper ends by arguing that teaching and assessing online intellectual virtues should be part of school and university curricula, perhaps embedded in critical thinking courses, or even better, as individual units.
It is proposed that psychologists need a working theory of knowledge for conceptual and discourse purposes. Arguments are made from a pragmatist view of science for a conception of inquiry practice that may resolve current paradigm conflicts and support a viable methodological pluralism. The suggestion is made that a naturalized approach to research practice, such as historical-descriptive case study, may illuminate the judgments and intentions constitutive of our applied epistemology and methodological choices. Implications of such meta-methodological understanding for research training and a contingent theory of knowledge for psychological science are discussed.
Three contrasting approaches to the epistemology of argument are presented. Each one is naturalistic, drawing upon successful practices as the basis for epistemological virtue. But each looks at very different sorts of practices, and they differ greatly as to the manner in which relevant practices may be described. My own contribution relies on a metamathematical reconstruction of mature science, and as such, is a radical break with the usual approaches within the theory of argument.
Discussions of technoscience are bringing to light that scientific journals feature very different knowledge claims. At one end of the spectrum, there is the scientific claim that a hypothesis needs to be reevaluated in light of new evidence. At the other end of the spectrum, there is the technoscientific claim that some new measure of control has been achieved in a laboratory. The latter claim has not yet received sufficient attention. In what sense is the achievement of control genuine knowledge in its own right? How is this knowledge acquired, and how is it publicly validated? Notions of tacit or embodied knowledge, of knowledge by acquaintance, of engineering or thing knowledge, and reconstructions of ability or skill take us only part of the way towards answering such questions. The epistemology of technoscience needs to account for the acquisition and demonstration of a public knowledge of control that does not consist in the holding of propositions, even though it is usually communicated in writing: technoscientific knowledge is, firstly, objective and public insofar as it is exhibited and documented. Secondly, it presupposes a specific context of technology and expertise. Thirdly, it is communicable, even where the achieved capability itself is not. Knowledge of control entails, fourthly, a knowledge of causal relationships, and it sediments itself, fifthly, as a habit of action in the sense proposed by Charles Sanders Peirce.
In this paper I argue, first, that the most influential (and perhaps only acceptable) account of the epistemology of self-knowledge, developed and defended at great length in Wright (1989b) and (1989c) (among other places), leaves unanswered a question about the psychology of self-knowledge; second, that without an answer to this question about the psychology of self-knowledge, the epistemic account cannot be considered acceptable; and third, that neither Wright's own answer, nor an interpretation-based answer (based on a proposal from Jacobsen (1997)), will suffice as an acceptable answer to the psychological question. My general ambition is thus to establish that more work is needed if we are to have a full account of self-knowledge in both its epistemological and psychological aspects. I conclude by suggesting how my thesis bears on those who aim to provide an empirical account of the cognition involved in self-knowledge.
Recent years have seen an explosion of empirical data concerning arithmetical cognition. In this paper that data is taken to be philosophically important, and an outline for an empirically feasible epistemological theory of arithmetic is presented. The epistemological theory is based on the empirically well-supported hypothesis that our arithmetical ability is built on a protoarithmetical ability to categorize observations in terms of quantities, an ability that we already have as infants and share with many nonhuman animals. It is argued here that arithmetical knowledge developed in such a way cannot be totally conceptual in the sense relevant to the philosophy of arithmetic, but neither can arithmetic be understood to be empirical. Rather, we need to develop a contextual a priori notion of arithmetical knowledge that preserves the special mathematical characteristics without ignoring the roots of arithmetical cognition. Such a contextual a priori theory is shown not to require any ontologically problematic assumptions, in addition to fitting well within a standard framework of general epistemology.
In place of the traditional epistemological view of knowledge as justified true belief, we argue that artificial intelligence and law needs an evidence-based epistemology according to which scientific knowledge is based on critical analysis of evidence using argumentation. This new epistemology of scientific evidence (ESE) models scientific knowledge as achieved through a process of marshaling evidence in a scientific inquiry that results in a convergence of scientific theories and research results. We show how a dialogue interface of argument from expert opinion, along with its set of critical questions, provides the argumentation component of the ESE. It enables internal scientific knowledge to be translated over into a wider arena in which individual nonexpert citizens and groups can make use of it. The external component shows how evidence is presented and used in a legal procedural setting that includes fact-finding, weighing the credibility of expert witnesses, and critical questioning of arguments. The paper critically reviews the standards of admissibility of scientific evidence using the ESE.
Dualism in the Epistemology of Testimony and the Ability Intuition. Journal article, Philosophia (Online ISSN 1574-9274; Print ISSN 0048-3893), DOI 10.1007/s11406-010-9291-4. Author: Spyridon Orestis Palermos, Department of Philosophy, School of Philosophy, Psychology and Language Sciences (PPLS), The University of Edinburgh, Edinburgh, UK.
Artificial intelligence has often been seen as an attempt to reduce the natural mind to informational processes and, consequently, to naturalize philosophy. The many criticisms that were addressed to the so-called “old-fashioned AI” do not concern this attempt itself, but the methods it used, especially the reduction of the mind to a symbolic level of abstraction, which has often appeared to be inadequate to capture the richness of our mental activity. As a consequence, there were many efforts to evacuate the semantical models in favor of elementary physiological mechanisms simulated by information processes. However, these views, and the subsequent criticisms against artificial intelligence that they contain, miss the very nature of artificial intelligence, which is not reducible to a “science of nature” but which directly impacts our culture. More precisely, they end up evacuating the role of semantic information. In other words, they tend to throw the baby out with the bath-water. This paper tries to revisit the epistemology of artificial intelligence in the light of the opposition between the “sciences of nature” and the “sciences of culture” introduced by German neo-Kantian philosophers. It then shows how this epistemological view opens onto the many contemporary applications of artificial intelligence that have already transformed—and will continue to transform—all our cultural activities and our world. Lastly, it places those perspectives in the context of the philosophy of information, and more particularly it emphasizes the role played by the notions of context and level of abstraction in artificial intelligence.
In this paper we present two distinctly epistemological puzzles that arise for one who aspires to defend some plausible version of the precautionary principle. The first puzzle involves an application of contextualism in epistemology; and the second puzzle concerns the task of defending a plausible version of the precautionary principle that would not be invalidated by de minimis.
Computer simulations are an exciting tool that plays important roles in many scientific disciplines. This has attracted the attention of a number of philosophers of science. The main tenor in this literature is that computer simulations not only constitute interesting and powerful new science, but that they also raise a host of new philosophical issues. The protagonists in this debate claim no less than that simulations call into question our philosophical understanding of scientific ontology, the epistemology and semantics of models and theories, and the relation between experimentation and theorising, and submit that simulations demand a fundamentally new philosophy of science in many respects. The aim of this paper is to critically evaluate these claims. Our conclusion will be sober. We argue that these claims are overblown and that simulations, far from demanding a new metaphysics, epistemology, semantics and methodology, raise few if any new philosophical problems. The philosophical problems that do come up in connection with simulations are not specific to simulations, and most of them are variants of problems that have been discussed in other contexts before.
How do we know what other speakers say? Perhaps the most natural view is that we hear a speaker's utterance and infer what was said, drawing on our competence in the syntax and semantics of the language. An alternative view that has emerged in the literature is that native speakers have a non-inferential capacity to perceive the content of speech. Call this the perceptual view. The disagreement here is best understood as an epistemological one about whether our knowledge of what speakers say is epistemically mediated by our linguistic competence. The present paper takes up the question of how we should go about settling this issue. Arguments for the perceptual view generally appeal to the phenomenology of speech comprehension. The present paper develops a line of argument for the perceptual view that draws on evidence from empirical psychology. The evidence suggests that a speaker's core syntactic and semantic competence is typically deployed sub-personally (e.g., by something like a module). The point is not just that the competence is tacit or unconscious, but that the person is not the locus of the competence. I argue that standing competence can enter into the grounds for knowledge only if it is subject to a certain sort of epistemic assessment, an assessment that is appropriate only if the person is the locus of that competence. If the person is not the locus of a speaker's core linguistic competence, as the psychological evidence suggests, then that competence does not enter into the grounds for our knowledge of what speakers say. If this line of argument is right, it has implications for the epistemology of perception and for our understanding of how empirical psychology bears on epistemology generally.