The history of science is articulated by moments of discovery. Yet these 'moments' are not simple or isolated events in science. Just as a scientific discovery illuminates our understanding of nature or of society, and reveals new connections among phenomena, so too does the history of scientific activity and the analysis of scientific reasoning illuminate the processes which give rise to moments of discovery and the complex network of consequences which follow upon such moments. Understanding discovery has not been, until recently, a major concern of modern philosophy of science. Whether the act of discovery was regarded as mysterious and inexplicable, or obvious and in no need of explanation, modern philosophy of science in effect bracketed the question. It concentrated instead on the logic of scientific explanation or on the issues of validation or justification of scientific theories or laws. The recent revival of interest in the context of discovery, indeed in the acts of discovery, on the part of philosophers and historians of science, represents no one particular methodological or philosophical orientation. It proceeds as much from an empiricist and analytical approach as from a sociological or historical one; from considerations of the logic of science as much as from the alogical or extralogical contexts of scientific thought and practice. But, in general, this new interest focuses sharply on the actual historical and contemporary cases of scientific discovery, and on an examination of the act or moment of discovery in situ.
Does the viability of the discovery program depend on showing either (1) that methods of generating new problem solutions, per se, have special probative weight (the per se thesis); or, (2) that the original conception of an idea is logically continuous with its justification (anti-divorce thesis)? Many writers have identified these as the key issues of the discovery debate. McLaughlin, Pera, and others recently have defended the discovery program by attacking the divorce thesis, while Laudan has attacked the discovery program by rejecting the per se thesis. This disagreement over the central issue has led to communication breakdown. I contend that both friends and foes of discovery mistake the central issues. Recognizing a form of divorce helps rather than hurts the discovery program. However, the per se thesis is not essential to the program (nor is the related debate over novel prediction); hence, the status of the per se thesis is a side issue. With these clarifications in hand, we can proceed to the next stage of the discovery debate--the development (or revival) of a generative conception of justification which goes beyond consequentialism to forge a strong linkage of generation (or rather, generatability) with justification.
Contemporary Philosophy in Focus offers a series of introductory volumes to many of the dominant philosophical thinkers of the current age. Thomas Kuhn, the author of The Structure of Scientific Revolutions, is probably the best-known and most influential historian and philosopher of science of the last 25 years, and has become something of a cultural icon. His concepts of paradigm, paradigm change and incommensurability have changed the way we think about science. This volume offers an introduction to Kuhn's life and work and then considers the implications of Kuhn's work for philosophy, cognitive psychology, social studies of science and feminism. The volume is more than a retrospective on Kuhn, exploring future developments of the cognitive and information sciences along Kuhnian lines. Outside of philosophy the volume will be of particular interest to professionals and students in cognitive science, history of science, science studies and cultural studies.
Are we entering a major new phase of modern science, one in which our standard, human modes of reasoning and understanding, including heuristics, have decreasing value? The new methods challenge human intelligibility. The digital revolution inspires such claims, but they are not new. During several historical periods, scientific progress has challenged traditional concepts of reasoning and rationality, intelligence and intelligibility, explanation and knowledge. The increasing intelligence of machine learning and networking is a deliberately sought, somewhat alien intelligence. As such, it challenges the traditional, heuristic foresight of expert researchers. Nonetheless, science remains human-centered in important ways—and yet many of our ordinary human epistemic activities are alien to ourselves. This fact has always been the source of “the discovery problem”. It generalizes to the problem of understanding expert scientific practice. Ironically, scientific progress plunges us ever deeper into complexities beyond our grasp. But how is progress possible without traditional realism and the intelligibility realism requires? Pragmatic flexibility offers an answer.
I discuss changes of perspective of four kinds in science and about science. Section 2 defends a perspectival nonrealism—something akin to Giere’s perspectival realism but not a realism—against the idea of complete, “Copernican” objectivity. Section 3 contends that there is an inverse relationship between epistemological conservatism and scientific progress. Section 4 casts doubt on strong forms of scientific realism by taking a long-term historical perspective that includes future history. Section 5 defends a partial reversal in the status of so-called context of discovery and context of justification. Section 6 addresses the question of how we can have scientific progress without scientific realism—how progress is possible without the accumulation of representational truth. The overall result is a pragmatic instrumentalist perspective on the sciences and how to study them philosophically, one that contains a kernel of realism—instrumental realism.
There is a rough, long-term tradeoff between rate of innovation and degree of strong realism in scientific practice, a point reflected in historically changing conceptions of method as they retreat from epistemological foundationism to a highly fallibilistic, modeling perspective. The successively more liberal, innovation-stimulating methods open up to investigation deep theoretical domains at the cost, in many cases, of moving away from strong realism as a likely outcome of research. The crowbar model of method highlights this tension, expressed as the crowbar compromise and the crowbar fallacy. The tools-to-theories heuristic, described and evaluated by Gigerenzer and colleagues, can be regarded as an attempt by some scientific realists to overcome this compromise. Instead, it is an instance of it. Nonetheless, in successful applications the crowbar model implies a modest, instrumental realism.
Economic competitive advantage depends on innovation, which in turn requires pushing back the frontiers of various kinds of knowledge. Although understanding how knowledge grows ought to be a central topic of epistemology, epistemologists and philosophers of science have given it insufficient attention, even deliberately shunning the topic. Traditional confirmation theory and general epistemology offer little help at the frontier, because they are mostly retrospective rather than prospective. Nor have philosophers been highly visible in the science and technology policy realm, despite philosophy’s being a normative discipline. This paper suggests a way to address both deficits. Creative scientists, technologists, business managers, and policy makers face similar problems of decision-making at their respective frontiers of knowledge. These areas should therefore be fertile ground for both epistemologists and philosophers concerned with policy. Here I call attention to the importance of heuristic appraisal for “frontier epistemology” and to policy formation. Evaluation of the comparative promise or expected fertility of available options comprises a cluster of activities that cut across traditional discovery/justification and descriptive/normative distinctions. The study of weak modes of reasoning and evaluation is especially relevant to socio-economic policy.
Pure consequentialists hold that all theoretical justification derives from testing the consequences of hypotheses, while generativists maintain that reasoning to (some feature of) the hypothesis from what we already know is an important form of justification. The strongest form of justification (they claim) is an idealized discovery argument. In the guise of H-D methodology, consequentialism is widely supposed to have defeated generativism during the 19th century. I argue that novel prediction fails to overcome the logical weakness of consequentialism or to render generative methodology superfluous. Specifically, Bayesian consequentialism is not an alternative to generativism but reduces to an instance of it.
One component of a viable account of scientific inquiry is a defensible conception of scientific problems. This paper specifies some logical and conceptual requirements that an acceptable account of scientific problems must meet and indicates some features that a study of scientific inquiry suggests scientific problems have. On the basis of these requirements and features, three standard empiricist models of problems are examined and found wanting. Finally a constraint-inclusion model of scientific problems is proposed.
In this paper the relation between scientific problems and the constraints on their solutions is explored. First the historical constraints on the solution to the blackbody radiation problem are set out. The blackbody history is used as a guide in sketching a working taxonomy of constraints, which distinguishes various kinds of reductive and nonreductive constraints. Finally, this discussion is related to some work in erotetic logic. The hypothesis that scientific problems can be identified with structured sets of constraints is interesting; however, a full defense of the identification thesis requires the resolution of some unsolved problems.
Davidson's defective defense of the consistency of (1) the causal interaction of mental and physical events, (2) the backing law thesis on causation, and (3) the impossibility of lawfully explaining mental events is repaired by closer attention to the description-relativity of explanation. Davidson wrongly allows that particular mental events are explainable when particular identities to physical events are known. The author argues that such identities are powerless to affect what features a given law can explain. Thus a great intelligence knowing all the physical laws could not explain a single mental event, even if he knew all particular identities.
The last forty years have produced a dramatic reversal in leading accounts of science. Once thought necessary to explain scientific progress, a rigid method of science is now widely considered impossible. Study of products yields to study of processes and practices, unity gives way to diversity, generality to particularity, logic to luck, and final justification to heuristic scaffolding. I sketch the story, from Bacon and Descartes to the present, of the decline and fall of traditional scientific method, conceived as The Central Planning Bureau for Science or as Rationality Czar. I defend a deflationary account of method and of rational judgment, with emphasis on heuristic appraisal and cognitive economy.
The book answers long-standing questions on scientific modeling and inference across multiple perspectives and disciplines, including logic, mathematics, physics and medicine. The different chapters cover a variety of issues, such as the role models play in scientific practice; the way science shapes our concept of models; ways of modeling the pursuit of scientific knowledge; the relationship between our concept of models and our concept of science. The book also discusses models and scientific explanations; models in the semantic view of theories; the applicability of mathematical models to the real world and their effectiveness; the links between models and inferences; and models as a means for acquiring new knowledge. It analyzes different examples of models in physics, biology, mathematics and engineering. Written for researchers and graduate students, it provides a cross-disciplinary reference guide to the notion and the use of models and inferences in science.
Although seriously defective, 17th-century ideas about discovery, justification, and positive science are not as hopeless, useless, and out of date as many philosophers assume. They appear to underlie modern scientific practice. The generationist view of justification interestingly links justification with discovery issues while employing a concept of empirical support quite foreign to the modern, consequentialist concept, which identifies empirical evidence with favorable test results (predictive/explanatory success). In the generationist sense, justification amounts to potential discovery or "discoverability". A partial defense of updated versions of these ideas is offered without disputing the importance of consequential testing. Much further work is needed!
Science continually contributes new models and rethinks old ones. The way inferences are made is constantly being re-evaluated. The practice and achievements of science are both shaped by this process, so it is important to understand how models and inferences are made. But, despite the relevance of models and inference in scientific practice, these concepts still remain controversial in many respects. The attempt to understand the ways models and inferences are made basically opens two roads. The first one is to produce an analysis of the role that models and inferences play in science. The second one is to produce an analysis of the way models and inferences are constructed, especially in the light of what science tells us about our cognitive abilities. The papers collected in this volume go both ways.
A serious problem for covering law explanation is raised and its consequences for the Hempelian theory of explanation are discussed. The problem concerns an intensional feature of explanations, involving the manner in which theoretical law statements are related to the events explained. The basic problem arises because explanations are not of events but of events under descriptions; moreover, in a sense, our linguistic descriptions outrun laws. One form of the problem, termed the problem of weak intensionality, is apparently solved by a simple logical move, but in fact the problem arises in a new, strong form. It is found that Hempel's model for deductive explanation (to which this discussion is confined) requires modification to handle the weak intensionality problem but then is faced with the problem of strong intensionality. In consequence, it is suggested that Hempel's important concept of explanation sketch is not as widely applicable as usually claimed, especially for explanations in the behavioral and social sciences and history. Reason is found to reject the covering law thesis that every scientific explanation must contain at least one law statement. An important feature of the discussion is that some of the main reasons given for altering the deductive model and for considering other forms of explanation are internal to the covering law theory.
The paper locates, appreciates, and extends several dimensions of Simon’s work in the direction of more recent contributions by people such as Gigerenzer and Dennett. The author’s “crowbar model of method” is compared to Simon’s scissors metaphor. Against an evolutionary background, both support a pragmatic rather than strong realist approach to theoretically deep and complex problems. The importance of implicit knowledge is emphasized, for humans as well as nonhuman animals. Although Simon was a realist in some respects, his work on bounded rationality, satisficing, problem solving, heuristics, models, and scientific discovery marks him as a pragmatist. Indeed, he should be regarded as one of the great American pragmatists, alongside Peirce, James, Dewey, and a few others.
Reduction was once a central topic in philosophy of science. I claim that it remains important, especially when applied to problems and problem-solutions rather than only to large theory-complexes. Without attempting a comprehensive classification, I discuss various kinds of problem reductions and similar relations, illustrating them, inter alia, in terms of the blackbody problem and early quantization problems. Kuhn's early work is suggestive here both for structuralist theory of science and for the line I prefer to take. My central claims in the paper are (1) that problem reduction is important in its own right and does not "reduce" to theory reduction and (2) that problem reduction is generally more important than theory reduction to methodology as the "control theory" of inquiry.