Economic competitive advantage depends on innovation, which in turn requires pushing back the frontiers of various kinds of knowledge. Although understanding how knowledge grows ought to be a central topic of epistemology, epistemologists and philosophers of science have given it insufficient attention, even deliberately shunning the topic. Traditional confirmation theory and general epistemology offer little help at the frontier, because they are mostly retrospective rather than prospective. Nor have philosophers been highly visible in the science and technology policy realm, despite philosophy’s being a normative discipline. This paper suggests a way to address both deficits. Creative scientists, technologists, business managers, and policy makers face similar problems of decision-making at their respective frontiers of knowledge. These areas should therefore be fertile ground for both epistemologists and philosophers concerned with policy. Here I call attention to the importance of heuristic appraisal for “frontier epistemology” and for policy formation. Evaluation of the comparative promise or expected fertility of available options comprises a cluster of activities that cut across traditional discovery/justification and descriptive/normative distinctions. The study of weak modes of reasoning and evaluation is especially relevant to socio-economic policy.
Reduction was once a central topic in philosophy of science. I claim that it remains important, especially when applied to problems and problem-solutions rather than only to large theory-complexes. Without attempting a comprehensive classification, I discuss various kinds of problem reductions and similar relations, illustrating them, inter alia, in terms of the blackbody problem and early quantization problems. Kuhn's early work is suggestive here both for structuralist theory of science and for the line I prefer to take. My central claims in the paper are (1) that problem reduction is important in its own right and does not "reduce" to theory reduction and (2) that problem reduction is generally more important than theory reduction to methodology as the "control theory" of inquiry.
Contemporary Philosophy in Focus offers a series of introductory volumes to many of the dominant philosophical thinkers of the current age. Thomas Kuhn (1922-1996), the author of The Structure of Scientific Revolutions, is probably the best-known and most influential historian and philosopher of science of the last 25 years, and has become something of a cultural icon. His concepts of paradigm, paradigm change and incommensurability have changed the way we think about science. This volume offers an introduction to Kuhn's life and work and then considers the implications of Kuhn's work for philosophy, cognitive psychology, social studies of science and feminism. The volume is more than a retrospective on Kuhn, exploring future developments of the cognitive and information sciences along Kuhnian lines. Outside of philosophy the volume will be of particular interest to professionals and students in cognitive science, history of science, science studies and cultural studies.
Looking at Thomas Kuhn's work from a cognitive science perspective helps to articulate and to legitimize, to some degree, his rejection of traditional views of concepts, categorization, theory structure, and rule-based problem solving. Whereas my colleagues focus on the later Kuhn of the MIT years, I study the early Kuhn as an anticipation of case-based reasoning and schema theory. These recent developments in cognitive psychology and artificial intelligence may point toward a more computational version of Kuhn's ideas, but they also expose ambiguities in his work, notably in his understanding of exemplars.
Pure consequentialists hold that all theoretical justification derives from testing the consequences of hypotheses, while generativists maintain that reasoning to (some feature of) the hypothesis from what we already know is an important form of justification. The strongest form of justification (they claim) is an idealized discovery argument. In the guise of H-D methodology, consequentialism is widely supposed to have defeated generativism during the 19th century. I argue that novel prediction fails to overcome the logical weakness of consequentialism or to render generative methodology superfluous. Specifically, Bayesian consequentialism is not an alternative to generativism but reduces to an instance of it.
Thomas Nickles (1987). "Twixt Method and Madness." In Nancy J. Nersessian (ed.), The Process of Science: Contemporary Philosophical Approaches to Understanding Scientific Practice. Kluwer Academic Publishers.
Does the viability of the discovery program depend on showing either (1) that methods of generating new problem solutions, per se, have special probative weight (the per se thesis); or (2) that the original conception of an idea is logically continuous with its justification (the anti-divorce thesis)? Many writers have identified these as the key issues of the discovery debate. McLaughlin, Pera, and others have recently defended the discovery program by attacking the divorce thesis, while Laudan has attacked the discovery program by rejecting the per se thesis. This disagreement over the central issue has led to communication breakdown. I contend that both friends and foes of discovery mistake the central issues. Recognizing a form of divorce helps rather than hurts the discovery program. However, the per se thesis is not essential to the program (nor is the related debate over novel prediction); hence, the status of the per se thesis is a side issue. With these clarifications in hand, we can proceed to the next stage of the discovery debate--the development (or revival) of a generative conception of justification which goes beyond consequentialism to forge a strong linkage of generation (or rather, generatability) with justification.
Although seriously defective, 17th-century ideas about discovery, justification, and positive science are not as hopeless, useless, and out of date as many philosophers assume. They appear to underlie modern scientific practice. The generationist view of justification interestingly links justification with discovery issues while employing a concept of empirical support quite foreign to the modern, consequentialist concept, which identifies empirical evidence with favorable test results (predictive/explanatory success). In the generationist sense, justification amounts to potential discovery or "discoverability". A partial defense of updated versions of these ideas is offered without disputing the importance of consequential testing. Much further work is needed!
One component of a viable account of scientific inquiry is a defensible conception of scientific problems. This paper specifies some logical and conceptual requirements that an acceptable account of scientific problems must meet, and it indicates some features that the study of scientific inquiry suggests such problems actually possess. On the basis of these requirements and features, three standard empiricist models of problems are examined and found wanting. Finally, a constraint-inclusion model of scientific problems is proposed.
In this paper the relation between scientific problems and the constraints on their solutions is explored. First the historical constraints on the solution to the blackbody radiation problem are set out. The blackbody history is used as a guide in sketching a working taxonomy of constraints, which distinguishes various kinds of reductive and nonreductive constraints. Finally, this discussion is related to some work in erotetic logic. The hypothesis that scientific problems can be identified with structured sets of constraints is interesting; however, a full defense of the identification thesis requires the resolution of some unsolved problems.
Davidson's defective defense of the consistency of (1) the causal interaction of mental and physical events, (2) the backing-law thesis on causation, and (3) the impossibility of lawfully explaining mental events is repaired by closer attention to the description-relativity of explanation. Davidson wrongly allows that particular mental events are explainable when particular identities with physical events are known. The author argues that such identities are powerless to affect what features a given law can explain. Thus a great intelligence knowing all the physical laws could not explain a single mental event as such, even if he knew all particular identities.
Arguments, suggested by readings of Durkheim and Kroeber, for the integrity and autonomy of social theory are examined. These arguments may be construed as closure arguments on domains of social events and of social facts. Causal closure, ontic closure, and several kinds of nomic and explanatory closure are distinguished. Discussion of the relations of various kinds of closure, integrity, autonomy, etc. under plausible assumptions concerning causation and explanation leads to the conclusion that (a) one main strand of the integrity arguments is defensible; (b) special ontological assumptions (ontic closure) are not necessary and are dubiously sufficient for autonomy. This general conclusion accords with the positions of the later Kroeber and of D. Kaplan, that integrity-autonomy is best considered a methodological, not an ontological issue--a matter of distinct levels of description and explanation, not distinct levels of reality.
A serious problem for covering law explanation is raised and its consequences for the Hempelian theory of explanation are discussed. The problem concerns an intensional feature of explanations, involving the manner in which theoretical law statements are related to the events explained. The basic problem arises because explanations are not of events but of events under descriptions; moreover, in a sense, our linguistic descriptions outrun laws. One form of the problem, termed the problem of weak intensionality, is apparently solved by a simple logical move, but in fact the problem arises in a new, strong form. It is found that Hempel's model for deductive explanation (to which this discussion is confined) requires modification to handle the weak intensionality problem but then is faced with the problem of strong intensionality. In consequence, it is suggested that Hempel's important concept of explanation sketch is not as widely applicable as usually claimed, especially for explanations in the behavioral and social sciences and history. Reasons are found to reject the covering law thesis that every scientific explanation must contain at least one law statement. An important feature of the discussion is that some of the main reasons given for altering the deductive model and for considering other forms of explanation are internal to the covering law theory.