Some causal explanations are non-committal in that mention of a property in the explanans conveys information about the causal origin of the explanandum even if the property in question plays no causal role for the explanandum. Programme explanations are a variety of non-committal causal (NCC) explanations. Yet their interest is limited since, as I will argue in this paper, their range of applicability is in fact quite narrow. However, there is at least one other variety of NCC explanations, causal orientation explanations, which offers a plausible model for many explanations in the special sciences.
In this essay, I address a novel criticism recently levelled at the Strong Programme by Nick Tosh and Tim Lewens. Tosh and Lewens paint Strong Programme theorists as trading on a contrastive form of explanation. With this, they throw valuable new light on the explanatory methods employed by the Strong Programme. However, as I shall argue, Tosh and Lewens run into trouble when they accuse Strong Programme theorists of unduly restricting the contrast space in which legitimate historical and sociological explanations of scientific knowledge might be given. Their attack founders as a result of their failure to properly understand the overall methodological concerns of Strong Programme theorists. After introducing readers to the technique of contrastive explanation and correcting the errors in Tosh and Lewens’ interpretation of the Strong Programme, I argue that it is, in fact, Tosh and Lewens’ own commitment to scientific realism which places an unacceptable restriction on the explanatory space open to historians and sociologists of science. The happy ending is that the Strong Programme provides more freedom for analysis than does scientific realism, and that careful attention to the methodological benefits of contrastive explanation can help lighten the burden on historians and sociologists of science as they go about their explanatory business.
Frank Jackson, Philip Pettit, and Jaegwon Kim put forward two models of higher-level causal explanation. Advocates of both versions are inclined to draw the conclusion that the models don't differ substantially. I argue, on the contrary, that there are relevant metaphysical differences between Jackson and Pettit's notion of programme explanation on the one hand, and Kim's idea of supervenient causation on the other. These can be traced back to underlying differences between the contents of their physicalisms.
Contents. Introduction. 1. Preliminaries. 2. Normal Form Games. 3. Extensive Games. 4. Applications of Game Theory. 5. The Methodology of Game Theory. Conclusion. Appendix. Bibliography. Index. Does game theory—the mathematical theory of strategic interaction—provide genuine explanations of human behaviour? Can game theory be used in economic consultancy or other normative contexts? Explaining Games: The Epistemic Programme in Game Theory—the first monograph on the philosophy of game theory—is an attempt to combine insights from epistemic logic and the philosophy of science to investigate the applicability of game theory in such fields as economics, philosophy and strategic consultancy. I prove new mathematical theorems about the beliefs, desires and rationality principles of individual human beings, and explore in detail the logical form of game theory as it is used in explanatory and normative contexts. I argue that game theory reduces to rational choice theory if used as an explanatory device, and that game theory is nonsensical if used as a normative device. A provocative account of the history of game theory reveals that this is not bad news for all of game theory, though. Two central research programmes in game theory tried to find the ultimate characterisation of strategic interaction between rational agents. Yet, while the Nash Equilibrium Refinement Programme has done badly thanks to such research habits as overmathematisation, model-tinkering and introversion, the Epistemic Programme, I argue, has been rather successful in achieving this aim. "The 'epistemic' approach to game theory has emerged over the past twenty-five years. What is this approach? How does it differ from the conventional equilibrium-based approach to game theory? What have been its strengths and weaknesses to date? To find out, read this comprehensive and excellently written account". Adam Brandenburger, J. P.
Valles Professor of Business Economics and Strategy, Stern School of Business, New York University "Reading Boudewijn de Bruin's book should be rewarding both for game theorists interested in the conceptual foundations of their discipline and for philosophers who want to learn more about formal analysis of strategic interaction. It provides an in-depth logical study of the currently dominant epistemic approaches to non-cooperative games, with an eye both to the attractions and to the serious challenges facing the Epistemic Programme". Wlodek Rabinowicz, Professor of Practical Philosophy, Department of Philosophy, Lund University.
The Oxford Handbook provides an extensive and innovative review of developments in Analytical Sociology (AS), a theory program that seeks to develop ‘thin explanations’ of social phenomena by understanding their micro-foundations through explicitly developed models and then tracing through the broader consequences of these actions and interactions for aggregate social patterns. The volume covers the key characteristics of this approach in terms of ontology and epistemology and then assays recent developments across over two dozen areas of application: each a particular social mechanism. Methodological approaches particularly pertinent to AS are also covered. However, some of the criticisms of the AS programme are also noted, especially its lack of attention thus far to meso-level and macro-level mechanisms.
This article considers the ‘Strong Programme’ account of scientific knowledge from a fresh perspective. It argues that insufficient attention has been paid to the Strong Programme's monistic intent, that is, its aim to unify considerations of instrumental adequacy and social interests in explanations of the development of scientific knowledge. Although sharing the judgment of many critics that the Strong Programme approach is flawed, the article diverges from standard criticisms by suggesting that the best alternative is not a dualistic framework but a more adequate monistic approach. Key words: Strong Programme, interests, monism, finitism, classification.
The accumulated case studies in the Sociology of Scientific Knowledge have been taken to establish the Strong Programme's thesis that beliefs have social causes in contradistinction to psychological ones. This externalism is essentially a commitment to the stimulus control of behaviour, which was the principal tenet of orthodox Skinnerian Behaviourism. Offered as “straightforward scientific hypotheses”, these claims of social determination are asserted to be “beyond dispute”. However, the causes of beliefs, and especially their contents, have also been the subject of intense study in the quite different domain of cognitive science, where internal states, images, rules, representations and schemas are postulated as explanatory constructs. Such explanations which postulate mental states are described by Bloor as infected by the “disease” of “psychologism”, and Bloor has defined his Strong Programme in terms of its diametrical opposition to mentalistic theories. For example, Bloor has explicitly endorsed the Behaviourist rejection of mental representations such as images. Accordingly, a direct comparison of these radically divergent approaches to a common subject matter is of considerable interest. The paper attempts to reveal the unnoticed enormity and recidivism of the sociological programme, and how its vulnerability is betrayed in Bloor's response to criticism on central issues.
In this paper I argue that the Strong Programme's aim to provide robust explanations of belief acquisition is limited by its commitment to the symmetry principle. For Bloor and Barnes, the symmetry principle is intended to drive home the fact that epistemic norms are socially constituted. My argument here is that even if our epistemic standards are fully naturalized, even relativized, they nevertheless can play a pivotal role in why individuals adopt the beliefs that they do. Indeed, sometimes the fact that a belief is locally endorsed as rational is the only reason why an individual holds it. In this way, norms of rationality have a powerful and unique role in belief formation. But if this is true then the symmetry principle's emphasis on 'sameness of type' is misguided. It has the undesirable effect of not just naturalizing our cognitive commitments, but trivializing them. Indeed, if the notion of 'similarity' is to have any content, then we are not going to classify as 'the same' beliefs that are formed in accordance with deeply entrenched epistemic norms as ones formed without reflection on these norms, or ones formed in spite of these norms. My suggestion here is that we give up the symmetry principle in favor of a more sophisticated principle, one that allows for a taxonomy of causes rich enough to allow us to delineate the unique impact epistemic norms have on those individuals who subscribe to them.
Reconstructing Popper's research programme for the Human Sciences, Noretta Koertge (Inquiry, Vol. 18) has given a deductive-nomological account of explanations of actions by means of a Rationality Principle. It is argued here that such a Rationality Principle is fundamentally redundant. Neither is it logically necessary in order to deduce a cognitive action-explanandum, nor can it be given a semantic non-empty interpretation, at least not within Koertge's own syllogism. Any attempt to save the Rationality Principle as unfalsifiable but nevertheless indispensable for action explanations is rejected in the light of possible alternative action explanations by empirical and therefore in principle falsifiable psychological laws.
The paper argues that the Nash Equilibrium Refinement Programme in game theory was less successful than its competitor, the Epistemic Programme (Interactive Epistemology). The prime criterion of success is the extent to which the programmes were able to reach the key objective guiding non-cooperative game theory for much of the 20th century, namely, to develop a complete characterisation of the strategic rationality of economic agents in the form of the ultimate game theoretic solution concept for any normal form and extensive game. The paper explains this in terms of unjustified degrees of mathematisation in the Nash Equilibrium Refinement Programme. While this programme's mathematical models were often inspired by purely mathematical concerns rather than the economic phenomena they were intended to be mathematical models of, the Epistemic Programme's mathematical models were developed with a keen eye to the role beliefs and desires play in strategic interaction between rational economic agents playing games; that is, their Interactive Epistemology. The Epistemic Programme succeeded in developing mathematical models formalising aspects of strategic interaction that remained implicit in the Nash Equilibrium Refinement Programme due to an unjustified degree of mathematisation. As a result, the Epistemic Programme is more successful in game theory.
In a recent paper in this journal (Rottschaefer and Martinsen 1990) we have proposed a view of Darwinian evolutionary metaethics that we believe improves upon Michael Ruse's (e.g., Ruse 1986) proposals by claiming that there are evolutionary based objective moral values and that a Darwinian naturalistic account of the moral good in terms of human fitness can be given that avoids the naturalistic fallacy in both its definitional and derivational forms while providing genuine, even if limited, justifications for substantive ethical claims. Jonathan Barrett (this issue) has objected to our proposal, contending that we cannot hold for the reality of supervenient moral properties without either falling foul of the naturalistic fallacy or suffering the consequences of postulating inexplicable moral properties. In reply, we show that Barrett's explicit arguments that we commit either the definitional or derivational form of the naturalistic fallacy fail, and that his naturalistic intuitions that supervenience explanations of moral properties by nonmoral properties force us into what we call the explanatory form of the naturalistic fallacy also fail. Positively, his objections help us to clarify the nature of the naturalistic fallacy within an evolutionary based naturalistic ethics and to point out the proper role of both supervenience explanations and moral explanations in such an ethics.
Instances of explanatory reduction are often advocated on metaphysical grounds: given that the only real things in the world are subatomic particles and their interactions, we have to try to explain everything in terms of the laws of physics. In this paper, we show that explanatory reduction cannot be defended on metaphysical grounds. Nevertheless, indispensability arguments for reductive explanations can be developed, taking into account actual scientific practice and the role of epistemic interests. Reductive explanations might be indispensable to address some epistemic interest, answering a specific explanation-seeking question in the most accurate, adequate and efficient way. Just as explanatory pluralists often advocate the indispensability of higher levels of explanation, pointing to the pragmatic value of the explanatory information obtained at these higher levels, we argue that explanatory reduction—traditionally understood as the contender of pluralism—can be defended in a similar way. The pragmatic value that reductionist, lower-level explanations might have in the biomedical sciences and the social sciences is illustrated by some case studies.
Studies exploring how students learn and understand science processes such as diffusion and natural selection typically find that students provide misconceived explanations of how the patterns of such processes arise (such as why giraffes’ necks get longer over generations, or how ink dropped into water appears to “flow”). Instead of explaining the patterns of these processes as emerging from the collective interactions of all the agents (e.g., both the water and the ink molecules), students often explain the pattern as being caused by controlling agents with intentional goals, as well as express a variety of other misconceived notions. In this article, we provide a hypothesis for what constitutes a misconceived explanation; why misconceived explanations are so prevalent, robust, and resistant to instruction; and offer one approach to how they may be overcome. In particular, we hypothesize that students misunderstand many science processes because they rely on a generalized version of narrative schemas and scripts (referred to here as a Direct-causal Schema) to interpret them. For science processes that are sequential and stage-like, such as the cycles of the moon, circulation of blood, stages of mitosis, and photosynthesis, a Direct-causal Schema is adequate for correct understanding. However, for science processes that are non-sequential (or emergent), such as diffusion, natural selection, osmosis, and heat flow, using a Direct Schema to understand these processes will lead to robust misconceptions. Instead, a different type of general schema may be required to interpret non-sequential processes, which we refer to as an Emergent-causal Schema. We propose that students lack this Emergent Schema and teaching it to them may help them learn and understand emergent kinds of science processes such as diffusion. Our study found that directly teaching students this Emergent Schema led to increased learning of the process of diffusion.
This article presents a fine-grained characterization of each type of Schema, our instructional intervention, the successes we have achieved, and the lessons we have learned.
Several philosophers of science have advanced an instrumentalist thesis about the use of probabilities in evolutionary biology. I investigate the consequences of instrumentalism on evolutionary explanations. I take issue with Barbara Horan's (1994) argument that probabilities are unnecessary to explain evolutionary change given the underlying deterministic character of evolutionary processes. First, I question Horan's deterministic assumption. Then, I attempt to undermine her Laplacian argument by demonstrating that whether probabilities are necessary depends upon the sort of questions one is asking.
Matthen (Philos Sci 76(4):464–487, 2009) argues that explanations of evolutionary change that appeal to natural selection are statistically abstractive explanations, explanations that ignore some possible explanatory partitions that in fact impact the outcome. This recognition highlights a difficulty with making selective analyses fully rigorous. Natural selection is not about the details of what happens to any particular organism, nor, by extension, about the details of what happens in any particular population. Since selective accounts focus on tendencies, those factors that impact the actual outcomes but do not impact the tendencies must be excluded. So, in order to properly exclude the factors irrelevant to selection, the relevant factors must be identified, and physical processes, environments, and populations individuated on the basis of being relevantly similar for the purposes of selective accounts. Natural selection, on this view, becomes in part a measure of the robustness of particular kinds of outcomes given variations over some kinds of inputs.
This paper addresses the theoretical notion of a game as it arises across scientific inquiries, exploring its uses as a technical and formal asset in logic and science versus an explanatory mechanism. While games comprise a widely used method in a broad intellectual realm (including, but not limited to, philosophy, logic, mathematics, cognitive science, artificial intelligence, computation, linguistics, physics, economics), each discipline advocates its own methodology and a unified understanding is lacking. In the first part of this paper, a number of game theories in formal studies are critically surveyed. In the second part, the doctrine of games as explanations for logic is assessed, and the relevance of a conceptual analysis of games to cognition discussed. It is suggested that the notion of evolution plays a part in the game-theoretic concept of meaning.
The prevention, treatment and management of disease are closely linked to how the causes of a particular disease are explained. For multi-factorial conditions, the causal explanations are inevitably complex and competing models may exist to explain the same condition. Selecting one particular causal explanation over another will carry practical and ethical consequences that are acutely relevant for health policy. In this paper our focus is twofold: (i) the different models of causal explanation that are put forward within current scientific literature for the high and rising prevalence of the common complex conditions of coronary artery disease (CAD) and type 2 diabetes mellitus (T2D); and (ii) how these explanations are taken up (or not) within national health policy guidelines. We examine the causal explanations for these two conditions through a systematic database search of current scientific literature. By identifying different causal explanations we propose a three-tier taxonomy of the most prominent models of explanation: (i) evolutionary, (ii) lifecourse, and (iii) lifestyle and environment. We elaborate this taxonomy with a micro-level thematic analysis to illustrate how some explanations are semantically and rhetorically foregrounded over others. We then investigate the uptake of the scientific causal explanations in health policy documents with regard to the prevention and management recommendations of the current National Service Frameworks for CAD and T2D. Our findings indicate a lack of congruence between the complexity and frequent overlap of causal explanations evident in the scientific literature and the predominant focus on lifestyle recommendations found in the mainstream health policy documents.
Even with the lack of consensus on the nature of an argument, the thesis that explanations and arguments are distinct is near orthodoxy in well-known critical thinking texts and in the more advanced argumentation literature. In this paper, I reconstruct two rationales for distinguishing arguments from explanations. According to one, arguments and explanations are essentially different things because they have different structures. According to the other, while some explanations and arguments may have the same structure, they are different things because explanations are used for different purposes than arguments. I argue that both rationales fail to motivate a distinction between arguments and explanations. Since these are the only rationales for distinguishing arguments from explanations that I am prepared to take seriously, I don’t see why we should exclude explanations from being arguments.
In this paper I argue that Marxist studies of schools have overlooked the power of intentional explanations to explain schooling practices and policies. This oversight is at least in part due to many radical analyses failing to distinguish between explaining the acquisition and persistence of beliefs and determining the social consequences that follow from acting on beliefs. I further contend that radical researchers examining schooling practices must develop a more rigorous and refined conception of capitalist class interests.
In this paper I briefly reply to Shant Shahbazian’s comments on my paper “Austere quantum mechanics as a reductive basis for chemistry” and argue that the quantum theory of atoms in molecules can be characterised as a research programme in the theories of chemistry. I also explore the areas in which Shahbazian and I agree and disagree.
We attack the SSK's rejection of the distinction between discovery and justification (the DJ distinction), famously introduced by Hans Reichenbach and here defended in a "lean" version. Some critics claim that the DJ distinction cannot be drawn precisely, or that it cannot be drawn prior to the actual analysis of scientific knowledge. Others, instead of trying to blur or to reject the distinction, claim that we need an even more fine-grained distinction (e.g. between discovery, invention, prior assessment, test and justification). Adherents of the SSK, however, maintain that the distinction is useless and perhaps nonexistent. We first argue against the assumption that the SSK's objection to the DJ distinction is just the same as Thomas Kuhn's. Second, we point out general weaknesses of the SSK's arguments against the DJ distinction. Finally, we argue that the distinction is useful not only in order to explicate what is meant by an evaluation but even for the empirical explanation of knowledge. We use two case studies from the history of cognitive science to support this point.
In The Scientific Image B. C. van Fraassen argues that a theory of explanation ought to take the form of a theory of why-questions, and a theory of this form is what he provides. Van Fraassen's account of explanation is good, as far as it goes. In particular, van Fraassen's theory of why-questions adds considerable illumination to the problem of alternative explanations in psychodynamics. But van Fraassen's theory is incomplete because it ignores those classes of explanations that are answers not to why-questions but to how-questions. In this article I provide a unified theory of explanatory questions that comprehends both how-questions and why-questions, and I show that a question-theoretic approach to explanation can be defended independently of van Fraassen's programme of Constructive Empiricism.
A compelling idea holds that reality has a layered structure. We often disagree about what inhabits the bottom layer (or even if there is one), but we agree that higher up we find chemical, biological, geological, psychological, sociological, economic, etc., entities: molecules, human beings, diamonds, mental states, cities, interest rates, and so on. How is this intuitive talk of a layered structure of entities to be understood? Traditionally, philosophers have proposed to understand layered structure in terms of either reduction or supervenience. But these traditional views face well-known problems. A plausible alternative is that layered structure is to be explicated by appeal to explanations of a certain sort, termed grounding explanations. Grounding explanations tell us what obtains in virtue of what. Unfortunately, the use of grounding explanations to articulate the layered conception faces a problem, which I call the collapse. The collapse turns on the question of how to ground the facts stated by the explanations themselves. In this paper I make a suggestion about how to ground explanations that avoids the collapse. Briefly, the suggestion is that the fact stated by a grounding explanation is grounded in its explanans.
While agreeing that dynamical models play a major role in cognitive science, we reject Stepp, Chemero, and Turvey's contention that they constitute an alternative to mechanistic explanations. We review several problems dynamical models face as putative explanations when they are not grounded in mechanisms. Further, we argue that the opposition of dynamical models and mechanisms is a false one and that those dynamical models that characterize the operations of mechanisms overcome these problems. By briefly considering examples involving the generation of action potentials and circadian rhythms, we show how decomposing a mechanism and modeling its dynamics are complementary endeavors.
In this paper we argue that structural explanations are an effective way of explaining well-known relativistic phenomena like length contraction and time dilation, and then try to understand how this can be possible by looking at the literature on scientific models. In particular, we ask whether and how a model like that provided by Minkowski spacetime can be said to represent the physical world, in such a way that it can successfully explain physical phenomena structurally. We conclude by claiming that a partial isomorphic approach to scientific representation can supply an answer only if supplemented by a robust injection of pragmatic factors.
This paper defends my claim in earlier work that certain non-causal conditions are sufficient for the truth of some reasons explanations of actions, against the critique of this claim given by Randolph Clarke in his book, Libertarian Accounts of Free Will.
Many cognitive scientists, having discovered that some computational-level characterization f of a cognitive capacity φ is intractable, invoke heuristics as algorithmic-level explanations of how cognizers compute f. We argue that such explanations are actually dysfunctional, and rebut five possible objections. We then propose computational-level theory revision as a principled and workable alternative.
This paper argues that besides mechanistic explanations, there is a kind of explanation that relies upon “topological” properties of systems in order to derive the explanandum as a consequence, and which does not consider mechanisms or causal processes. I first investigate topological explanations in the case of ecological research on the stability of ecosystems. Then I contrast them with mechanistic explanations, thereby distinguishing the kind of realization they involve from the realization relations entailed by mechanistic explanations, and explaining how both kinds of explanations may be articulated in practice. The second section, expanding on the case of ecological stability, considers the phenomenon of robustness at all levels of the biological hierarchy in order to show that topological explanations are indeed pervasive there. Reasons are suggested for this, and “neutral network” explanations are singled out as a form of topological explanation that spans many levels. Finally, I appeal to the distinction of explanatory regimes to cast light on a controversy in philosophy of biology, the issue of contingency in evolution, which is shown to essentially involve issues about realization.
David Friedrich Strauss is best known for his mythical interpretation of the Gospel narratives. He opposed both the supernaturalists (who regarded the Gospel stories as reliable) and the rationalists (who offered natural explanations of purportedly supernatural events). His mythical interpretation suggests that many of the stories about Jesus were woven out of pre-existing messianic beliefs and expectations. Picking up this suggestion, I argue that the Gospel writers thought paradigmatically rather than historically. A paradigmatic explanation assimilates the event-to-be-explained to what is thought to be a prototypical instance of divine action. It differs from a historical or scientific explanation insofar as it does not specify the conditions under which it should be applied. It is, therefore, a wonderfully flexible way to understand the present in the light of the past.