Physicalism, or roughly the view that the stuff that physics talks about is all the stuff there is, has had a popular press in philosophical circles during the twentieth century. And yet, at the same time, it has become quite fashionable lately to believe that the mind matters in this world after all and that psychology is an autonomous science irreducible to physics. However, if (true, downward) mental causation implies non-reducibility and Physicalism implies the converse, it is hard to see how these two views could be compatible. This paper reviews some classical arguments purportedly showing how the autonomy of the special sciences can be upheld without violating the laws of physics or the principle that physics constitutes a complete and closed system. These arguments are presented in order of increasing strength, indicating how the more popular arguments in fact fall short of establishing anti-reductionism of the intended kind. New arguments are added which claim to demonstrate quite effectively how downward causation is possible compatibly with the reign of physics. The paper begins with a section which distinguishes various kinds of reductionism.
We examine the pros and cons of color realism, exposing some desiderata on a theory of color: the theory should render colors as scientifically legitimate and correctly individuated, and it should explain how we have veridical color experiences. We then show that these desiderata can be met by treating colors as properties of the special sciences. According to our view, some of the major disputes in the literature about color -- anti-realism versus dispositionalism versus reductionism -- are not well-founded at this stage of scientific inquiry. Our account of color is designed to be of use in the sciences and as such is driven largely by considerations of what the various sciences need in order to proceed appropriately. We argue that a scientific theory of colors need not regard colors as anything more than high-level statistical constructs built out of correlations between color experiences and other phenomena.
The primacy of physics generates a philosophical problem that the naturalist must solve in order to be entitled to an egalitarian acceptance of the ontological commitments he or she inherits from the special sciences and fundamental physics. The problem is the generalized causal exclusion argument. If there is no genuine causation in the domains of the special sciences but only in fundamental physics, then there are grounds for doubting the existence of macroscopic objects and properties, or at least the concreteness of them. The aim of this paper is to show that the causal exclusion problem derives its force from a false dichotomy between Humeanism about causation and a notion of productive or generative causation based on a defunct model of the physical world. †To contact the author, please write to: Department of Philosophy, University of Bristol, 9 Woodland Rd., Bristol BS8 1TB, UK.
This paper describes an alternative to the common view that explanation in the special sciences involves subsumption under laws. According to this alternative, whether or not a generalization can be used to explain has to do with whether it is invariant rather than with whether it is lawful. A generalization is invariant if it is stable or robust in the sense that it would continue to hold under a relevant class of changes. Unlike lawfulness, invariance comes in degrees and has other features that are well suited to capture the characteristics of explanatory generalizations in the special sciences. For example, a generalization can be invariant even if it has exceptions or holds only over a limited spatio-temporal interval. The notion of invariance can be used to resolve a number of dilemmas that arise in standard treatments of explanatory generalizations in the special sciences.
In recent work on the foundations of statistical mechanics and the arrow of time, Barry Loewer and David Albert have developed a view that defends both a best system account of laws and a physicalist fundamentalism. I argue that there is a tension between their account of laws, which emphasizes the pragmatic element in assessing the relative strength of different deductive systems, and their reductivism or fundamentalism. If we take the pragmatic dimension in their account seriously, then the laws of the special sciences should be part of our best explanatory system of the world, as well.
The systems studied in the special sciences are often said to be causally autonomous, in the sense that their higher-level properties have causal powers that are independent of those of their more basic physical properties. This view was espoused by the British emergentists, who claimed that systems achieving a certain level of organizational complexity have distinctive causal powers that emerge from their constituent elements but do not derive from them. More recently, non-reductive physicalists have espoused a similar view about the causal autonomy of special-science properties. They argue that since these properties can typically have multiple physical realizations, they are not identical to physical properties, and further they possess causal powers that differ from those of their physical realizers. Despite the orthodoxy of this view, it is hard to find a clear exposition of its meaning or a defence of it in terms of a well-motivated account of causation. In this paper, we aim to address this gap in the literature by clarifying what is implied by the doctrine of the causal autonomy of special-science properties and by defending the doctrine using a prominent theory of causation from the philosophy of science. The theory of causation we employ is a simplified version of an “interventionist” theory advanced by James Woodward (2003, forthcoming a, b), according to which a cause makes a counterfactual difference to its effects. In terms of this theory, it is possible to show that a special-science property can make a difference to some effect while the physical property that realizes it does not. Although other philosophers have also used counterfactual analyses of causation to argue for the causal autonomy of special-science properties, the theory of causation we employ is able to establish this with an unprecedented level of precision.
One of the jobs of philosophers of the special sciences is to connect the local concerns of particular disciplines with those of philosophy in general. The two-way complexities of this task are well-illustrated by the case of causation. On the one hand—from the outside, as it were—philosophers interested in general issues about causation are prone to turn to the special sciences for real-life examples of the use of causal notions. On the other hand, from the inside, the special disciplines themselves throw up philosophical puzzles in which the notion of causation plays a role. When does correlation indicate causation, for example? Physics and economics both generate hard cases of this kind.
It is widely held that disciplines are autonomous when their taxonomies are “substrate neutral” and when the events, states and processes that realize their descriptive vocabulary are heterogeneous. This will be particularly true in the case of disciplines whose taxonomy consists largely in terms that individuate by function. Having concluded that the multiple realization of functional kinds is far less widespread than assumed or argued for, Shapiro cannot avail himself of the argument for the autonomy of the special sciences which relies on multiple realization. This makes urgent the question of whether we must “now give up the idea that functionalist taxonomies have any scientific value?” [p. 650]. He acknowledges that we must either deny that the special sciences are autonomous, because higher level kinds have only a single realization and can thus be reduced, or else we must deny that there are empirical laws in the special sciences. “In other words, either special sciences have no ontological independence from lower level sciences or, worse, they have no empirical laws, which is to say that they are not empirical sciences at all. [p. 650]” Shapiro’s reductionist/eliminativist dilemma for the special sciences is unreal. For he has not canvassed the most important source of multiple realization in nature, and this source obviates his dilemma for most of the special sciences. Moreover, the route he offers between the horns of his dilemma leads pretty directly to impalement on its eliminativist horn. Or so I shall try to show in this comment.
The Kripkean conception of natural kinds (kinds are defined by essences that are intrinsic to their members and that lie at the microphysical level) indirectly finds support in a certain conception of a law of nature, according to which generalizations must have unlimited scope and be exceptionless to count as laws of nature. On my view, the kinds that constitute the subject matter of special sciences such as biology may very well turn out to be natural despite the fact that their essences fail to be microphysical or micro-based. On the causal conception of natural kinds I privilege, the naturalness of a kind is a function of the fact that it figures prominently in at least one causal law. However, there is a strong tendency prevailing among contemporary philosophers to assume that, in order to count as proper laws, generalizations must be exceptionless. Since most generalizations tracked down by the special sciences turn out not to fulfill these criteria, what this conception of a law implies is that most of the generalizations the special sciences trade in are not proper laws. It follows that, on this view, most if not all of the kinds the special sciences deal with turn out not to constitute natural kinds, understood as kinds to which bona fide laws apply. In order to establish that the non-microstructurally defined kinds that fall within the domain of enquiry of the special sciences are eligible for the status of natural kind, I must therefore establish that generalizations needn’t have unlimited scope and be exceptionless to count as laws of nature. This is precisely what I seek to do in this paper.
I begin by arguing that the question “what is a law of nature?” is most naturally interpreted as the question “what features must generalizations exhibit in order to ground scientific explanations?” and by offering reasons to believe that generalizations needn’t be exceptionless and have unlimited scope to play the crucial role laws have been thought to play in scientific explanation. Drawing on Sandra Mitchell’s [Mitchell, S. (2000). Philosophy of Science, 67, 242–265] and James Woodward’s [Woodward, J. (1997). Philosophy of Science, 64 (Proceedings), 524–541; Woodward, J. (2000). British Journal for the Philosophy of Science, 51(2), 197–254; Woodward, J. (2001). Philosophy of Science, 68, 1–20] work, I subsequently develop an alternative account of the criteria generalizations must satisfy in order to count as laws of nature, which at least some of the generalizations of the special sciences turn out to fulfill. I thus give credence to the idea that at least some of the kinds that fall within the domain of the special sciences figure in laws of nature, and I thereby restore the possibility that some special science kinds deserve to be deemed natural.
In many of the special sciences, mathematical models are used to provide information about specified target systems. For instance, population models are used in ecology to make predictions about the abundance of real populations of particular organisms. The status of mathematical models, though, is unclear and their use is hotly contested by some practitioners. A common objection levelled against the use of these models is that they ignore all the known, causally-relevant details of the often complex target systems. Indeed, the objection continues, mathematical models, by their very nature, abstract away from what matters and thus cannot be relied upon to provide any useful information about the systems they are supposed to represent. In this paper, I will examine the role of some typical mathematical models in population ecology and elsewhere. I argue that while, in a sense, these models do ignore the causal details, this move can not only be justified, it is necessary. I will argue that idealising away from complicating causal details often gives a clearer view of what really matters. And often what really matters is not the push and shove of base-level causal processes, but higher-level predictions and (non-causal) explanations.
The traditional view of science holds that science is essentially nomothetic—that is, the defining characteristic of science is that it seeks to discover and formulate laws for the phenomena in its domain, and that laws are required for explanation and prediction. This paper advances the thesis that there are no laws in the special sciences, sciences other than fundamental physics, and that this does not impugn their status as sciences. Toward this end, two arguments are presented. The first begins with Donald Davidson’s argument against psychophysical laws and develops a more perspicacious general argument against special science laws. The second is a generalized and more explicitly motivated argument based on J. J. C. Smart’s claim that biology, unlike physics, has no laws.
This paper explores whether it is possible to reformulate or re-interpret Lewis’s theory of fundamental laws of nature—his “best system analysis”—in such a way that it becomes a useful theory for special science laws. One major step in this enterprise is to make plausible how law candidates within best system competitions can tolerate exceptions—this is crucial because we expect special science laws to be so-called “ceteris paribus laws”. I attempt to show how this is possible and also how we can thereby make the first step towards a solution for the infamous difficulties surrounding the troublesome ceteris paribus clause. The paper outlines the general ideas of the theory but also points out some of its difficulties and background assumptions.
David Lewis ([1986b]) gives an attractive and familiar account of counterfactual dependence in the standard context. This account has recently been subject to a counterexample from Adam Elga. In this article, I formulate a Lewisian response to Elga’s counterexample. The strategy is to add an extra criterion to Lewis’s similarity metric, which determines the comparative similarity of worlds. This extra criterion instructs us to take special science laws into consideration as well as fundamental laws. I argue that the Second Law of Thermodynamics should be seen as a special science law, and give a brief account of what Lewisian special science laws should look like. If successful, this proposal blocks Elga’s counterexample.
A new direction in philosophy: Between 1920 and 1940 logical empiricism reset the direction of philosophy of science and much of the rest of Anglo-American philosophy. It began as a relatively organized movement centered on the Vienna Circle and like-minded philosophers elsewhere, especially in Berlin. As Europe drifted into the Nazi era, several important figures, especially Carnap and Neurath, also found common ground in their liberal politics and radical social agenda. Together, the logical empiricists set out to reform traditional philosophy with a new set of doctrines more firmly grounded in logic and science. Criticism and decline: Because of Nazi persecution, most of the European adherents of logical empiricism moved to the United States in the late 1930s. During the 1940s, many of their most cherished tenets became targets of criticism from outsiders as well as from within their own ranks. Philosophers of science in the late 1950s and 1960s rejected logical empiricism and, starting in the 1970s, presented alternative programs such as scientific realism and evolutionary epistemology. A resurgence of interest: During the early 1980s, philosophers and historians of philosophy began to study logical empiricism as an important movement. Unlike their predecessors in the 1960s, for whom the debate over logical empiricism now seems to have been largely motivated by professional politics, these philosophers no longer have to take positions for or against logical empiricism. The result has been a more balanced view of that movement, its achievements, its failures, and its influence. Hard-to-find core writings now available: This collection makes available a selection of the most influential and representative writings of the logical empiricists, important contemporary criticisms of their doctrines, their responses, as well as the recent reappraisals.
Introductions to each volume examine the articles in historical context and provide important background information that is vital to a full understanding of the issues discussed. They outline prevalent trends, identify leading figures, and summarize their positions and reasoning, as well as those of opposing thinkers.
Non-reductive physicalism accepts the primacy of the physical while aiming to avoid the constraints of traditional reduction. It respects physicalism via the doctrine that all properties metaphysically supervene on physical properties. It avoids traditional reduction via the thesis that many properties cannot be type-identified with physical properties. The viability of non-reductive physicalism has been extensively discussed over the half-century since it was first explored by Putnam (1960, 1967) and Davidson (1970). Most of the debate has focused on whether non-reductive physicalism can accommodate non-physical causes (cf. Kim 1993; Robb and Heil 2003: sect. 6). However, there has been far less discussion of whether non-reductive physicalism can accommodate non-physical laws (though see Block 1997; Kim 1992; Macdonald 1992; Millikan 1999; Papineau 1985, 1992). In this chapter I wish to focus first on the issue of non-physical laws. This will turn out to cast some useful light on the question of non-physical causation. Not all non-reductive physicalists think that there are non-physical laws. Davidson, for example, does not (1976). Even so, it is widely supposed that there can be laws in ‘special sciences’ like biology, psychology, and economics even though their categories do not reduce to physical types. The locus classicus for this position is Fodor’s ‘Special Sciences’ (1974). Fodor made his analysis graphic in what must be the most-reproduced diagram in philosophy.
David Papineau, Jerry Fodor and many others wonder how the conjunction of the following three positions can be true: 1) Special science laws: There are lawlike generalizations in the special sciences. These sciences trade in kinds that are such that statements about salient, reliable correlations that are projectible and that support counterfactuals apply to the tokens coming under these kinds. 2) Non-reductionism: The laws of some of the special sciences cannot be reduced to physical laws. 3) Physicalism: Everything there is in the world supervenes on the physical, that is, is fixed by the distribution of the physical properties in the world. The obvious problem is that (3) implies that the similarities among tokens in the world, accounting for the kinds in which the special sciences trade, and the correlations among such tokens, accounting for the laws of the special sciences, are fixed by the distribution of the physical properties. By contrast, (2) implies that some of the laws capturing such correlations are not reducible to physical laws. By using the term “token”, I mean a particular instantiating a property. Papineau’s proposal to reconcile these three positions is to account for (2) in terms of selection (pp. 6-9): There can be laws in the special sciences that are not reducible to physical laws if and only if these laws focus on effects that are selected for in a given context independently of the mechanisms by which they are brought about. Thus, the fact of there being such laws and their non-reducibility to physics do not contradict physicalism (3). The drawback is that the kinds that figure in such laws cannot enter into a rich network of laws and that nothing can be causally efficacious insofar as it is a member of such a kind. In these comments, I shall try to push Papineau further in the direction of a reductive physicalism, thus solving the problem by simply abandoning (2).
Discussion of moral explanation has reached an impasse, with proponents of contemporary ethical naturalism upholding the explanatory integrity of moral facts and properties, and opponents – including both anti-realists and non-naturalistic realists – insisting that such robustly explanatory pretensions as moral theory has be explained away. I propose that the key to solving the problem lies in the question whether instances of moral properties are causally efficacious. It is argued that, given the truth of contemporary ethical naturalism, moral properties are causally efficacious if the properties of the special sciences are. Certain objections are rebutted involving the nature of causation, on the one hand, and putative special features of the moral realm, on the other.
Ross & Spurrett (R&S) argue that Kim's reductionism rests on a restricted account of supervenience and a misunderstanding about causality. I contend that broadening supervenience does nothing to avoid Kim's argument and that it is difficult to see how employing different notions of causality helps to avoid the problem. I end by sketching a different solution.
The issue of downward causation (and mental causation in particular) and the exclusion problem are discussed by taking into account some recent advances in the philosophy of science. The problem is viewed from the perspective of the new interventionist theory of causation developed by Woodward. It is argued that from this viewpoint, a higher-level (e.g., mental) state can sometimes truly be causally relevant, and moreover, that the underlying physical state which realizes it may fail to be such.
Some causal explanations are non-committal in that mention of a property in the explanans conveys information about the causal origin of the explanandum even if the property in question plays no causal role for the explanandum. Programme explanations are a variety of non-committal causal (NCC) explanations. Yet their interest is very limited since, as I will argue in this paper, their range of applicability is in fact quite narrow. However, there is at least another variety of NCC explanations, causal orientation explanations, which offer a plausible model for many explanations in the special sciences.
Philosophers and non-philosophers have been attracted to the idea that the world incorporates levels of being: higher-level items – ordinary objects, artifacts, human beings – depend on, but are not in any sense reducible to, items at lower levels. I argue that the motivation for levels stems from an implicit acceptance of a Picture Theory of language according to which we can ‘read off’ features of the world from ways we describe the world. Abandonment of the Picture Theory opens the way to a ‘no levels’ conception of reality, a conception that honors anti-reductionist sentiments and preserves the status of the special sciences without the ontological baggage.
An asymmetry between the demands at the computational and algorithmic levels of description furnishes the illusion that the abstract profile at the computational level can be multiply realized, and that something is actually being shared at the algorithmic one. A disembodied rendering of the situation lays the stress upon the different ways in which an algorithm can be implemented. However, from an embodied approach, things look rather different. The relevant pairing, I shall argue, is not between implementation and algorithm, but rather between algorithm and computation. The autonomy of psychology is a result of the failure to appreciate this point.
If we are on the outside, we assume a conspiracy is the perfect working of a scheme. Silent nameless men with unadorned hearts. A conspiracy is everything that ordinary life is not. It’s the inside game, cold, sure, undistracted, forever closed off to us. We are the flawed ones, the innocents, trying to make some rough sense of the daily jostle. Conspirators have a logic and a daring beyond our reach. All conspiracies are the same taut story of men who find coherence in some criminal act. — Don DeLillo, “In Dallas,” pt. 2, Libra (1988).
John Earman and John T. Roberts advocate a challenging and radical claim regarding the semantics of laws in the special sciences: the statistical account. According to this account, a typical special science law “asserts a certain precisely defined statistical relation among well-defined variables” (Earman and Roberts 1999) and this statistical relation does not require being hedged by ceteris paribus conditions. In this paper, we raise two objections against the attempt to cash out the content of special science generalizations in statistical terms.
Laws of nature seem to take two forms. Fundamental physics discovers laws that hold without exception, ‘strict laws’, as they are sometimes called; even if some laws of fundamental physics are irreducibly probabilistic, the probabilistic relation is thought not to waver. In the nonfundamental, or special, sciences, matters differ. Laws of such sciences as psychology and economics hold only ceteris paribus – that is, when other things are equal. Sometimes events accord with these ceteris paribus laws (c.p. laws, hereafter), but sometimes the laws are not manifest, as if they have somehow been placed in abeyance: the regular relation indicative of natural law can fail in circumstances where an analogous outcome would effectively refute the assertion of strict law. Many authors have questioned the supposed distinction between strict laws and c.p. laws. The brief against it comprises various considerations: from the complaint that c.p. clauses are void of meaning to the claim that, although understood well enough, they should appear in all law-statements. These two concerns, among others, are addressed in due course, but first, I venture a positive proposal. I contend that there is an important contrast between strict laws and c.p. laws, one that rests on an independent distinction between combinatorial and noncombinatorial nomic principles. Instantiations of certain properties, e.g., mass and charge, nomically produce individual forces, or more generally, causal influences, in accordance with noncombinatorial …
Chemistry as the special science of the elements. Klaus Ruthenberg (Faculty of Science, Coburg University of Applied Sciences, 96406 Coburg, Germany). Metascience, DOI 10.1007/s11016-010-9458-4 (Online ISSN 1467-9981; Print ISSN 0815-0796).
The view that special science properties are multiply realizable has been attacked in recent years by Shapiro, Bechtel and Mundale, Polger, and others. Focusing on psychological and neuroscientific properties, I argue that these attacks are unsuccessful. By drawing on interspecies physiological comparisons I show that diverse physical mechanisms can converge on common functional properties at multiple levels. This is illustrated with examples from the psychophysics and neuroscience of early vision. This convergence is compatible with the existence of general constraints on the evolution of cognitive systems, and does not involve any ad hoc typing of coarse-grained higher level properties. The mechanisms that realize these common higher level properties are really distinct by the criteria laid down by critics of multiple realizability. Finally, I present an account of how such functional properties might constitute special science kinds by playing a central explanatory role in a range of cognitive models. Behavioral science kinds in particular are the functionally defined constituents picked out by our most successful models of the multilevel systems and mechanisms that explain cognitive capacities.
It is well known that people from other disciplines have made significant contributions to philosophy and have influenced philosophers. It is also true (though perhaps not often realized, since philosophers are not on the receiving end, so to speak) that philosophers have made significant contributions to other disciplines and have influenced researchers in these other disciplines, sometimes more so than they have influenced philosophy itself. But what is perhaps not as well known as it ought to be is that researchers in other disciplines, writing in those other disciplines' journals and conference proceedings, are doing philosophically sophisticated work, work that we in philosophy ignore at our peril. Work in cognitive science and artificial intelligence (AI) often overlaps such paradigmatic philosophical specialties as logic, the philosophy of mind, the philosophy of language, and the philosophy of action. This special issue offers a sampling of research in cognitive science and AI that is philosophically relevant and philosophically sophisticated.
One way to do socially relevant investigations of science is through conceptual analysis of scientific terms used in special-interest science (SIS). SIS is science having welfare-related consequences and funded by special interests, e.g., tobacco companies, in order to establish predetermined conclusions. For instance, because the chemical industry seeks deregulation of toxic emissions and avoiding costly cleanups, it funds SIS that supports the concept of “hormesis” (according to which low doses of toxins/carcinogens have beneficial effects). Analyzing the hormesis concept of its main defender, chemical-industry-funded Edward Calabrese, the paper shows Calabrese and others fail to distinguish three different hormesis concepts, H, HG, and HD. H requires toxin-induced, short-term beneficial effects for only one biological endpoint, while HG requires toxin-induced, net-beneficial effects for all endpoints/responses/subjects/ages/conditions. HD requires using the risk-assessment/regulatory default rule that all low-dose toxic exposures are net-beneficial, thus allowable. Clarifying these concepts, the paper argues for five main claims. (1) Claims positing H are trivially true but irrelevant to regulations. (2) Claims positing HG are relevant to regulation but scientifically false. (3) Claims positing HD are relevant to regulation but ethically/scientifically questionable. (4) Although no hormesis concept (H, HG, or HD) has both scientific validity and regulatory relevance, Calabrese and others obscure this fact through repeated equivocation, begging the question, and data-trimming. Consequently (5) their errors provide some undeserved rhetorical plausibility for deregulating low-dose toxins.
The paper addresses the problem of the delay of the social sciences with respect to the natural sciences. It is argued that there are no special differences between them from a methodological point of view. The methodology of both can be understood in terms of the idealizational conception of science. Nor is the subject-matter the source of the problems. It is argued that it is the social placement of the social sciences within wider communities that is responsible for the delay.
The authors present a balanced critique of the adaptation/exaptation debate and specify some of the hard evidentiary criteria that are needed to advance our understanding of human evolution. Investigators must build more “special design” criteria into their theorizing and research. By documenting that certain traits meet these rigorous criteria, the evolutionary sciences will ultimately rest on a firmer theoretical foundation.
The adaptationist framework is necessary and sufficient for unifying the social and natural sciences. Gintis's “beliefs, preferences, and constraints” (BPC) model compares unfavorably to this framework because it lacks criteria for determining special design, incorrectly assumes that standard evolutionary theory predicts individual rationality maximisation, does not adequately recognize the impact of psychological mechanisms on culture, and is mute on the behavioural implications of intragenomic conflict. (Published Online April 27 2007).
The reigning picture of the special sciences, what we will term the ‘received’ view, grew out of the work of writers, such as Jerry Fodor, William Wimsatt, and Philip Kitcher, who overturned the Positivist’s jaundiced view of these disciplines by looking at real cases from the biological sciences, linguistics, psychology, and economics, amongst other areas. Central to the received view is the ontological claim that the ‘multiple realization’ of properties is widespread in the special sciences, which we may frame thus.
This thesis examines the claim that the sciences are disunified. Chapter 1 outlines and introduces different accounts of the stratification of the sciences in the literature, in particular, Unificationism, Disunificationism, Eliminativism and Human Science Disunificationism. I argue that all of these competing views are informed by an ideal model for successful science. In particular, all of the views discussed are committed to the claim that a science requires laws to be considered scientifically legitimate. At the end of this chapter, the narrower topic of the thesis is revealed: do the special sciences have real, legitimate ceteris paribus laws?
The issue of whether there are laws in biology and the “special sciences” has been of interest owing to the debate about whether scientific explanation requires laws. A well-worn argument goes thus: no laws in social science, no explanations, or at least no scientific explanations, at most explanation-sketches. The conclusion is not just a matter of labeling. If explanations are not scientific they are not epistemically or practically reliable. There are at least three well-known diagnoses of where this argument goes wrong. First, the argument that there are no laws in social science adopts an account of laws that is too stringent, one that not even the physical sciences satisfy (Cartwright 1983, Mitchell 2000). On a less stringent definition, there are plenty of laws in social science (and biology). These laws are, sensu Fodor, “non-strict,” as opposed to the “strict laws” (if any—vide Cartwright 1983) of physics. Second, scientific explanation does not require laws, and when laws do explain, they do so because they satisfy some other requirement on scientific explanation, for example unification, or the identification of causal difference-makers (Friedman 1974, Kitcher 1989, Salmon 1984, Strevens 2009). A third view, increasingly attractive among philosophers of social science and biology, is due to James Woodward (2000, 2003). This view, like the second one, eschews laws and identifies causes as difference-makers. On this view explanations do require regularities, but these regularities need only satisfy a requirement of “invariance” under certain specified circumstances, in order to be explanatory…
This paper proposes an explanation in terms of three kinds of freedom, first for the special efficacy of science in general and then for why such efficacy has been more impressive in the natural than the social sciences. This explanation thus complements "post-positivist" interpretations of science which argue that science's effectiveness cannot be accounted for by fundamental epistemic differences from other kinds of discourse. My explanation tries to say what is responsible for science's effectiveness, in purely nonepistemic, sociological terms. All of the three kinds of freedom have so far been denied to most other forums, including in particular nations' populations taken overall. And one of these freedoms, while now allowed to the natural sciences, is still denied to the social sciences.
This paper emphasizes the crucial role of variation, at several different levels, for a detailed historical understanding of the development of the biomedical sciences. Going beyond valuable recent studies that focus on model organisms, experimental systems and instruments, we argue that all of these categories can be accommodated within our approach, which pays special attention to organismal and cultural variation. Our empirical examples are drawn in particular from recent historical studies of nineteenth- and early twentieth-century genetics and physiology. Based on the quasi-paradoxical conclusion that biological and cultural variation both constrains and enables innovation in the biomedical sciences, we argue that more attention should be paid to variation as an analytical category in the historiography of the life sciences.
This original and exciting study offers a completely new perspective on the philosophy of mathematics. Most philosophers of mathematics try to show either that the sort of knowledge mathematicians have is similar to the sort of knowledge specialists in the empirical sciences have or that the kind of knowledge mathematicians have, although apparently about objects such as numbers, sets, and so on, isn't really about those sorts of things at all. Jody Azzouni argues that mathematical knowledge really is a special kind of knowledge with its own special means of gathering evidence. He analyses the linguistic pitfalls and misperceptions philosophers in this field are often prone to, and explores the misapplications of epistemic principles from the empirical sciences to the exact sciences. What emerges is a picture of mathematics both sensitive to mathematical practice, and to the ontological and epistemological issues that concern philosophers.
Are indices a purely linguistic, textual phenomenon or are linguistic indices a special case of a more general type of indexical signs? In comparing Carlo Ginzburg's restrictive view of indices and traces in particular with Peirce's general approach to indexical signs, this paper argues that Peirce's account of indexicality makes it possible to connect the sciences and the humanities by a flexible relational concept of the epistemic function of the identification that indexical experiences allow for. In this way Peirce's flexible concept of indexicality allows us to connect, e.g., the experience of a condensation trace of an electron in a cloud chamber with that of the trace of a deer in the snow.
Since the 1990s, the social sciences have been undergoing their computational turn. This paper aims to clarify the epistemological meaning of this turn. To do this, we have to discriminate between different epistemic functions of computation among the diverse uses of computers for modeling and simulating in the social sciences. Because of the introduction of a new – and often more user-friendly – way of formalizing and computing, the question of the realism of formalisms and of the proof value of computational treatments reemerges. Facing the spread of computational simulations in all disciplines, some enthusiastic observers are claiming that we are entering a new era of unity for the social sciences. Finally, the article shows that the conceptual and epistemological distinctions presented in the first sections lead to a more mitigated position: the transdisciplinary computational turn is a great one, but it is of a methodological nature.
The problems dealt with in The Idea of a Social Science are philosophical. It is an attempt to place the social sciences, considered as a single group, on the intellectual map, with special attention to the relations of the discipline to philosophy on the one hand and the natural sciences on the other. The author holds that the relation between the social sciences and philosophy is commonly misunderstood because of certain fashionable misconceptions about the nature of philosophy, and because of an incorrect assessment of the significance of some of Wittgenstein's contributions. He discusses the influence of the natural sciences on our conception of the social sciences and examines some of the most influential ideas of J.S. Mill, Pareto and Max Weber.
Investigators of animal behavior since the eighteenth century have sought to make their work integral to the enterprises of natural history and/or the life sciences. In their efforts to do so, they have frequently based their claims of authority on the advantages offered by the special places where they have conducted their research. The zoo, the laboratory, and the field have been major settings for animal behavior studies. The issue of the relative advantages of these different sites has been a persistent one in the history of animal behavior studies up to and including the work of the ethologists of the twentieth century.
The object of this essay is to explain what there is about discussions of Judaism and the sciences that is distinctive from discussions about religion in general and the sciences. The description draws primarily but not exclusively from recent meetings of the Judaism, Medicine, and Science Group in Tempe, Arizona. The author's Jewish Faith and Modern Science, together with a selective bibliography of writings in this subfield, are used to generate a list of science issues—focused around the religious doctrines of creation, revelation, and redemption in Judaism—that raise specific challenges to Jewish faith. Special attention is given to Leon Kass's The Hungry Soul as an example of a distinctive way of integrating knowledge of both science and rabbinic Judaism on a philosophical issue.
Introduction to the special issue of the Journal of Agricultural and Environmental Ethics from EURSAFE 2010 (pages 1-4, DOI 10.1007/s10806-012-9390-2), by Leire Escajedo San-Epifanio (Department of Constitutional Law and History of Political Thought, Faculty of Social Sciences and Communication, University of the Basque Country, Bilbao, Spain) and Mickey Gjerris (Faculty of Science, Institute of Food and Resource Economics, University of Copenhagen, Copenhagen, Denmark).
A central mistake in Rolf Gruner's recent article on understanding in the social sciences is ferreted out, and consideration of it is used both to analyse Gruner's interpretation of understanding and to sketch a more adequate interpretation. The mistake is in distinguishing meanings and facts. The analysis suggests that Gruner was forced to see understanding both as a special kind of explanation and at the same time as no explanation. The sketch offers a distinction of three senses of 'understanding': as identification of a certain kind of subject matter, as explanation of it, and as a subjective feeling consequent upon such explanation.
Editors’ introduction to the special issue on the Causality and Explanation in the Sciences conference, held at the University of Ghent in September 2011.
In this paper we briefly examine and evaluate Quine’s physicalism. On the supposition, in accordance with Quine’s views, that there can be no change of any sort without a physical change, we argue that this point leaves plenty of room to understand and accept a limited autonomy of the special sciences and of other domains of disciplinary and common-sense inquiry and discourse. The argument depends on distinguishing specific, detailed programs of reduction from the general Quinean strategy of reduction by explication. We argue that the details of the relations of particular sciences, disciplines and domains of discourse depend on empirical evidence and empirical-theoretical developments and that the generalized approach of reduction by explication is also subject to related empirical-theoretical constraints. So understood, physicalism lacks much of the controversial force and many of the implications sometimes associated with it.
In contrast with the development of big theories in the context of the social sciences, there is nowadays an increasing interest in the construction of simulation models for complex phenomena. Those simulation models suggest a certain image of the social sciences as a kind of, let us say, "patchwork". In that image, an increase in understanding about the phenomena modeled is obtained through a certain sort of aggregation. There is not an application of sound, established theories to all the phenomena of a certain kind, but an aggregation of the structures supposed, and of the results obtained, when particular systems are modeled. The recent case of the "El Farol Bar" problem, and the models built in order to face this problem, are a good example of this. We will analyze that case, trying to make clear what would be implied by the image mentioned above. Special attention will be paid to the need to take seriously the notion of a bounded rationality, linked to the special circumstances generating each decision problem, and to the existence of an irreducible pluralism of models.
That space and time should be integrated into a single entity, spacetime, is the great insight of Einstein's special theory of relativity, and leads us to regard spacetime as a fundamental context in which to make sense of the world around us. But it is not the only one. Causality is equally important and at least as far as the special theory goes, it cannot be subsumed under a fundamentally geometrical form of explanation. In fact, the agent of propagation of causal influence is electromagnetic radiation. In this examination, the authors find support for a rationalist approach to physics, never neglecting experimentation, but rejecting a simple empiricist or positivist view of science.
ABSTRACT. May scientists rely on substantive, a priori presuppositions? Quinean naturalists say "no," but Michael Friedman and others claim that such a view cannot be squared with the actual history of science. To make his case, Friedman offers Newton's universal law of gravitation and Einstein's theory of relativity as examples of admired theories that both employ presuppositions (usually of a mathematical nature), presuppositions that do not face empirical evidence directly. In fact, Friedman claims that the use of such presuppositions is a hallmark of "science as we know it." But what should we say about the special sciences, which typically do not rely on the abstruse formalisms one finds in the exact sciences? I identify a type of a priori presupposition that plays an especially striking role in the development of empirical psychology. These are ontological presuppositions about the type of object a given science purports to study. I show how such presuppositions can be both a priori and rational by investigating their role in an early flap over psychology's contested status as a natural science. The flap focused on one of the field's earliest textbooks, William James's Principles of Psychology. The work was attacked precisely for its reliance on a priori presuppositions about what James had called the "mental state," psychology's (alleged) proper object. I argue that the specific presuppositions James packed into his definition of the "mental state" were not directly responsible to empirical evidence, and so in that sense were a priori; but the presuppositions were rational in that they were crafted to help overcome philosophical objections (championed by neo-Hegelians) to the very idea that there can be a genuine science of mind. Thus, my case study gives an example of substantive, a priori presuppositions being put to use—to rational use—in the special sciences.
In addition to evaluating James's use of presuppositions, my paper also offers historical reflections on two different strands of pragmatist philosophy of science. One strand, tracing back through Quine to C. S. Peirce, is more naturalistic, eschewing the use of a priori elements in science. The other strand, tracing back through Kuhn and C. I. Lewis to James, is more friendly to such presuppositions, and to that extent bears affinity with the positivist tradition Friedman occupies.
About the Series Contemporary philosophy of science combines a general study from a philosophical perspective of the methods of science, with an inquiry, again from the philosophical point of view, into foundational issues that arise in the various special sciences. Methodological philosophy of science has deep connections with issues at the center of pure philosophy. It makes use of important results, for example, in traditional epistemology, metaphysics and the philosophy of language. It also connects in various ways with other disciplines such as the history and sociology of the sciences, with pure logic, and with such branches of mathematics as probability theory. These volumes are, for the most part, devoted to readings in the methodological aspects of the philosophy of science. One volume, however, takes up the philosophical issues in the foundations of a particularly important special science, that is, the issues in the foundations of theories of contemporary physics. The methodological volumes cover a number of crucial general problem areas. The first volume takes up issues in the nature of scientific explanation, and the related issues of the nature of scientific law and of the causal relation among events. The second volume explores issues in the nature and structure of scientific theories. The third volume collects inquiries into the nature of scientific change, as one theory is replaced by another. Volume four is devoted to readings concerning the nature of probability and the nature and justification of inductive reasoning in science. The following volume continues the exploration of the issue of confirming and rejecting theories with a series of readings devoted to Bayesian methodologies in science and to the exploration of non-inductive strategies for rationalizing belief.
Finally, volume six explores three major problem areas in the foundation of physics: the nature and rationale for physical theories of space and time; the interpretive problems arising out of the quantum theory; and some puzzles arising out of statistical mechanical theories of physics. The readings are selected and arranged to provide the user with systematic access to the most important contemporary themes in methodological philosophy of science and in philosophy of physics. The selections include many recent contributions to the field, as well as papers and extracts from books and journals otherwise not easily available.
In this paper I expound an argument which seems to establish that probabilism and special relativity are incompatible. I examine the argument critically, and consider its implications for interpretative problems of quantum theory, and for theoretical physics as a whole.
Are special relativity and probabilism compatible? Dieks argues that they are. But the possible universe he specifies, designed to exemplify both probabilism and special relativity, either incorporates a universal "now" (and is thus incompatible with special relativity), or amounts to a many-worlds universe (which I have discussed, and rejected as too ad hoc to be taken seriously), or fails to have any one definite overall Minkowskian-type space-time structure (and thus differs drastically from special relativity as ordinarily understood). Probabilism and special relativity appear to be incompatible after all. What is at issue is not whether "the flow of time" can be reconciled with special relativity, but rather whether explicitly probabilistic versions of quantum theory should be rejected because of incompatibility with special relativity.
In this paper I show that Einstein made essential use of aim-oriented empiricism in scientific practice in developing special and general relativity. I conclude by considering to what extent Einstein came explicitly to advocate aim-oriented empiricism in his later years.
This survey of major developments in North American philosophy of science begins with the mid-1960s consolidation of the disciplinary synthesis of internalist history and philosophy of science (HPS) as a response to criticisms of logical empiricism. These developments are grouped for discussion under the following headings: historical metamethodologies, scientific realisms, philosophies of the special sciences, revivals of empiricism, cognitivist naturalisms, social epistemologies, feminist theories of science, studies of experiment and the disunity of science, and studies of science as practice and culture. A unifying theme of the survey is the relation between historical metamethodologists and scientific realists, which dominated philosophical work in the late 1970s. I argue that many of the alternative cognitive naturalisms, social epistemologies, and feminist theories that have been proposed can be understood as analogues to the differences between metamethodological theories of scientific rationality and realist accounts of successful reference to real causal processes. Recent work on experiment, scientific practice, and the culture of science may, however, challenge the underlying conception of the field according to which realism and historical rationalism (or their descendants) are the important alternatives available, and thus may take philosophy of science in new directions.
Replies to Kevin de Laplante’s ‘Certainty and Domain-Independence in the Sciences of Complexity’ (de Laplante, 1999), defending the thesis of J. Franklin, ‘The formal sciences discover the philosophers’ stone’, Studies in History and Philosophy of Science, 25 (1994), 513-33, that the sciences of complexity can combine certain knowledge with direct applicability to reality.
The last fifty years have seen the creation of a number of new "formal" or "mathematical" sciences, or "sciences of complexity". Examples are operations research, theoretical computer science, information theory, descriptive statistics, mathematical ecology and control theory. Theorists of science have almost ignored them, despite the remarkable fact that (from the way the practitioners speak) they seem to have come upon the "philosophers' stone": a way of converting knowledge about the real world into certainty, merely by thinking.
Conditions for philosophy of science in the Netherlands are not optimal. The climate of opinion in Dutch philosophy is unsympathetic to the sciences, partly because of the influence of theology. Dutch universities offer no taught graduate programmes in philosophy of science, which would provide an entry route for science graduates. A great deal of Dutch research in philosophy of science is affected by an exegetical attitude, which fosters the interpretation and evaluation of other writers rather than the development of original theories. Doctoral candidates in particular should be trained to greater originality and assertiveness. Nonetheless, much good research in philosophy of science is conducted in the Netherlands, both in philosophy faculties and in institutes dedicated to the foundations of the special sciences. Distinguished work is done also in the neighbouring disciplines of logic, history of science, and social studies of science.
Reductionism is often understood to include two theses: (1) every singular occurrence that the special sciences can explain also can be explained by physics; (2) every law in a higher-level science can be explained by physics. These claims are widely supposed to have been refuted by the multiple realizability argument, formulated by Putnam (1967, 1975) and Fodor (1968, 1975). The present paper criticizes the argument and identifies a reductionistic thesis that follows from one of the argument's premises.
The special composition question asks, roughly, under what conditions composition occurs. The common sense view is that composition only occurs among some things and that all and only ‘ordinary objects’ exist. Peter van Inwagen has marshaled a devastating argument against this view. The common sense view appears to commit one to giving what van Inwagen calls a ‘series-style answer’ to the special composition question, but van Inwagen argues that series-style answers are impossible because they are inconsistent with the transitivity of parthood. In what follows I answer this objection in addition to other, less troubling objections raised by van Inwagen.
This article aims to show that fundamentality is construed differently in the two most prominent strategies of analysis we find in physical science and engineering today: (1) atomistic, reductive analysis and (2) systems analysis. Correspondingly, atomism is the conception according to which the simplest (smallest) indivisible entity of a certain kind is most fundamental; while systemism, as will be articulated here, is the conception according to which the bonds that structure wholes are most fundamental, and scale and/or constituting entities are of no significance whatsoever for fundamentality. Accordingly, atomists maintain that the basic entities—the atoms—are fundamental, and together with the "external" interactions among them, are sufficient for illuminating all the features and behaviors of the wholes they constitute; whereas systemists proclaim that it is instead structural qualities of systems, which flow from internal relations among their constituents and translate directly into behaviors, that are fundamental, and by themselves largely (if not entirely) sufficient for illuminating the features and behaviors of the wholes thereby structured. Systemism, as will be argued, is consistent with the nonexistence of a fundamental "level" of nondecomposable entities, just as it is consistent with the existence of such a level. Still, systemism is a conception of the fundamental in quite different, but still ontological, terms. Systemism can serve the special sciences—the social sciences especially—better than the conception of fundamentality in terms of atoms. Systemism is, in fact, a conception of fundamentality that has rather different uses—and importantly, different resonances. This conception of fundamentality makes contact with questions pertaining to natural kinds and their situation in the metaphysics of the special sciences—their situation within an order of autonomous sciences.
The controversy over fundamentality is evident in the social sciences too, albeit somewhat imperfectly, in the terms of debate between methodological individualists and functionalists/holists. This article will thus clarify the difference between systemism and holism.
Philosophy of Science, broadly construed, is as old as philosophy itself. It was only in the early twentieth century that it emerged as a distinct sub-discipline with its own professional standards and institutional structures, and it has come a long way since these pioneering days. During the century’s first four decades the focus was primarily on what nowadays would be referred to as ‘general philosophy of science’, the study of problems that arise in all scientific disciplines alike. Since the 1960s philosophers have increasingly paid attention to issues in particular sciences – at the beginning primarily in mathematics, physics and biology. Later on chemistry, economics, and the social sciences followed suit. And in the last few decades these so-called ‘philosophies of the special sciences’ have reached unprecedented maturity. The 1960s saw another important development: the entry of history of science into philosophical debates, which led to a style of argument that based philosophical analysis on detailed historical case studies. In the 1980s, finally, sociological, and more generally humanities, perspectives on science entered the scene and contributed yet another way to approach the study of both the practices and the results of the sciences. This plenitude of schools, movements, and approaches can be bewildering, and keeping track of all of them is a Herculean task. So novices and seasoned practitioners alike may be grateful for a guiding hand, and that is what the new Routledge Companion to…
In this thesis, I argue that a good historical science will have the following characteristics: Firstly, it will seek to construct causal histories of the past. Secondly, the construction of these causal histories will utilise well-tested regularities of science. Additionally, well-tested regularities will secure the link between observations of physical traces and the causal events of interest. However, the historical sciences cannot use these regularities in a straightforward manner. The regularities must accommodate the idiosyncrasies of the past, and the degradation of evidence over time. Through an examination of how the historical sciences work in practice, I show how they can confirm these unique causal histories, and the limits to their confirmatory strategies.
Realism in Action is a selection of essays written by leading representatives in the fields of action theory and philosophy of mind, philosophy of the social sciences and especially the nature of social action, and of epistemology and philosophy of science. Practical reason, reasons and causes in action theory, intending and trying, and folk-psychological explanation are some of the topics discussed by these leading participants. A particular emphasis is laid on trust, commitments and social institutions, on the possibility of grounding social notions in individual social attitudes, on the nature of social groups, institutions and collective intentionality, and on common belief and common knowledge. Applications to the social sciences include, e.g., a look at the Erklären-Verstehen controversy in economics, and at constructivist and realist views on archeological reconstructions of the past.
This paper explores the relationship between psychology and neurobiology in the context of cognitive science. Are the sciences that constitute cognitive science independent and theoretically autonomous, or is there a necessary interaction between them? I explore Fodor's Multiple Realization Thesis (MRT), which starts with the fact of multiple realization and purports to derive the theoretical autonomy of special sciences (such as psychology) from structural sciences (such as neurobiology). After laying out the MRT, it is shown that, on closer inspection, the argument is either circular or self-undermining: the argument either assumes the very autonomy it seeks to demonstrate or the concluded autonomy is contradicted by the theoretical interdependence invoked by the premises of the argument. Next, I explore a concrete example of multiple realization in the explanation of animal behavior: the convergent evolution of jamming avoidance behaviors in three genera of weakly electric fish. Contrary to the image painted by the MRT, the work on these animals involves a high degree of interaction between the various levels of investigation. The fact that our understanding of electric fish behavior involves functional theories and multiple realization without the kind of disunified science that is supposed to follow from such a situation indicates that the mere fact of multiple realization cannot be the basis for an autonomous psychology.
Reductionism in the Philosophy of Science develops a novel account of reduction in science and applies it to the relationship between classical and molecular genetics. However, rather than addressing the epistemological issues that have been essential to the reductionism debate in philosophy of biology, the discussion primarily pursues ontological questions, as they are known, about reducing the mental to the physical. For Sachse construes reductionism as a purely philosophical endeavor and defends the possibility of reduction in principle, which may not be relevant to understanding reductionist reasoning and explanation occurring in scientific practice, as discussed by philosophers of science. Likewise, the conceptual framework used stems more from metaphysics and philosophy of mind than philosophy of science. Sachse's aim is twofold. First, he argues for the special sciences' being reducible to physics, by deriving the in-principle possibility of epistemological reduction from ontological reduction. Second, he attempts to simultaneously make room for the legitimacy of the special sciences, effecting a conservative reduction rather than an elimination of the special sciences.
The Claims of Common Sense investigates the importance of ideas developed by Cambridge philosophers between the World Wars for the social sciences concerning common sense, vague concepts, and ordinary language. John Coates examines the thought of Moore, Ramsey, Wittgenstein and Keynes, and traces their common drift away from early beliefs about the need for precise concepts and a canonical notation in analysis. He argues that Keynes borrowed from Wittgenstein and Ramsey their reappraisal of vague concepts, and developed the novel argument that when analysing something as complex as social reality, theory might be simplified by using concepts which lack sharp boundaries. Coates then contrasts this conclusion with the view shared by two contemporary philosophical paradigms - formal semantics and Continental post-structuralism - that the vagueness of ordinary language inevitably leads to interpretive indeterminacy. Developing a link between Cambridge philosophy and current work on complexity, vague predicates, and fuzzy logic, he argues that Wittgenstein's and Keynes's ideas on the economy of ordinary language present a mediating route for the social sciences between these philosophical paradigms.
Ross & Spurrett (R&S) fail to take metaphysics seriously because they do not make a clear enough distinction between how we understand the world and what the world is really like. Although they show that the behavioral and cognitive sciences are genuinely explanatory, it is not clear that they have shown that these special sciences identify properties that are genuinely causal.
1. Puzzle According to a standard view in contemporary metaphysics, there are no necessary connections between distinct properties. But according to a standard view in philosophy of mind, there are necessary connections between distinct properties. In short, we have a puzzle: standard metaphysics is inconsistent with standard philosophy of mind. By ‘a standard view in contemporary metaphysics’ I mean, of course, Hume’s dictum that there are no necessary connections between distinct existences. I don’t mean the historical Hume; whether the historical Hume held Hume’s dictum is, I am sure, a controversial issue, and will not concern us. What will concern us, rather, is the idea that contemporary metaphysicians such as David Lewis and David Armstrong discuss and attribute to Hume (see, e.g., Lewis 1986 and Armstrong 1997). Of course Hume’s dictum does not say anything explicitly about properties; it talks of existences rather than properties. But ‘existences’, I take it, means ‘things that exist’ and, if we set nominalism aside - as I will do here - properties are things that exist. Hence the Humean dictum entails as a special case that there are no necessary connections between distinct properties.