Is conceptual analysis required for reductive explanation? If there is no a priori entailment from microphysical truths to phenomenal truths, does reductive explanation of the phenomenal fail? We say yes (Chalmers 1996; Jackson 1994, 1998). Ned Block and Robert Stalnaker say no (Block and Stalnaker 1999).
Why do agent-relative reasons have authority over us, reflective creatures? Reductive accounts base the normativity of agent-relative reasons on agent-neutral considerations, such as the claim that parents caring especially for their own children best serves the interests of all children. Such accounts, however, beg the question about the source of the normativity of agent-relative ways of reason-giving. In this paper, I argue for a non-reductive account of the reflective necessity of agent-relative concerns. Such an account will reveal an important structural complexity of practical reasoning in general. Christine Korsgaard relates the rational binding force of practical reasons to the various identities or self-conceptions under which we value ourselves. The problem is that it is not clear why such self-conceptions would necessitate us rationally, given the fact that most of our identities are simply given. Perhaps Harry Frankfurt is right in arguing that we are necessitated not only by reason, but also, and predominantly, by what we love. I argue, however, that the necessities of love (in Frankfurt's phrase) are not to be separated from, but should be seen as belonging to, the necessities of reason. Our loves, concerns and related identities give practical reflection a specific and important structure. They function in the background of reasoning, playing a specific default role: they would lose their character as concerns if there were a need to cite them in the foreground of deliberation or to justify them. This does not mean that our deep concerns cannot be scrutinised. They can be scrutinised only in an indirect way, however, which explains their role in grounding the normativity of agent-relative reasons. It appears that this account can provide a viable interpretation of Korsgaard's argument about the foundational role of practical identities.
Lewisian Genuine Realism (GR) about possible worlds is often deemed unable to accommodate impossible worlds and reap the benefits that these bestow on rival theories. This thesis explores two alternative extensions of GR into the terrain of impossible worlds. It is divided into six chapters. Chapter I outlines Lewis' theory, the motivations for impossible worlds, and the central problem that such worlds present for GR: how can GR even understand the notion of an impossible world, given Lewis' reductive theoretical framework? Since the desideratum is to incorporate impossible worlds into GR without compromising Lewis' reductive analysis of modality, Chapter II defends that analysis against (old and new) objections. The rest of the thesis is devoted to incorporating impossible worlds into GR. Chapter III explores GR-friendly impossible worlds in the form of set-theoretic constructions out of genuine possibilia. Then, Chapters IV-VI venture into concrete impossible worlds. Chapter IV addresses Lewis' objection against such worlds, to the effect that contradictions true at impossible worlds amount to true contradictions tout court. I argue that even if so, the relevant contradictions are only ever about the non-actual, and that Lewis' argument relies on a premise that cannot be non-question-beggingly upheld in the face of genuine impossible worlds in any case. Chapter V proposes that Lewis' reductive analysis can be preserved, even in the face of genuine impossibilia, if we differentiate the impossible from the possible by means of accessibility relations, understood non-modally in terms of similarity. Finally, Chapter VI counters objections to the effect that there are certain impossibilities, formulated in Lewis' theoretical language, which genuine impossibilia should, but cannot, represent. I conclude that Genuine Realism is still very much in the running when the discussion turns to impossible worlds.
My topic is two-fold: a reductive account of expertise as an epistemic phenomenon, and the application of that account to the question of whether philosophers enjoy expertise. I conclude, on the basis of the reductive account, that even though philosophers enjoy something akin to second-order expertise (i.e. they are often experts on the positions of other philosophers, current trends in the philosophical literature, the history of philosophy, conceptual analysis and so on), they nevertheless lack first-order philosophical expertise (i.e. expertise on philosophical positions themselves, such as the nature of mind, causality, normativity and so forth). Throughout the paper, I respond to potential objections.
Some argue that Lewisian realism fails as a reduction of modality because in order to meet some criterion of success the account needs to invoke primitive modality. I defend Lewisian realism against this charge; in the process, I hope to shed some light on the conditions of success for a reduction. In §1 I detail the resources the Lewisian modal realist needs. In §2 I argue against Lycan and Shalkowski’s charge that Lewis needs a modal notion of ‘world’ to ensure that worlds correspond to possibilities. In §3 I respond to Divers and Melia’s objection that Lewis needs to invoke primitive modality to give a complete account of what worlds there are. In §4 I ask what it is for a notion to ‘involve’ modality. I conclude that the question is either in bad standing or at best offers little traction on the debate, and propose a different way of assessing when materials are appropriately included in a reductive base.
In this paper I discuss the claim (advanced in various ways by Joseph Levine, Frank Jackson and David Chalmers) that the successful reduction of qualitative to physical states requires some sort of intelligible connection between our qualitative and physical concepts, which in turn requires a conceptual analysis of our qualitative concepts in causal-functional terms. While I defend this claim against some of its recent critics, I ultimately dispute it, and propose a different way to get the requisite intelligible connection between qualitative and physical concepts.
My aim here is threefold: (a) to show that conceptual facts play a more significant role in justifying explanatory reductions than most of the contributors to the current debate realize; (b) to furnish an account of that role, and (c) to trace the consequences of this account for conceivability arguments about the mind.
Although it’s sometimes thought that pluralism about truth is unstable---or, worse, just a non-starter---it’s surprisingly difficult to locate collapsing arguments that conclusively demonstrate either its instability or its inability to get started. This paper exemplifies the point by examining three recent arguments to that effect. However, it ends with a cautionary tale; for pluralism may not be any better off than other traditional theories that face various technical objections, and may be worse off in facing them all.
The paper motivates a novel research programme in the philosophy of action parallel to the ‘Knowledge First’ programme in epistemology. It is argued that much of the ground for abandoning the quest for a reductive analysis of knowledge in favour of the Knowledge First alternative is mirrored in the case of intentional action, inviting the hypothesis that intentional action is also, like knowledge, metaphysically basic. The paper goes on to demonstrate the sort of explanatory contribution that intentional action can make once it is no longer taken to be a target for reductive analysis, in explaining other, non-intentional kinds of action and voluntariness.
Some theorists who emphasize the complexity of biological and cognitive systems and who advocate the employment of the tools of dynamical systems theory in explaining them construe complexity and reduction as exclusive alternatives. This paper argues that reduction, an approach to explanation that decomposes complex activities and localizes the components within the complex system, is not only compatible with an emphasis on complexity, but provides the foundation for dynamical analysis. Explanation via decomposition and localization is nonetheless extremely challenging, and an analysis of recent cognitive neuroscience research on memory is used to illustrate what is involved. Memory researchers are split between advocating memory systems and advocating memory processes, and I argue that it is the latter approach that provides the critical sort of decomposition and localization for explaining memory. The challenges of linking distinguishable functions with brain processes are illustrated by two examples: competing hypotheses about the contribution of the hippocampus and competing attempts to link areas in frontal cortex with memory processing.
I present and defend a unified, non-reductive analysis of the a priori and a posteriori. It is a mistake to remove all epistemic conditions from the analysis of the a priori (as, for example, Alvin Goldman has recently suggested doing). We can keep epistemic conditions (like unrevisability) in the analysis as long as we insist that a priori and a posteriori justification admit of degrees. I recommend making the degree to which a belief's justification is a priori or a posteriori solely dependent on the revisability relations that obtain among the faculties that deliver the belief and all other faculties.
A detailed analysis of Quine's paper on ontological reduction shows that the proxy-function requirement, in his characterization of the concept of ontological reduction, is superfluous for blocking Pythagoreism and inappropriate for a general blockade of ontological monism.
Three distinctly different interpretations of Aristotle's notion of a sullogismos in Prior Analytics can be traced: (1) a valid or invalid premise-conclusion argument, (2) a single, logically true conditional proposition, and (3) a cogent argumentation or deduction. Remarkably, the three interpretations hold similar notions about the logical relationships among the sullogismoi. This is most apparent in their conflating three processes that Aristotle especially distinguishes: completion (A4-6), reduction (A7), and analysis (A45). Interpretive problems result from not sufficiently recognizing Aristotle's remarkable degree of metalogical sophistication in distinguishing logical syntax from semantics and, thus, also from not grasping his refinement of the deduction system of his underlying logic. While it is obvious that Aristotle most often uses 'sullogismos' to denote a valid argument of a certain kind, we show that at Prior Analytics A4-6, 7, 45 Aristotle specifically treats a sullogismos as an elemental argument pattern having only valid instances and that such a pattern then serves as a rule of deduction in his syllogistic logic. By extracting Aristotle's understanding of three proof-theoretic processes, this paper provides new insight into what Aristotle thinks reasoning syllogistically is and, moreover, it resolves three problems in the most recent interpretation that takes a sullogismos to be a deduction.
Russell’s theory of descriptions in “On Denoting” has long been hailed as a paradigm of the sort of analysis that is constitutive of philosophical understanding. It is not the only model of logical analysis available to us, however. On Frege’s quite different view, analysis provides not a reduction of some problematic notion to other, unproblematic ones -- as Russell’s analysis does -- but instead a deeper, clearer articulation of the very notion with which we began. This difference, I suggest, is grounded in their two very different conceptions of the nature of language and thought; and it grounds in turn two very different conceptions of the nature of philosophical understanding.
This paper examines a paradigm case of allegedly successful reductive explanation, viz. the explanation of the fact that water boils at 100°C based on facts about H2O. The case figures prominently in Joseph Levine’s explanatory gap argument against physicalism. The paper studies the way the argument evolved in the writings of Levine, focusing especially on the question of how the reductive explanation of boiling water figures in the argument. It will turn out that there are two versions of the explanatory gap argument to be found in Levine’s writings. The earlier version relies heavily on conceptual analysis and construes reductive explanation as a process of deduction. The later version makes do without conceptual analysis and understands reductive explanations as based on theoretic reductions that are justified by explanatory power. Along the way it will be shown that bridge principles — which are neglected in the explanatory gap literature — play a crucial role in the explanatory gap argument.
The inapplicability of variations on theory reduction in the context of genetics and their irrelevance to ongoing research has led to an anti-reductionist consensus in philosophy of biology. One response to this situation is to focus on forms of reductive explanation that better correspond to actual scientific reasoning (e.g. part–whole relations). Working from this perspective, we explore three different aspects (intrinsicality, fundamentality, and temporality) that arise from distinct facets of reductive explanation: composition and causation. Concentrating on these aspects generates new forms of reductive explanation and conditions for their success or failure in biology and other sciences. This analysis is illustrated using the case of protein folding in molecular biology, which demonstrates its applicability and relevance, as well as illuminating the complexity of reductive reasoning in a specific biological context.
Metaphor is commonly taken to be an elliptical simile. This article offers a rational reconstruction of two types of simile theories of metaphor: reductive and non-reductive. Careful analysis shows the differences between these theories, but in the end, neither does the explanatory work it sets out to do. In assimilating metaphor to simile and simile to literal comparison, the reductive simile theory obscures what is most important to an account of metaphor: an account of what it is to interpret a bit of discourse metaphorically. The reductive simile theory fails because the reduction to literal comparison fails. The non-reductive simile theory faces most of the same problems that undermine the reductive simile theory, particularly problems generating a corresponding simile when the metaphor contains quantificational terms and other linguistic complexities. Analysis of the non-reductive simile theory highlights the troublesome assumption (inherent in all simile theories) that metaphor and simile play the same linguistic role: making or prompting us to make comparisons. In both guises, the simile theory mistakes the task of explaining what a metaphor means for the task of explaining how metaphor (in general) has meaning; it confuses explanation with explication. In diagnosing and arguing against the simile theory, this article sets out a framework for understanding the difference between literal and metaphorical interpretation.
It is often claimed (1) that levels of nature are related by supervenience, and (2) that processes occurring at particular levels of nature should be studied using dynamical systems theory. However, there has been little consideration of how these claims are related. To address the issue, I show how supervenience relations give rise to ‘supervenience functions’, and use these functions to show how dynamical systems at different levels are related to one another. I then use this analysis to describe a graded approach to non-reductive physicalism, and to critically assess Davidson’s arguments for psychological anomaly. I also show how this approach can inform empirical research in cognitive science.
One of the most powerful tools in science is the analytic method, whereby we seek to understand complex systems by studying simpler sub-systems from which the complex is composed. If this method is to be successful, something about the sub-systems must remain invariant as we move from the relatively isolated conditions in which we study them, to the complex conditions in which we want to put our knowledge to use. This paper asks what this invariant could be. The paper shows that the kinds of thing that a Humean might point to – behaviour, laws, and dispositions – cannot play the role required of the invariant in question. Nor, indeed, can non-Humean causal powers of the kind advocated by contemporary metaphysicians such as Ellis and Lierse. The paper suggests that the analytic method presupposes a kind of entity that does not appear in standard ontologies – a metaphysically substantial notion of causal influence. This notion of causal influence is one that Cartwright has also seen the need for, though she does not seem to take the notion as seriously as she should.
I examine the relationship between complete analysis and clarificatory analysis and explain why Wittgenstein thought he required both in his account of how to solve the problems of philosophy. I first describe Wittgenstein’s view of how philosophical confusions arise, by explaining how it is possible to misunderstand the logic of everyday language. I argue that any method of logical analysis in the Tractatus will inevitably be circular, but explain why this does not threaten the prospect of solving philosophical problems. I distinguish between complete and clarificatory analysis and argue that Wittgenstein’s ‘strictly correct’ philosophical method is clarificatory analysis. Finally I discuss the relationship between the two forms of analysis and claim that, although, at the time of writing the Tractatus, Wittgenstein believed that the possibility of complete analysis underpins clarificatory analysis, in fact this was a mistake. In the Philosophical Investigations complete analysis is rejected and clarificatory analysis is retained.
Conceptual analysis is undergoing a revival in philosophy, and much of the credit goes to Frank Jackson. Jackson argues that conceptual analysis is needed as an integral component of so-called serious metaphysics and that it also does explanatory work in accounting for such phenomena as categorization, meaning change, communication, and linguistic understanding. He even goes so far as to argue that opponents of conceptual analysis are implicitly committed to it in practice. We show that he is wrong on all of these points and that his case for conceptual analysis does not succeed.
It would be nice if good old a priori conceptual analysis were possible. For many years conceptual analysis was out of fashion, in large part because of the excessive ambitions of verificationist theories of meaning. However, those days are over. A priori conceptual analysis is once again part of the philosophical mainstream. This renewed popularity, moreover, is well-founded. Modern philosophical analysts have exploited developments in philosophical semantics to formulate analyses which avoid the counterintuitive consequences of verificationism, while vindicating our ability to know a priori precisely what it is our words and thoughts represent. Despite its apparent promise, however, I.
I start by reconsidering two familiar arguments against modal realism. The argument from epistemology relates to the issue of whether we can infer the existence of concrete objects by a priori means. The argument from pragmatics purports to refute the analogy between the indispensability of possible worlds and the indispensability of unobserved entities in physical science and of numbers in mathematics. Then I present two novel objections. One focusses on the obscurity of the notion of isolation required by modal realism. The other stresses the arbitrary nature of the rules governing the behaviour of Lewisean universes. All four objections attack the reductive analysis of modality that is supposed to be the chief merit of modal realism.
It is easy to become battle-weary in metaphysics. In the face of seemingly unresolvable disputes and unanswerable questions, it is tempting to cast aside one’s sword, proclaiming: “there is no fact of the matter who is right!” Sometimes that is the right thing to do. As a case study, consider the search for the criterion of personal identity over time. I say there is no fact of the matter whether the correct criterion is bodily or psychological continuity. There exist two candidate meanings for talk of persisting persons, one corresponding to each criterion, and there is simply no fact of the matter which candidate we mean. An argument schema for this sort of “no fact of the matter” thesis will be constructed. An instance of the schema will be defended in the case of personal identity. But scrutiny of this instance will reveal limits of the schema. Questions not settled by conceptual analysis—in particular, some very difficult questions of fundamental ontology—have answers. So do certain questions that can be settled by conceptual analysis, namely those that would be answered definitively by ideal philosophical inquiry. Whether there is a fact of the matter is not easily ascertained merely by looking to see whether disputes seem unresolvable or questions unanswerable: sometimes the truth is out there, however hard (or even impossible) it may be to discover.
In this paper, I explore the implications of recent empirical research on concept representation for the philosophical enterprise of conceptual analysis. I argue that conceptual analysis, as it is commonly practiced, is committed to certain assumptions about the nature of our intuitive categorization judgments. I then try to show how these assumptions clash with contemporary accounts of concept representation in cognitive psychology. After entertaining an objection to my argument, I close by considering ways in which conceptual analysis might be altered to accord better with the empirical work.
This book provides a concise overview, with excellent historical and systematic coverage, of the problems of the philosophy of language in the analytic tradition. Howard Callaway explains and explores the relation of language to the philosophy of mind and culture, to the theory of knowledge, and to ontology. He places the question of linguistic meaning at the center of his investigations. The teachings of authors who have become classics in the field, including Frege, Russell, Carnap, Quine, Davidson, and Putnam are critically analyzed. I share completely his conviction that contemporary Anglo-American philosophy follows the spirit of the enlightenment in insisting on intellectual sincerity, clarity, and the willingness to meet scientific doubts or objections openly. --Professor Henri Lauener, Editor of Dialectica.
Semantic externalism about a class of expressions is often thought to make conceptual analysis about members of that class impossible. In particular, since externalism about natural kind terms makes the essences of natural kinds empirically discoverable, it seems that mere reflection on one's natural kind concept will not be able to tell one anything substantial about what it is for something to fall under one's natural kind concepts. Many hold the further view that one cannot even know anything substantial about the reference-fixers of one's natural kind concepts by armchair reflection. In this paper I want to question this latter view and claim that, because of the way our standard methodology of doing theories of reference relies on semantic intuitions, typical externalists in fact presuppose that one can know the reference-fixers of one's natural kind concepts by mere armchair reflection. The more interesting question is how substantial such knowledge can be. I also take some steps toward answering this question.
Some claim that Non-reductive Physicalism (NRP) is an unstable position, on the grounds that NRP either collapses into reductive physicalism (contra Non-reduction), or expands into emergentism of a robust or ‘strong’ variety (contra Physicalism). I argue that this claim is unfounded, by attention to the notion of a degree of freedom—roughly, an independent parameter needed to characterize an entity as being in a state functionally relevant to its law-governed properties and behavior. I start by distinguishing three relations that may hold between the degrees of freedom needed to characterize certain special science entities, and those needed to characterize (systems consisting of) their composing physical (or physically acceptable) entities; these correspond to what I call ‘reductions’, ‘restrictions’, and ‘eliminations’ in degrees of freedom. I then argue that eliminations in degrees of freedom, in particular—when strictly fewer degrees of freedom are required to characterize certain special science entities than are required to characterize (systems consisting of) their composing physical (or physically acceptable) entities—provide a basis for making sense of how certain special science entities can be both physically acceptable and ontologically irreducible to physical entities.
I argue that an adequate account of non-reductive realization must guarantee satisfaction of a certain condition on the token causal powers associated with (instances of) realized and realizing entities---namely, what I call the 'Subset Condition on Causal Powers' (first introduced in Wilson 1999). In terms of states, the condition requires that the token powers had by a realized state on a given occasion be a proper subset of the token powers had by the state that realizes it on that occasion. Accounts of non-reductive realization conforming to this condition are implementing what I call 'the powers-based subset strategy'. I focus on the crucial case involving mental and brain states; the results may be generalized, as appropriate. I first situate and motivate the strategy by attention to the problem of mental causation; I make the case, in schematic terms, that implementation of the strategy makes room (contra Kim 1989, 1993, 1998, and elsewhere) for mental states to be ontologically and causally autonomous from their realizing physical states, without inducing problematic causal overdetermination, and compatible with both Physicalism and Non-reduction; and I show that several contemporary accounts of non-reductive realization (in terms of functional realization, parthood, and the determinable/determinate relation) are plausibly seen as implementing the strategy. As I also show, implementation of the powers-based strategy does not require endorsement of any particular accounts of either properties or causation---indeed, a categoricalist contingentist Humean can implement the strategy. The schematic location of the strategy in the space of available responses to the problem of mental (more generally, higher-level) causation, as well as the fact that the schema may be metaphysically instantiated, strongly suggests that the strategy is, appropriately generalized and instantiated, sufficient and moreover necessary for non-reductive realization.
I go on to defend the sufficiency and necessity claims against a variety of objections, considering, along the way, how the powers-based subset strategy fares against competing accounts of purportedly non-reductive realization in terms of supervenience, token identity, and constitution.
Philosophers expend considerable effort on the analysis of concepts, but the value of such work is not widely appreciated. This paper principally analyses some arguments, beliefs, and presuppositions about the nature of design and the relations between design and science common in the literature to illustrate this point, and to contribute to the foundations of design theory.
In this paper, the relation between identity-based reduction and one specific sort of reductive explanation is considered. The notion of identity-based reduction is spelled out and its role in the reduction debate is sketched. An argument offered by Jaegwon Kim, which is supposed to show that identity-based reduction and reductive explanation are incompatible, is critically examined. From the discussion of this argument, some important consequences about the notion of reduction are pointed out.
This essay concerns the question of how we make genuine epistemic progress through conceptual analysis. Our way into this issue will be through consideration of the paradox of analysis. The paradox challenges us to explain how a given statement can make a substantive contribution to our knowledge, even while it purports merely to make explicit what one’s grasp of the concept under scrutiny consists in. The paradox is often treated primarily as a semantic puzzle. However, in “Sect. 1” I argue that the paradox raises a more fundamental epistemic problem, and in “Sects. 1 and 2” I argue that semantic proposals—even ones designed to capture the Fregean link between meaning and epistemic significance—fail to resolve that problem. Seeing our way towards a real solution to the paradox requires more than semantics; we also need to understand how the process of analysis can yield justification for accepting a candidate conceptual analysis. I present an account of this process, and explain how it resolves the paradox, in “Sect. 3”. I conclude in “Sect. 4” by considering the implications for the present account concerning the goal of conceptual analysis, and by arguing that the apparent scarcity of short and finite illuminating analyses in philosophically interesting cases provides no grounds for pessimism concerning the possibility of philosophical progress through conceptual analysis.
The paper argues that existing interpretations of Kant's Critique of Pure Reason as an "analysis of experience" (e.g., those of Kitcher and Strawson) fail because they do not properly appreciate the method of the work. The author argues that the Critique provides an analysis of the faculty of reason, and counts as an analysis of experience only in a derivative sense.
Conceptual analysis, like any exclusively theoretical activity, is far from overrated in current psychology. This situation can be related both to contingent influences of a contextual and historical character and to more essential metatheoretical reasons. After a short discussion of the latter, it is argued that even within a strictly empirical psychology there are non-trivial tasks that can be assigned to well-defined and methodologically reliable conceptual work. This kind of method, inspired by the ideas of Ludwig Wittgenstein, Peter Strawson (conceptual grammar), and Gilbert Ryle (conceptual geography), is proposed and formally depicted as being holistic, descriptive, and connective. Finally, the newly presented framework of connective conceptual analysis is defended against the “Charge from Psychology,” in a version developed by William Ramsey, claiming that conceptual analysis is based on psychological assumptions that have already been refuted by empirical psychology.
Rational analysis (Anderson 1990, 1991a) is an empirical program of attempting to explain why the cognitive system is adaptive, with respect to its goals and the structure of its environment. We argue that rational analysis has two important implications for philosophical debate concerning rationality. First, rational analysis provides a model for the relationship between formal principles of rationality (such as probability or decision theory) and everyday rationality, in the sense of successful thought and action in daily life. Second, applying the program of rational analysis to research on human reasoning leads to a radical reinterpretation of empirical results which are typically viewed as demonstrating human irrationality.
The main purpose of this article is to undertake a conceptual investigation of the Berlin Wisdom Paradigm: a psychological project initiated by Paul Baltes and intended to study the complex phenomenon of wisdom. Firstly, in order to provide a wider perspective for the subsequent analyses, a short historical sketch is given. Secondly, a meta-theoretical issue of the degree to which the subject matter of the Baltesian study can be identified with traditional philosophical wisdom is addressed. The main result yielded by a careful conceptual analysis is that the philosophical and psychological concepts of wisdom, though not entirely the same, are at least parallel. Finally, one of the revealed aspects of the Berlin Wisdom Paradigm, i.e. its relative neglect of the non-cognitive and personal aspects of wisdom, is brought to the fore. This deficiency, it is suggested, can be remedied by applying the conceptual framework of virtue ethics.
This paper examines the notion that psychology is autonomous. It is argued that we need to distinguish between (a) the question of whether psychological explanations are autonomous, and (b) the question of whether the process of psychological discovery is autonomous. The issue is approached by providing a reinterpretation of Robert Cummins's notion of functional analysis (FA). A distinction is drawn between FA as an explanatory strategy and FA as an investigative strategy. It is argued that the identification of functional components of the cognitive system may draw on knowledge about brain structure, without thereby jeopardizing the explanatory autonomy of psychology.
This paper identifies an overdetermination problem faced by the non-reductive dispositional property account of disposition ascriptions. Two possible responses to the problem are evaluated and both are shown to have serious drawbacks. Finally, it is noted that the traditional conditional analysis of disposition ascriptions escapes the original difficulty.
We consider the question: under what circumstances can the concept of adaptation be applied to groups, rather than individuals? Gardner and Grafen (2009, J. Evol. Biol. 22: 659–671) develop a novel approach to this question, building on Grafen's ‘formal Darwinism’ project, which defines adaptation in terms of links between evolutionary dynamics and optimization. They conclude that only clonal groups, and to a lesser extent groups in which reproductive competition is repressed, can be considered as adaptive units. We re-examine the conditions under which the selection–optimization links hold at the group level. We focus on an important distinction between two ways of understanding the links, which have different implications regarding group adaptationism. We show how the formal Darwinism approach can be reconciled with G.C. Williams’ famous analysis of group adaptation, and we consider the relationships between group adaptation, the Price equation approach to multi-level selection, and the alternative approach based on contextual analysis.
Robert Cummins has recently used the program of Clark Hull to illustrate the effects of logical positivist epistemology upon psychological theory. On Cummins's account, Hull's theory is best understood as a functional analysis, rather than a nomological subsumption. Hull's commitment to the logical positivist view of explanation is said to have blinded him to this aspect of his theory, and thus restricted its scope. We will argue that this interpretation of Hull's epistemology, though common, is mistaken. Hull's epistemological views were developed independently of, and in considerable contrast to, the principles of logical positivism.
This is a volume of specially commissioned essays of analytical philosophy, on topics of current interest in ethics and the philosophy of logic and language. Among the topics discussed are the making of wicked promises, G. E. Moore's early ethical views, as well as indexicals, tense, indeterminism, conventionalism in mathematics, and identity and necessity. The essays are all by former students of Casimir Lewy, until recently Reader in Philosophy at the University of Cambridge and an exponent of a particularly thoroughgoing form of philosophical analysis. Together, they represent some of the best work in these areas at present, and express what may be described as a characteristic 'Cambridge' voice.
Some have attempted to justify benefit/cost analysis by appealing to a moral theory that appears to directly ground the technique. This approach is unsuccessful because the moral theory in question is wildly implausible and, even if it were correct, it would probably not endorse the unrestricted use of benefit/cost analysis. Nevertheless, there is reason to think that a carefully restricted use of benefit/cost analysis will be justifiable from a wide variety of plausible moral perspectives. From this, it is reasonable to conclude that such use of the technique is probably morally justified and should be acceptable to most people.
In recent years, the debate on the problem of causal exclusion has seen an ‘interventionist turn’. Numerous non-reductive physicalists (e.g. Shapiro and Sober 2007) have argued that Woodward's (2003) interventionist theory of causation provides a means to empirically establish the existence of non-reducible mental-to-physical causation. By contrast, Baumgartner (2010) has presented an interventionist exclusion argument showing that interventionism is in fact incompatible with non-reductive physicalism. In response, a number of revised versions of interventionism have been suggested that are compatible with non-reductive physicalism. The first part of this paper reconstructs the definitional details of these modified interventionist theories. The second part investigates whether the modification proposed in Woodward (2011) is not only compatible with, but moreover supports non-reductive physicalism. In particular, it is examined whether that newest variant of interventionism allows for empirically resolving the problem of causal exclusion as envisaged by Shapiro, Sober and others.
Several authors within psychology, neuroscience and philosophy take for granted that standard empirical research techniques are applicable when studying consciousness. In this article, it is discussed whether one of the key methods in cognitive neuroscience – contrastive analysis – suffers from any serious confounding when applied to the field of consciousness studies; that is to say, whether there are any systematic difficulties when studying consciousness with this method that make the results untrustworthy. Through an analysis of theoretical arguments in favour of using contrastive analysis, combined with analyses of empirical findings, I conclude by arguing for three factors that currently confound research using contrastive analysis. These are (1) unconscious processes, (2) introspective reports, and (3) attention.
Instances of explanatory reduction are often advocated on metaphysical grounds: given that the only real things in the world are subatomic particles and their interactions, we have to try to explain everything in terms of the laws of physics. In this paper, we show that explanatory reduction cannot be defended on metaphysical grounds. Nevertheless, indispensability arguments for reductive explanations can be developed, taking into account actual scientific practice and the role of epistemic interests. Reductive explanations might be indispensable for addressing some epistemic interest by answering a specific explanation-seeking question in the most accurate, adequate and efficient way. Just as explanatory pluralists often advocate the indispensability of higher levels of explanation by pointing to the pragmatic value of the explanatory information obtained on these higher levels, we argue that explanatory reduction—traditionally understood as the rival of pluralism—can be defended in a similar way. The pragmatic value that reductionist, lower-level explanations might have in the biomedical sciences and the social sciences is illustrated by some case studies.
The background hypothesis of this essay is that psychological phenomena are typically explained, not by subsuming them under psychological laws, but by functional analysis. Causal subsumption is an appropriate strategy for explaining changes of state, but not for explaining capacities, and it is capacities that are the central explananda of psychology. The contrast between functional analysis and causal subsumption is illustrated, and the background hypothesis supported, by a critical reassessment of the motivational psychology of Clark Hull. I argue that Hull's work makes little sense construed along the subsumptivist lines he advocated himself, but emerges as both interesting and methodologically sound when construed as an exercise in the sort of functional analysis featured in contemporary cognitive science.
Cummins (1982) argues that etiological considerations are not only insufficient but irrelevant for the determination of function. I argue that his claim of irrelevance rests on a misrepresentation of the use of functions in evolutionary explanations. I go on to suggest how accepting an etiological constraint on functional analysis might help resolve some problems involving the use of functional explanations.
Recent encounters with structuralist and poststructuralist critical theory, linguistics, and cognitive sciences have brought the theory and analysis of music into the orbit of important developments in present-day intellectual history. Without seeking to impose an explicit redefinition of either theory or analysis, this book explores the limits of both. Essays on decidability, ambiguity, metaphor, music as text, and music analysis as cognitive theory are complemented by studies of works by Debussy, Schoenberg, Birtwistle and Boulez.
In this paper we study a new approach to classifying mathematical theorems according to their computational content. Basically, we ask which theorems can be continuously or computably transferred into each other. For this purpose theorems are considered via their realizers, which are operations with certain input and output data. The technical tool to express continuous or computable relations between such operations is Weihrauch reducibility and the partially ordered degree structure induced by it. We have identified certain choice principles such as co-finite choice, discrete choice, interval choice, compact choice and closed choice, which are cornerstones among Weihrauch degrees, and it turns out that certain core theorems in analysis can be classified naturally in this structure. In particular, we study theorems such as the Intermediate Value Theorem, the Baire Category Theorem, the Banach Inverse Mapping Theorem, the Closed Graph Theorem and the Uniform Boundedness Theorem. We also explore how existing classifications of the Hahn–Banach Theorem and Weak Kőnig's Lemma fit into this picture. Well-known omniscience principles from constructive mathematics such as LPO and LLPO can also naturally be considered as Weihrauch degrees, and they play an important role in our classification. Based on this, we compare the results of our classification with existing classifications in constructive and reverse mathematics, and we claim that in a certain sense our classification is finer and sheds new light on the computational content of the respective theorems. Our classification scheme does not require any particular logical framework or axiomatic setting, but can be carried out in the framework of classical mathematics using tools of topology, computability theory and computable analysis.
We develop a number of separation techniques based on a new parallelization principle, on certain invariance properties of Weihrauch reducibility, on the Low Basis Theorem of Jockusch and Soare, and on the Baire Category Theorem. Finally, we present a number of metatheorems that allow one to derive upper bounds for the classification of the Weihrauch degree of many theorems, and we discuss the Brouwer Fixed Point Theorem as an example.
Background: The Netherlands is one of the few countries where euthanasia is legal under strict conditions. This study investigates whether Dutch newspaper articles use the term ‘euthanasia’ according to the legal definition and determines what arguments for and against euthanasia they contain. Methods: We did an electronic search of seven Dutch national newspapers between January 2009 and May 2010 and conducted a content analysis. Results: Of the 284 articles containing the term ‘euthanasia’, 24% referred to practices outside the scope of the law, mostly relating to the forgoing of life-prolonging treatments and assistance in suicide by persons other than physicians. Of the articles with euthanasia as the main topic, 36% described euthanasia in the context of a terminally ill patient, 24% for older persons, 16% for persons with dementia, and 9% for persons with a psychiatric disorder. The most frequent arguments for euthanasia included the importance of self-determination and the fact that euthanasia contributes to a good death. The most frequent arguments opposing euthanasia were that suffering should instead be alleviated by better care, that providing euthanasia can be disturbing, and that society should protect the vulnerable. Conclusions: Of the newspaper articles, 24% use the term ‘euthanasia’ for practices that are outside the scope of the euthanasia law. Typically, the more unusual cases are discussed. This might lead to misunderstandings between citizens and physicians. Despite the Dutch legalisation of euthanasia, the debate about its acceptability and boundaries is ongoing and both sides of the debate are clearly represented.
The late scholastics, from the fourteenth to the seventeenth centuries, contributed to many fields of knowledge other than philosophy. They developed a method of conceptual analysis that was very productive in those disciplines in which theory is relatively more important than empirical results. That includes mathematics, where the scholastics developed the analysis of continuous motion, which fed into the calculus, and the theory of risk and probability. The method came to the fore especially in the social sciences. In legal theory they developed, for example, the ethical analyses of the conditions of validity of contracts, and natural rights theory. In political theory, they introduced constitutionalism and the thought experiment of a “state of nature”. Their contributions to economics included concepts still regarded as basic, such as demand, capital, labour, and scarcity. Faculty psychology and semiotics are other areas of significance. In such disciplines, later developments rely crucially on scholastic concepts and vocabulary.
Classical fitting-attitude analyses understand value in terms of its being fitting, or more generally, there being a reason to favour the bearer of value. Recently, such analyses have been interpreted as referring to two reason-notions rather than to only one. The idea is that the properties of the object provide reason not only for a certain kind of favouring(s) vis-à-vis the object, but the very same properties should also figure in the intentional content of the favouring; the agent should favour the object on account of those properties that provide reason for favouring the object in the first place. While this expansion of the original proposal might seem intuitive given that favourings are discerning attitudes, it is nonetheless argued that proponents of the fitting-attitude analysis are in fact not served by such an expansion of the classical analysis. The objections raised here are relevant not only for advocates and critics of fitting-attitude analyses, but for anyone interested in the relation between normative reasons and motivation.
Ranking systems such as The Times Higher Education’s World University Rankings and Shanghai Jiao Tong University’s Academic Ranking of World Universities simultaneously mark global status and stimulate global academic competition. As international ranking systems have become more prominent, researchers have begun to examine whether global rankings are creating increased inequality within and between universities. Using a panel Tobit regression analysis, this study assesses the extent to which markers of inter-institutional stratification and organizational segmentation predict global status among US research universities as measured by position in the ARWU. Findings provide some support for the claim that both inter-institutional stratification and organizational segmentation predict global status.
This paper is based on a doctoral thesis which investigated whether the use of strategic vagueness in Security Council resolutions relating to Iraq contributed to the outbreak of the 2002–2003 Gulf War rather than a diplomatic solution of the controversies. This work contains a comparative linguistic and legal analysis of UN and US documents and their drafts, in order to demonstrate how vagueness was deliberately added to the final versions of the documents before they were passed, and thus how strategically used vagueness played a crucial role in UN resolutions related to the outbreak of war in Iraq, and in the relevant legislation produced by the United States for its Congressional authorisation for war. The comparative analysis between S/RES/1441 (2002) and US legislation has shown that there would have been diplomatic solutions to the Iraq crises which were not synonymous with light-handed intervention against Iraq, but deliberately vague UN wording allowed the US to build its own legislation with an interpretation implying that the UN did not impede military action.
This paper introduces current acoustic theories relating to the phenomenology of sound as a framework for interrogating concepts relating to the ecologies of acoustic and landscape phenomena in a Japanese stroll garden. By applying the technique of Formal Concept Analysis, a partially ordered lattice of garden objects and attributes is visualized as a means to investigate the relationship between elements of the taxonomy.
This is a contribution to the construction of a research roadmap for future cognitive systems, including intelligent robots, in the context of the euCognition network and UKCRC Grand Challenge 5: Architecture of Brain and Mind. A meeting on the euCognition roadmap project was held at Munich Airport on 11th Jan 2007; this document was in part a response to discussions at that meeting. An explanation of why specifying requirements is a hard problem, and why it needs to be done, along with some suggestions for making progress, can be found in this presentation: http://www.cs.bham.ac.uk/research/projects/cosy/papers/#pr0701 "What's a Research Roadmap For? Why do we need one? How can we produce one?" Working on that presentation made me realise that certain deceptively familiar words and phrases frequently used in this context (e.g. "robust", "flexible", "autonomous") appear not to need explanation because everyone understands them, whereas in fact they have obscure semantics that need to be elucidated. Only then can we understand what the implications are for research targets. In particular, they need explanation and analysis if they are to be used to specify requirements and research goals, especially for publicly funded projects. First draft analyses are presented here. In the long term I would like to expand and clarify those analyses, and to provide many different examples to illustrate the points made. This will probably have to be a collaborative research activity.
Confronted with the , several proponents of the fitting attitude analysis of emotional values have argued in favor of an epistemic approach. In such a view, an emotion fits its object because the emotion is correct. However, I argue that we should reorient our search towards a practical approach because only practical considerations can provide a satisfying explanation of the fittingness of emotional responses. This practical approach is partially revisionist, particularly because it is no longer an analysis of final value and because it is relativistic.
The development of culture-independent strategies to study microbial diversity and function has led to a revolution in microbial ecology, enabling us to address fundamental questions about the distribution of microbes and their influence on Earth’s biogeochemical cycles. This article discusses some of the progress that scientists have made with the use of so-called “omic” techniques (metagenomics, metatranscriptomics, and metaproteomics) and the limitations and major challenges these approaches are currently facing. These “omic” methods have been used to describe the taxonomic structure of microbial communities in different environments and to discover new genes and enzymes of industrial and medical interest. However, microbial community structure varies at different spatial and temporal scales, and none of the “omic” techniques is individually able to elucidate the complex aspects of microbial communities and ecosystems. In this article we highlight the importance of a spatiotemporal sampling design, together with a multilevel “omic” approach and a community analysis strategy (association networks and modeling), to examine and predict interacting microbial communities and their impact on the environment.
The two principal models of design in methodological circles in architecture—analysis/synthesis and conjecture/analysis—have their roots in philosophy of science, in different conceptions of scientific method. This paper explores the philosophical origins of these models and the reasons for rejecting analysis/synthesis in favour of conjecture/analysis, the latter being derived from Karl Popper’s view of scientific method. I discuss a fundamental problem with Popper’s view, however, and indicate a framework for conjecture/analysis to avoid this problem.
The work of Bertrand Russell had a decisive influence on the emergence of analytic philosophy, and on its subsequent development. The essays collected in this volume, by one of the leading authorities on Russell's philosophy, all aim at recapturing and articulating aspects of Russell's philosophical vision during his most influential and important period, the two decades following his break with Idealism in 1899. One theme of the collection concerns Russell's views about propositions and their analysis, and the relation of those ideas to his rejection of Idealism. Another theme is the development of Russell's logicism, culminating in Whitehead's and Russell's Principia Mathematica, and Hylton offers a revealing view of the conception of logic which underlies it. Here again there is an emphasis on Russell's argument against Idealism, on the idea that his logicism was a crucial part of that argument. A further focus of the volume is Russell's views about functions and propositional functions. This theme is part of a contrast that Hylton draws between Russell's general philosophical position and that of Frege; in particular, there is a close parallel with the quite different views that the two philosophers held about the nature of philosophical analysis. Hylton also sheds valuable light on the much-disputed idea of an operation, which Wittgenstein advances in the Tractatus Logico-Philosophicus.
Do context and context-dependence belong to the research agenda of semantics - and, specifically, of formal semantics? Not so long ago many linguists and philosophers would probably have given a negative answer to the question. However, recent developments in formal semantics have indicated that analyzing natural language semantics without a thorough accommodation of context-dependence is next to impossible. The classification of the ways in which context and context-dependence enter semantic analysis, though, is still a matter of much controversy, and some of these disputes are ventilated in the present collection. This book is not only a collection of papers addressing context-dependence and methods for dealing with it: it also records comments on the papers and the authors' replies to the comments. In this way, the contributions themselves are contextually dependent. In view of the fact that the contributors to the volume are such key figures in contemporary formal semantics as Hans Kamp, Barbara Partee, Reinhard Muskens, Nicholas Asher, Manfred Krifka, Jaroslav Peregrin and many others, the book represents a quite unique inquiry into the current activities on the semantics side of the semantics/pragmatics boundary.
This is the first collection to bring together well-known scholars writing from feminist perspectives within critical discourse analysis. The theoretical structure of CDA is illustrated with empirical research in Eastern and Western Europe, New Zealand, Asia, South America and the US, demonstrating the complex workings of power and ideology in discourse in sustaining particular gender(ed) orders. These studies deal with texts and talk in domains ranging from parliamentary settings, news and advertising media, the classroom, community literacy programs and the workplace.
This fundamental and straightforward text addresses a weakness observed among present-day students, namely a lack of familiarity with formal proof. Beginning with the idea of mathematical proof and the need for it, associated technical and logical skills are developed with care and then brought to bear on the core material of analysis in such a lucid presentation that the development reads naturally and in a straightforward progression. Retaining the core text, the second edition has additional worked examples, which users have indicated a need for, in addition to more emphasis on how analysis can be used to assess the accuracy of the approximations to the quantities of interest which arise in analytical limits.
A systematic rhetorical analysis may reveal elements of multimodal argumentative discourse that would otherwise remain hidden. In this article, we present both the basics of the method we have developed to integrate theories about different modalities into one parallel processing framework for rhetorical analysis and the results of its application to an intriguing ad.
Most philosophical accounts of emergence are incompatible with reduction. Most scientists regard a system property as emergent relative to properties of the system's parts if it depends upon their mode of organization--a view consistent with reduction. Emergence can be analyzed as a failure of aggregativity--a state in which "the whole is nothing more than the sum of its parts." Aggregativity requires four conditions, giving tools for analyzing modes of organization. Differently met for different decompositions of the system, and in different degrees, these conditions provide powerful evaluation criteria for choosing decompositions, and heuristics for detecting biases of vulgar reductionisms. This analysis of emergence is compatible with reduction.
Since the mid-90s dispositionalism, the view that dispositions are irreducible, real properties, has gained strength due to forceful counterexamples (finks and antidotes) that could be launched against Humean anti-dispositionalist attempts to reductively analyse dispositional predicates. In the light of these anti-Humean successes, and in combination with ideas surrounding metaphysical necessity put forward by Kripke and Putnam, some dispositionalists felt encouraged to propose a strong anti-Humean view under the name of “Dispositional Essentialism”. In this paper, I show that, ironically, the counterexamples dispositionalists have used against the Humean reductive analysis of dispositional predicates also prove to be problems for a strong form of dispositional essentialism that assimilates dispositionality and metaphysical necessity. Help comes from an unlikely ally—Carnapian reduction sentences—but the alliance is not unproblematic.
The explanatory gap. Consciousness is a mystery. No one has ever given an account, even a highly speculative, hypothetical, and incomplete account, of how a physical thing could have phenomenal states (Nagel 1974; Levine 1983). Suppose that consciousness is identical to a property of the brain, say activity in the pyramidal cells of layer 5 of the cortex involving reverberatory circuits from cortical layer 6 to the thalamus and back to layers 4 and 6, as Crick and Koch have suggested for visual consciousness (see Crick 1994). Still, that identity itself calls out for explanation! Proponents of an explanatory gap disagree about whether the gap is permanent. Some (e.g. Nagel 1974) say that we are like the scientifically naive person who is told that matter = energy, but does not have the concepts required to make sense of the idea. If we can acquire these concepts, the gap is closable. Others say the gap is unclosable because of our cognitive limitations (McGinn 1991). Still others say that the gap is a consequence of the fundamental nature of consciousness.
The following is a transcript of the interview I (Yasuko Kitano) conducted with Neil Levy (The Centre for Applied Philosophy and Public Ethics, CAPPE) on 23 July 2009, while he was in Tokyo to give a series of lectures on neuroethics at The University of Tokyo Center for Philosophy. I edited his words for publication with his approval.
Compare two conceptions of validity: under an example of a modal conception, an argument is valid just in case it is impossible for the premises to be true and the conclusion false; under an example of a topic-neutral conception, an argument is valid just in case there are no arguments of the same logical form with true premises and a false conclusion. This taxonomy of positions suggests a project in the philosophy of logic: the reductive analysis of the modal conception of logical consequence to the topic-neutral conception. Such a project would dispel the alleged obscurity of the notion of necessity employed in the modal conception in favour of the clarity of an account of logical consequence given in terms of tractable notions of logical form, universal generalization and truth simpliciter. In a series of publications, John Etchemendy has characterized the model-theoretic definition of logical consequence as truth preservation in all models as intended to provide just such an analysis. In this paper, I will argue that Aristotle intends to provide an account of a modal conception of logical consequence in topic-neutral terms and so is engaged in a project comparable to the one described above. That Aristotle would be engaged in this sort of project is controversial. Under the standard reading of the Prior Analytics, Aristotle does not and cannot provide an account of logical consequence. Rather, he must take the validity of the first figure syllogisms (such as the syllogism known by its medieval mnemonic ‘Barbara’: A belongs to all B; B belongs to all C; so A belongs to all C) as obvious and not needing justification; he then establishes the validity of the other syllogisms by showing that they stand in a suitable relation to the first figure syllogisms. I will argue that Aristotle does attempt to provide an account of logical consequence—namely, by appeal to certain mereological theorems.
For example, he defends the status of Barbara as a syllogism by appeal to the transitivity of mereological containment. There are, as I will discuss, reasons to doubt the success of this account. But the attempt is not implausible given certain theses Aristotle holds in semantics, mereology and the theory of relations.
A reductive analysis of a concept decomposes it into more basic constituent parts. Metaethicists today are in almost unanimous agreement that normative language and concepts cannot be reductively analyzed into entirely nonnormative language and concepts. Basic normative concepts are widely thought to be primitive or elemental in our thought, and therefore to admit of no further (reductive) explanation. G. E. Moore inferred from the unanalyzability of normative concepts the metaphysical doctrine that basic normative properties and relations are irreducible to complexes of entirely nonnormative properties and relations; they are metaphysical primitives or elements that cannot be further explained. On this nonreductive view, now dominant again,¹ normativity enters our world, experience, and thought only by virtue of some elemental essence that…
In §§2–4, I survey three extant ways of making sense of indeterminate truth and find each of them wanting. All the later sections of the paper are concerned with showing that the most promising way of making sense of indeterminate truth is either via a theory of truthmaker gaps or via a theory of truthmaking gaps. The first intimations of a truthmaker–truthmaking gap theory of indeterminacy are to be found in Quine (1981). In §5, we see how Quine proposes to solve Unger’s problem of the many by positing the possibility of groundless truth. In §6, I elaborate the truthmaker gap model of indeterminacy first sketched by Sorensen (2001, ch. 11) and use it to give a reductive analysis of indeterminate truth. In §7, I briefly assess what kind of formal framework can best express the possibility of truthmaker gaps. In §8, I contrast what I dub ‘the ordinary conception of worldly indeterminacy’ with Williamson’s conception of worldly indeterminacy. In §9, I show how one can distinguish linguistic from worldly indeterminacy on a truthmaker gap conception. In §10, I briefly sketch the relationship between truthmaker gaps and ignorance. In §11, I assess whether a truthmaker gap conception of vagueness is really just a form of epistemicism. In §12, I propose that truthmaker gaps can yield a plausible model of (semantic) presupposition failure. In §13, in response to the worry that a truthmaker gap conception of indeterminacy is both parochial and controversial—since it commits us to an implausibly strong theory of truthmaking—I set forth a truthmaking gap conception of indeterminacy. In §14, I answer the worry that groundless truths, of whatever species, are just unacceptably queer. A key part of this answer is that a truthmaker–truthmaking gap model of indeterminacy turns out to be considerably less queer than any model of indeterminacy which gives up on Tarski’s T-schema for truth (and cognate schemas).
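For reference, the schema invoked in §14 is Tarski's T-schema, standardly stated as:

```latex
% Tarski's T-schema (disquotational schema for truth):
\mathrm{True}(\ulcorner p \urcorner) \;\leftrightarrow\; p
% e.g. 'Snow is white' is true if and only if snow is white.
```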
This article aims to show that fundamentality is construed differently in the two most prominent strategies of analysis we find in physical science and engineering today: (1) atomistic, reductive analysis and (2) systems analysis. Correspondingly, atomism is the conception according to which the simplest (smallest) indivisible entity of a certain kind is most fundamental; while systemism, as will be articulated here, is the conception according to which the bonds that structure wholes are most fundamental, and scale and/or constituting entities are of no significance whatsoever for fundamentality. Accordingly, atomists maintain that the basic entities, the atoms, are fundamental and, together with the "external" interactions among them, are sufficient for illuminating all the features and behaviors of the wholes they constitute; whereas systemists proclaim that it is instead structural qualities of systems, which flow from internal relations among their constituents and translate directly into behaviors, that are fundamental, and by themselves largely (if not entirely) sufficient for illuminating the features and behaviors of the wholes thereby structured. Systemism, as will be argued, is consistent with the nonexistence of a fundamental "level" of nondecomposable entities, just as it is consistent with the existence of such a level. Systemism is nonetheless a conception of the fundamental in quite different, but still ontological, terms. Systemism can serve the special sciences—the social sciences especially—better than the conception of fundamentality in terms of atoms. It is, in fact, a conception of fundamentality with rather different uses and, importantly, different resonances. This conception of fundamentality makes contact with questions pertaining to natural kinds and their place in the metaphysics of the special sciences, within an order of autonomous sciences.
The controversy over fundamentality is evident in the social sciences too, albeit somewhat imperfectly, in the terms of the debate between methodological individualists and functionalists/holists. This article will thus also clarify the difference between systemism and holism.
After more than thirty-five years of debate and discussion, versions of the functionalist theory of mind originating in the work of Hilary Putnam, Jerry Fodor, and David Lewis still remain the most popular positions among philosophers of mind on the nature of mental states and processes. Functionalism has enjoyed such popularity owing, at least in part, to its claim to offer a plausible and compelling description of the nature of the mental that is also consistent with an underlying physicalist or materialist ontology. Yet despite its continued popularity, many philosophers now think that functionalism leaves something out, in particular that functional explanations and analyses fail to account for consciousness, qualia, or phenomenal states of experience or awareness.¹ If the objection is correct, then functionalism fails in its inability…
When we open our eyes, the world seems full of colored opaque objects, light sources, and transparent volumes. One historically popular view, _eliminativism_, is that the world is not in this respect as it appears to be: nothing has any color. Color _realism_, the denial of eliminativism, comes in three mutually exclusive varieties, which may be taken to exhaust the space of plausible realist theories. According to _dispositionalism_, colors are _psychological_ dispositions: dispositions to produce certain kinds of visual experiences. According to both _primitivism_ and _physicalism_, colors are not psychological dispositions; they differ in that primitivism says that no reductive analysis of the colors is possible, whereas physicalism says that they are physical properties. This paper is a defense of physicalism about color.
In this paper, I propose two theses, and then examine what the consequences of those theses are for discussions of reduction and emergence. The first thesis is that what have traditionally been seen as robust reductions of one theory or one branch of science by another, more fundamental one are largely a myth. Although there are such reductions in the physical sciences, they are quite rare, and depend on special requirements. In the biological sciences, these prima facie sweeping reductions fade away, like the body of the famous Cheshire cat, leaving only a smile. … The second thesis is that the “smiles” are fragmentary, patchy explanations, and though patchy and fragmentary, they are very important, potentially Nobel-prize-winning advances. To get the best grasp of these “smiles,” I want to argue that we need to return to the roots of discussions and analyses of scientific explanation more generally, and not focus mainly on reduction models, though three conditions based on earlier reduction models are retained in the present analysis. I briefly review the scientific explanation literature as it relates to reduction, and then offer my account of explanation. The account of scientific explanation I present is one I have discussed before, but in this paper I try to simplify it, and characterize it as involving field elements (FE) and a preferred causal model system (PCMS). In an important sense, this FE and PCMS analysis locates an “explanation” in a typical scientific research article. The account is illustrated using a recent set of neurogenetic papers on two kinds of worm foraging behaviors: solitary and social feeding. One of the preferred model systems from a 2002 Nature article in this set is used to exemplify the FE and PCMS analysis, which is shown to have both reductive and nonreductive aspects.
The paper closes with a brief discussion of how this FE and PCMS approach differs from and is congruent with Bickle’s “ruthless reductionism” and the recently revived mechanistic philosophy of science of Machamer, Darden, and Craver.
This paper discusses the relationship between dispositions and laws and the prospects for any analysis of talk of laws in terms of talk of dispositions. Recent attempts at such a reduction have often been motivated by the desire to give an account of ceteris paribus laws, and in this they have had some success. However, such accounts differ as to whether they view dispositions as properties fundamentally of individuals or of kinds. I argue that if dispositions are properties of individuals, we cannot give a complete account of ceteris paribus laws. Alternatively, if dispositions are properties of kinds, any reductive analysis of laws would require an extension of the notion of the dispositional beyond its usual meaning, so that in effect there can be no reduction of laws to dispositions as traditionally understood. An attempt to reduce the nomological to the dispositional is therefore not the way to provide a unified account of traditional and ceteris paribus laws.
During the past three decades, there has been an ongoing debate on the quality of health care, and defining quality is an important part of it. This paper offers a review of definitions and a conceptual analysis in order to understand and explain the differences between them. The analysis results in a semantic rule, expressing the meaning of quality as an optimal balance between possibilities realised and a framework of norms and values. This rule is postulated as a formal criterion of meaning: when it is correctly applied, people understand each other. The rule suits the abstract nature of the term quality. Quality does not exist as such; it is constructed in an interaction between people, and this interaction is guided by rules in order to transfer information, that is, to communicate about quality. The rule improves our ability to engage in the debate on quality and to develop a theory grounding actions such as quality assurance or quality improvement.
Some of these views are realist: objects like oranges and lemons have the colors we mostly take them to have. Others are eliminativist: oranges and lemons are not colored. The usual kind of realism is reductive: the color properties are identified with properties specified in other terms (as ways of altering light, for instance). If no reductive analysis is available—if the colors are primitive, sui generis properties—this is often taken to be a convincing argument for eliminativism. Realist primitivism, in other words, is usually thought to be untenable.
This Alfred Schutz Memorial Lecture discusses the relationship between the phenomenological life-world analysis and the methodology of the social sciences, which was the central motive of Schutz’s work. I have set two major goals in this lecture. The first is to scrutinize the postulate of adequacy, as this postulate is the most crucial of Schutz’s methodological postulates. Max Weber devised the postulate of ‘adequacy of meaning’ in analogy to the postulate of ‘causal adequacy’ (a concept used in jurisprudence) and regarded both as complementary and, in the context of sociological analysis, critical. Schutz extracted the two postulates from Neo-Kantian epistemology, dismissed the concept of causality, and reduced Weber’s two postulates of adequacy to one, namely the adequacy of meaning. I discuss the benefits and shortcomings of this reduction. A major problem, in my view, is that Schutz’s reformulation lost the empirical concern that was inherent in Weber’s ‘causal adequacy’. As a result, the models of economics (which shaped Schutz’s conception of social science) are considered to be adequate if they are ‘understandable’ to an everyday actor, even when they are based on the most unrealistic assumptions. To recapture Weber’s empirical orientation, I recommend a more restrictive interpretation of the postulate of adequacy that links it to qualitative research and unfolds the critical potential of Schutz’s phenomenological life-world analysis. My second goal is to report on some current developments in German sociology in which a number of approaches explicitly refer to Schutz’s analysis of the life-world and attempt to pursue ‘adequate’ empirical research. This lecture focuses on three approaches: ethnophenomenology, life-world analytic ethnography, and social scientific hermeneutics.
For the claim that the satisfaction of certain conditions is sufficient for the application of some concept to serve as part of the (‘reductive’) analysis of that concept, we require the conditions to be specified without employing that very concept. An account of the application conditions of a concept not meeting this requirement we call analytically circular. For such a claim to be usable in determining the extension of the concept, however, such circularity may not matter, since if the concept figures in a certain kind of intensional context in the specification of the conditions, the satisfaction of those conditions may not itself depend on the extension of the concept. We put this by saying that although analytically circular, the account may yet not be inferentially circular.
The article attempts to show some limitations of reductive accounts, in science and philosophy, of body-mind relations, experience and skill. An extensive literature has developed recently in the analytic philosophy of mind, due to new technology and theories in the neurosciences. In the sporting sciences, there are also attempts to reduce experiences and skills to biology, mechanics, chemistry and physiology. The article argues that there are three fundamental problems for reductive accounts that lead to an explanatory gap between the reduction and the conscious experience. First, reductive accounts deal with objective observations, while conscious experiences are subjective. Second, subjective experience seems difficult to identify with physical events described by chemistry, biology, mechanics or neurophysiology. Finally, sport involves knowing how, and knowing how is difficult to reduce to propositional knowledge, which is what the reductive scientific/philosophical project requires. The article argues that sport provides an excellent platform for better understanding what is wrong with reductive analysis of body-mind relations, since both conscious experience and knowing how are fundamental to sport performance.
Various scientific theories stand in a reductive relation to each other. In a recent article, we have argued that a generalized version of the Nagel-Schaffner model (GNS) is the right account of this relation. In this article, we present a Bayesian analysis of how GNS impacts on confirmation. We formalize the relation between the reducing and the reduced theory before and after the reduction using Bayesian networks, and thereby show that, post-reduction, the two theories are confirmatory of each other. We then ask when a purported reduction should be accepted on epistemic grounds. To do so, we compare the prior and posterior probabilities of the conjunction of both theories before and after the reduction and ask how well each is confirmed by the available evidence.
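The confirmation-theoretic claim can be illustrated with a toy calculation (the two-node structure and all probability values are my own invented illustration, not the GNS Bayesian networks from the paper): once a reduction makes the reduced theory probabilistically dependent on the reducing one, evidence for the reducing theory also raises the probability of the reduced theory.

```python
# Toy illustration of mutual confirmation after a reduction.
# Structure: T1 (reducing theory) -> T2 (reduced theory), T1 -> E (evidence).
# All probability values are invented for illustration only.

def p_t2_given_e(p_t1, p_t2_if_t1, p_t2_if_not_t1, p_e_if_t1, p_e_if_not_t1):
    """P(T2 | E), assuming T2 and E are independent conditional on T1."""
    p_e = p_e_if_t1 * p_t1 + p_e_if_not_t1 * (1 - p_t1)
    p_t1_given_e = p_e_if_t1 * p_t1 / p_e           # Bayes' theorem
    return (p_t2_if_t1 * p_t1_given_e
            + p_t2_if_not_t1 * (1 - p_t1_given_e))  # law of total probability

# Before the reduction: T2 is probabilistically independent of T1,
# so evidence for T1 leaves P(T2) at its prior of 0.5.
before = p_t2_given_e(0.5, 0.5, 0.5, 0.9, 0.2)

# After the reduction: T1 strongly supports T2 (and T1's failure undermines T2),
# so the same evidence now also confirms T2.
after = p_t2_given_e(0.5, 0.95, 0.3, 0.9, 0.2)

print(before, after)  # 'before' stays at the prior; 'after' rises above it
```

The point of the toy is only the comparative fact: the posterior of the reduced theory moves with the evidence exactly when the reduction links the two theories.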
The first paragraph of the article reads: "Classical analysis is concerned neither with cataloguing usage nor with intellectual therapy (except of course by aiming to satisfy curiosity and remove puzzlement). Of recent sorts of analysis, it's the attempt to find the 'logical structure of the world' or the 'logical form' of various facts that chiefly claims our attention. But philosophers in every period have been absorbed by such analysis. Think of the Greek search for real definitions. Or think of metaphysical appearance/reality distinctions, and attempted reductions of appearance to reality: to monads, to spirits and ideas, or to atomic facts."
Lei Zhong (2012, 'Counterfactuals, regularity and the autonomy approach', Analysis 72: 75–85) argues that non-reductive physicalists cannot establish the autonomy of mental causation by adopting a counterfactual theory of causation, since such a theory supports a so-called downward causation argument which rules out mental-to-mental causation. We respond that non-reductive physicalists can consistently resist Zhong's downward causation argument, as it equivocates between two familiar notions of a physical realizer.
Jonardon Ganeri, Paul Noordhof, and Murali Ramachandran (1996) have proposed a new counterfactual analysis of causation. We argue that this – the PCA-analysis – is incorrect. In section 1, we explain David Lewis’s first counterfactual analysis of causation, and a problem that led him to propose a second. In section 2, we explain the PCA-analysis, advertised as an improvement on Lewis’s later account. We then give counterexamples to the necessity (section 3) and sufficiency (section 4) of the PCA-analysis.
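For readers who want the target of section 1 on the page: Lewis's first (1973) analysis, in its standard compressed form (my rendering), defines causal dependence between distinct events by a pair of counterfactuals, and causation as the ancestral of dependence.

```latex
% Causal dependence of event e on event c (Lewis 1973):
O(c) \mathbin{\square\!\!\rightarrow} O(e)
  \quad\text{and}\quad
\neg O(c) \mathbin{\square\!\!\rightarrow} \neg O(e)
% When c and e both actually occur, the first counterfactual holds trivially,
% so dependence amounts to: had c not occurred, e would not have occurred.
% c causes e iff a chain of such dependences runs from c to e.
```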
The method of positron emission tomography (PET imaging) illustrates the circular logic popular in subtractive neuroimaging and linear reductive cognitive psychology. Both require that strictly feed-forward, modular, cognitive components exist, before the fact, to justify the inference of particular components from images (or other observables) after the fact. Both also require a "true" componential theory of cognition and laboratory tasks, before the fact, to guarantee reliable choices for subtractive contrasts. None of these possibilities is likely. Consequently, linear reductive analysis has failed to yield general, reliable, componential accounts.
This paper examines key aspects of Allan Gibbard's psychological account of moral activity. Inspired by evolutionary theory, Gibbard paints a naturalistic picture of morality based mainly on two specific types of emotion: guilt and anger. His sentimentalist and expressivist analysis is also based on a particular conception of rationality. I begin by introducing Gibbard's theory before testing some key assumptions underlying his system against recent empirical data and theories. The results cast doubt on some crucial aspects of Gibbard's philosophical theory, namely his reduction of morality to anger and guilt, and his theory of “normative governance.” Gibbard's particular version of expressivism may be undermined by these doubts.