I examine the relationship between complete analysis and clarificatory analysis and explain why Wittgenstein thought he required both in his account of how to solve the problems of philosophy. I first describe Wittgenstein’s view of how philosophical confusions arise, by explaining how it is possible to misunderstand the logic of everyday language. I argue that any method of logical analysis in the Tractatus will inevitably be circular, but explain why this does not threaten the prospect of solving philosophical problems. I distinguish between complete and clarificatory analysis and argue that Wittgenstein’s ‘strictly correct’ philosophical method is clarificatory analysis. Finally, I discuss the relationship between the two forms of analysis and claim that, although, at the time of writing the Tractatus, Wittgenstein believed that the possibility of complete analysis underpins clarificatory analysis, in fact this was a mistake. In the Philosophical Investigations complete analysis is rejected and clarificatory analysis is retained.
Is conceptual analysis required for reductive explanation? If there is no a priori entailment from microphysical truths to phenomenal truths, does reductive explanation of the phenomenal fail? We say yes (Chalmers 1996; Jackson 1994, 1998). Ned Block and Robert Stalnaker say no (Block and Stalnaker 1999).
Conceptual analysis is undergoing a revival in philosophy, and much of the credit goes to Frank Jackson. Jackson argues that conceptual analysis is needed as an integral component of so-called serious metaphysics and that it also does explanatory work in accounting for such phenomena as categorization, meaning change, communication, and linguistic understanding. He even goes so far as to argue that opponents of conceptual analysis are implicitly committed to it in practice. We show that he is wrong on all of these points and that his case for conceptual analysis does not succeed.
It would be nice if good old a priori conceptual analysis were possible. For many years conceptual analysis was out of fashion, in large part because of the excessive ambitions of verificationist theories of meaning. However, those days are over. A priori conceptual analysis is once again part of the philosophical mainstream. This renewed popularity, moreover, is well-founded. Modern philosophical analysts have exploited developments in philosophical semantics to formulate analyses which avoid the counterintuitive consequences of verificationism, while vindicating our ability to know a priori precisely what it is our words and thoughts represent. Despite its apparent promise, however, I argue that a priori conceptual analysis is not in fact possible.
It is easy to become battle-weary in metaphysics. In the face of seemingly unresolvable disputes and unanswerable questions, it is tempting to cast aside one’s sword, proclaiming: “there is no fact of the matter who is right!” Sometimes that is the right thing to do. As a case study, consider the search for the criterion of personal identity over time. I say there is no fact of the matter whether the correct criterion is bodily or psychological continuity. There exist two candidate meanings for talk of persisting persons, one corresponding to each criterion, and there is simply no fact of the matter which candidate we mean. An argument schema for this sort of “no fact of the matter” thesis will be constructed. An instance of the schema will be defended in the case of personal identity. But scrutiny of this instance will reveal limits of the schema. Questions not settled by conceptual analysis—in particular, some very difficult questions of fundamental ontology—have answers. So do certain questions that can be settled by conceptual analysis, namely those that would be answered definitively by ideal philosophical inquiry. Whether there is a fact of the matter is not easily ascertained merely by looking to see whether disputes seem unresolvable or questions unanswerable: sometimes the truth is out there, however hard (or even impossible) it may be to discover.
In this paper, I explore the implications of recent empirical research on concept representation for the philosophical enterprise of conceptual analysis. I argue that conceptual analysis, as it is commonly practiced, is committed to certain assumptions about the nature of our intuitive categorization judgments. I then try to show how these assumptions clash with contemporary accounts of concept representation in cognitive psychology. After entertaining an objection to my argument, I close by considering ways in which conceptual analysis might be altered to accord better with the empirical work.
This book provides a concise overview, with excellent historical and systematic coverage, of the problems of the philosophy of language in the analytic tradition. Howard Callaway explains and explores the relation of language to the philosophy of mind and culture, to the theory of knowledge, and to ontology. He places the question of linguistic meaning at the center of his investigations. The teachings of authors who have become classics in the field, including Frege, Russell, Carnap, Quine, Davidson, and Putnam, are critically analyzed. I share completely his conviction that contemporary Anglo-American philosophy follows the spirit of the enlightenment in insisting on intellectual sincerity, clarity, and the willingness to meet scientific doubts or objections openly. --Professor Henri Lauener, Editor of Dialectica.
Semantic externalism about a class of expressions is often thought to make conceptual analysis about members of that class impossible. In particular, since externalism about natural kind terms makes the essences of natural kinds empirically discoverable, it seems that mere reflection on one's natural kind concept will not be able to tell one anything substantial about what it is for something to fall under one's natural kind concepts. Many hold the further view that one cannot even know anything substantial about the reference-fixers of one's natural kind concepts by armchair reflection. In this paper I want to question this latter view and claim that, because of the way our standard methodology of doing theories of reference relies on semantic intuitions, typical externalists in fact presuppose that one can know the reference-fixers of one's natural kind concepts by mere armchair reflection. The more interesting question is how substantial such knowledge can be. I also take some steps toward answering this question.
Philosophers expend considerable effort on the analysis of concepts, but the value of such work is not widely appreciated. This paper principally analyses some arguments, beliefs, and presuppositions about the nature of design and the relations between design and science common in the literature to illustrate this point, and to contribute to the foundations of design theory.
In this paper I discuss the claim (advanced in various ways by Joseph Levine, Frank Jackson and David Chalmers) that the successful reduction of qualitative to physical states requires some sort of intelligible connection between our qualitative and physical concepts, which in turn requires a conceptual analysis of our qualitative concepts in causal-functional terms. While I defend this claim against some of its recent critics, I ultimately dispute it, and propose a different way to get the requisite intelligible connection between qualitative and physical concepts.
This essay concerns the question of how we make genuine epistemic progress through conceptual analysis. Our way into this issue will be through consideration of the paradox of analysis. The paradox challenges us to explain how a given statement can make a substantive contribution to our knowledge, even while it purports merely to make explicit what one’s grasp of the concept under scrutiny consists in. The paradox is often treated primarily as a semantic puzzle. However, in “Sect. 1” I argue that the paradox raises a more fundamental epistemic problem, and in “Sects. 1 and 2” I argue that semantic proposals—even ones designed to capture the Fregean link between meaning and epistemic significance—fail to resolve that problem. Seeing our way towards a real solution to the paradox requires more than semantics; we also need to understand how the process of analysis can yield justification for accepting a candidate conceptual analysis. I present an account of this process, and explain how it resolves the paradox, in “Sect. 3”. I conclude in “Sect. 4” by considering the implications of the present account for the goal of conceptual analysis, and by arguing that the apparent scarcity of short and finite illuminating analyses in philosophically interesting cases provides no grounds for pessimism concerning the possibility of philosophical progress through conceptual analysis.
The paper argues that existing interpretations of Kant's Critique of Pure Reason as an "analysis of experience" (e.g., those of Kitcher and Strawson) fail because they do not properly appreciate the method of the work. The author argues that the Critique provides an analysis of the faculty of reason, and counts as an analysis of experience only in a derivative sense.
Conceptual analysis, like any exclusively theoretical activity, is far from being overrated in current psychology. Such a situation can be related both to contingent contextual and historical influences and to more essential metatheoretical reasons. After a short discussion of the latter, it is argued that even within a strictly empirical psychology there are non-trivial tasks that can be attached to well-defined and methodologically reliable conceptual work. This kind of method, inspired by the ideas of Ludwig Wittgenstein, Peter Strawson (conceptual grammar), and Gilbert Ryle (conceptual geography), is proposed and formally depicted as being holistic, descriptive, and connective. Finally, the newly presented framework of connective conceptual analysis is defended against the “Charge from Psychology,” in a version developed by William Ramsey, claiming that conceptual analysis is based on psychological assumptions that have already been refuted by empirical psychology.
The main purpose of this article is to undertake a conceptual investigation of the Berlin Wisdom Paradigm: a psychological project initiated by Paul Baltes and intended to study the complex phenomenon of wisdom. Firstly, in order to provide a wider perspective for the subsequent analyses, a short historical sketch is given. Secondly, the meta-theoretical issue of the degree to which the subject matter of the Baltesian study can be identified with traditional philosophical wisdom is addressed. The main result yielded by a careful conceptual analysis is that the philosophical and psychological concepts of wisdom, though not entirely the same, are at least parallel. Finally, one of the revealed aspects of the Berlin Wisdom Paradigm, i.e., its relative neglect of the non-cognitive and personal aspects of wisdom, is brought to the fore. This deficiency, it is suggested, can be remedied by the application of the virtue ethics' conceptual framework.
Rational analysis (Anderson 1990, 1991a) is an empirical program of attempting to explain why the cognitive system is adaptive, with respect to its goals and the structure of its environment. We argue that rational analysis has two important implications for philosophical debate concerning rationality. First, rational analysis provides a model for the relationship between formal principles of rationality (such as probability or decision theory) and everyday rationality, in the sense of successful thought and action in daily life. Second, applying the program of rational analysis to research on human reasoning leads to a radical reinterpretation of empirical results which are typically viewed as demonstrating human irrationality.
This paper examines the notion that psychology is autonomous. It is argued that we need to distinguish between (a) the question of whether psychological explanations are autonomous, and (b) the question of whether the process of psychological discovery is autonomous. The issue is approached by providing a reinterpretation of Robert Cummins's notion of functional analysis (FA). A distinction is drawn between FA as an explanatory strategy and FA as an investigative strategy. It is argued that the identification of functional components of the cognitive system may draw on knowledge about brain structure, without thereby jeopardizing the explanatory autonomy of psychology.
We consider the question: under what circumstances can the concept of adaptation be applied to groups, rather than individuals? Gardner and Grafen (2009, J. Evol. Biol. 22: 659–671) develop a novel approach to this question, building on Grafen's ‘formal Darwinism’ project, which defines adaptation in terms of links between evolutionary dynamics and optimization. They conclude that only clonal groups, and to a lesser extent groups in which reproductive competition is repressed, can be considered as adaptive units. We re-examine the conditions under which the selection–optimization links hold at the group level. We focus on an important distinction between two ways of understanding the links, which have different implications regarding group adaptationism. We show how the formal Darwinism approach can be reconciled with G.C. Williams’ famous analysis of group adaptation, and we consider the relationships between group adaptation, the Price equation approach to multi-level selection, and the alternative approach based on contextual analysis.
Robert Cummins has recently used the program of Clark Hull to illustrate the effects of logical positivist epistemology upon psychological theory. On Cummins's account, Hull's theory is best understood as a functional analysis, rather than a nomological subsumption. Hull's commitment to the logical positivist view of explanation is said to have blinded him to this aspect of his theory, and thus restricted its scope. We will argue that this interpretation of Hull's epistemology, though common, is mistaken. Hull's epistemological views were developed independently of, and in considerable contrast to, the principles of logical positivism.
This is a volume of specially commissioned essays of analytical philosophy, on topics of current interest in ethics and the philosophy of logic and language. Among the topics discussed are the making of wicked promises, G. E. Moore's early ethical views, as well as indexicals, tense, indeterminism, conventionalism in mathematics, and identity and necessity. The essays are all by former students of Casimir Lewy, until recently Reader in Philosophy at the University of Cambridge and an exponent of a particularly thoroughgoing form of philosophical analysis. Together, they represent some of the best work in these areas at present, and express what may be described as a characteristic 'Cambridge' voice.
Some have attempted to justify benefit/cost analysis by appealing to a moral theory that appears to directly ground the technique. This approach is unsuccessful because the moral theory in question is wildly implausible and, even if it were correct, it would probably not endorse the unrestricted use of benefit/cost analysis. Nevertheless, there is reason to think that a carefully restricted use of benefit/cost analysis will be justifiable from a wide variety of plausible moral perspectives. From this, it is reasonable to conclude that such use of the technique is probably morally justified and should be acceptable to most people.
Several authors within psychology, neuroscience and philosophy take for granted that standard empirical research techniques are applicable when studying consciousness. In this article, it is discussed whether one of the key methods in cognitive neuroscience – the contrastive analysis – suffers from any serious confounding when applied to the field of consciousness studies; that is to say, whether there are any systematic difficulties when studying consciousness with this method that make the results untrustworthy. Through an analysis of theoretical arguments in favour of using contrastive analysis, combined with analyses of empirical findings, I conclude by identifying three factors that currently confound research using contrastive analysis. These are (1) unconscious processes, (2) introspective reports, and (3) attention.
The background hypothesis of this essay is that psychological phenomena are typically explained, not by subsuming them under psychological laws, but by functional analysis. Causal subsumption is an appropriate strategy for explaining changes of state, but not for explaining capacities, and it is capacities that are the central explananda of psychology. The contrast between functional analysis and causal subsumption is illustrated, and the background hypothesis supported, by a critical reassessment of the motivational psychology of Clark Hull. I argue that Hull's work makes little sense construed along the subsumptivist lines he advocated himself, but emerges as both interesting and methodologically sound when construed as an exercise in the sort of functional analysis featured in contemporary cognitive science.
Cummins (1982) argues that etiological considerations are not only insufficient but irrelevant for the determination of function. I argue that his claim of irrelevance rests on a misrepresentation of the use of functions in evolutionary explanations. I go on to suggest how accepting an etiological constraint on functional analysis might help resolve some problems involving the use of functional explanations.
Recent encounters with structuralist and poststructuralist critical theory, linguistics, and cognitive sciences have brought the theory and analysis of music into the orbit of important developments in present-day intellectual history. Without seeking to impose an explicit redefinition of either theory or analysis, this book explores the limits of both. Essays on decidability, ambiguity, metaphor, music as text, and music analysis as cognitive theory are complemented by studies of works by Debussy, Schoenberg, Birtwistle and Boulez.
In this paper we study a new approach to classifying mathematical theorems according to their computational content. Basically, we ask the question: which theorems can be continuously or computably transferred into each other? For this purpose theorems are considered via their realizers, which are operations with certain input and output data. The technical tool to express continuous or computable relations between such operations is Weihrauch reducibility and the partially ordered degree structure induced by it. We have identified certain choice principles such as co-finite choice, discrete choice, interval choice, compact choice and closed choice, which are cornerstones among Weihrauch degrees, and it turns out that certain core theorems in analysis can be classified naturally in this structure. In particular, we study theorems such as the Intermediate Value Theorem, the Baire Category Theorem, the Banach Inverse Mapping Theorem, the Closed Graph Theorem and the Uniform Boundedness Theorem. We also explore how existing classifications of the Hahn–Banach Theorem and Weak Kőnig's Lemma fit into this picture. Well-known omniscience principles from constructive mathematics such as LPO and LLPO can also naturally be considered as Weihrauch degrees, and they play an important role in our classification. Based on this we compare the results of our classification with existing classifications in constructive and reverse mathematics, and we claim that in a certain sense our classification is finer and sheds some new light on the computational content of the respective theorems. Our classification scheme does not require any particular logical framework or axiomatic setting, but can be carried out in the framework of classical mathematics using tools of topology, computability theory and computable analysis.
We develop a number of separation techniques based on a new parallelization principle, on certain invariance properties of Weihrauch reducibility, on the Low Basis Theorem of Jockusch and Soare, and on the Baire Category Theorem. Finally, we present a number of metatheorems that allow us to derive upper bounds for the classification of the Weihrauch degree of many theorems, and we discuss the Brouwer Fixed Point Theorem as an example.
Background: The Netherlands is one of the few countries where euthanasia is legal under strict conditions. This study investigates whether Dutch newspaper articles use the term ‘euthanasia’ according to the legal definition and determines what arguments for and against euthanasia they contain. Methods: We did an electronic search of seven Dutch national newspapers between January 2009 and May 2010 and conducted a content analysis. Results: Of the 284 articles containing the term ‘euthanasia’, 24% referred to practices outside the scope of the law, mostly relating to the forgoing of life-prolonging treatments and assistance in suicide by persons other than physicians. Of the articles with euthanasia as the main topic, 36% described euthanasia in the context of a terminally ill patient, 24% for older persons, 16% for persons with dementia, and 9% for persons with a psychiatric disorder. The most frequent arguments for euthanasia included the importance of self-determination and the fact that euthanasia contributes to a good death. The most frequent arguments opposing euthanasia were that suffering should instead be alleviated by better care, that providing euthanasia can be disturbing, and that society should protect the vulnerable. Conclusions: 24% of the newspaper articles use the term ‘euthanasia’ for practices that are outside the scope of the euthanasia law. Typically, the more unusual cases are discussed. This might lead to misunderstandings between citizens and physicians. Despite the Dutch legalisation of euthanasia, the debate about its acceptability and boundaries is ongoing and both sides of the debate are clearly represented.
Ranking systems such as The Times Higher Education’s World University Rankings and Shanghai Jiao Tong University’s Academic Ranking of World Universities simultaneously mark global status and stimulate global academic competition. As international ranking systems have become more prominent, researchers have begun to examine whether global rankings are creating increased inequality within and between universities. Using a panel Tobit regression analysis, this study assesses the extent to which markers of inter-institutional stratification and organizational segmentation predict global status among US research universities as measured by position in the ARWU. Findings provide some support for the claim that both inter-institutional stratification and organizational segmentation predict global status.
The late scholastics, from the fourteenth to the seventeenth centuries, contributed to many fields of knowledge other than philosophy. They developed a method of conceptual analysis that was very productive in those disciplines in which theory is relatively more important than empirical results. That includes mathematics, where the scholastics developed the analysis of continuous motion, which fed into the calculus, and the theory of risk and probability. The method came to the fore especially in the social sciences. In legal theory they developed, for example, the ethical analyses of the conditions of validity of contracts, and natural rights theory. In political theory, they introduced constitutionalism and the thought experiment of a “state of nature”. Their contributions to economics included concepts still regarded as basic, such as demand, capital, labour, and scarcity. Faculty psychology and semiotics are other areas of significance. In such disciplines, later developments rely crucially on scholastic concepts and vocabulary.
Classical fitting-attitude analyses understand value in terms of its being fitting, or more generally, there being a reason to favour the bearer of value. Recently, such analyses have been interpreted as referring to two reason-notions rather than to only one. The idea is that the properties of the object provide reason not only for a certain kind of favouring(s) vis-à-vis the object, but the very same properties should also figure in the intentional content of the favouring; the agent should favour the object on account of those properties that provide reason for favouring the object in the first place. While this expansion of the original proposal might seem intuitive given that favourings are discerning attitudes, it is nonetheless argued that proponents of the fitting-attitude analysis are in fact not served by such an expansion of the classical analysis. The objections raised here are relevant not only for advocates and critics of fitting-attitude analyses, but for anyone interested in the relation between normative reasons and motivation.
This paper is based on a doctoral thesis which investigated whether the use of strategic vagueness in Security Council resolutions relating to Iraq contributed to the outbreak of the 2002–2003 Gulf War rather than a diplomatic solution of the controversies. This work contains a comparative linguistic and legal analysis of UN and U.S. documents and their drafts in order to demonstrate how vagueness was deliberately added to the final versions of the documents before being passed, and thus how strategically used vagueness played a crucial role in UN resolutions related to the outbreak of war in Iraq, and in relevant legislation produced by the United States for its Congressional authorisation for war. The comparative analysis between S/RES/1441(2002) and US legislation showed that there would have been diplomatic solutions to the Iraq crisis which were not synonymous with light-handed intervention against Iraq, but deliberately vague UN wording allowed the US to build its own legislation with an interpretation implying that the UN did not impede military action.
This paper introduces current acoustic theories relating to the phenomenology of sound as a framework for interrogating concepts relating to the ecologies of acoustic and landscape phenomena in a Japanese stroll garden. By applying the technique of Formal Concept Analysis, a partially ordered lattice of garden objects and attributes is visualized as a means to investigate the relationship between elements of the taxonomy.
This is a contribution to the construction of a research roadmap for future cognitive systems, including intelligent robots, in the context of the euCognition network and UKCRC Grand Challenge 5: Architecture of Brain and Mind. A meeting on the euCognition roadmap project was held at Munich Airport on 11th Jan 2007, and this document was in part a response to discussions at that meeting. An explanation of why specifying requirements is a hard problem, and why it needs to be done, along with some suggestions for making progress, can be found in this presentation: http://www.cs.bham.ac.uk/research/projects/cosy/papers/#pr0701 "What's a Research Roadmap For? Why do we need one? How can we produce one?" Working on that presentation made me realise that certain deceptively familiar words and phrases frequently used in this context (e.g. "robust", "flexible", "autonomous") appear not to need explanation because everyone understands them, whereas in fact they have obscure semantics that needs to be elucidated. Only then can we understand what the implications are for research targets. In particular, they need explanation and analysis if they are to be used to specify requirements and research goals, especially for publicly funded projects. First draft analyses are presented here. In the long term I would like to expand and clarify those analyses, and to provide many different examples to illustrate the points made. This will probably have to be a collaborative research activity.
Several proponents of the fitting attitude analysis of emotional values have argued in favor of an epistemic approach. In such a view, an emotion fits its object because the emotion is correct. However, I argue that we should reorient our search towards a practical approach, because only practical considerations can provide a satisfying explanation of the fittingness of emotional responses. This practical approach is partially revisionist, particularly because it is no longer an analysis of final value and because it is relativistic.
The development of culture-independent strategies to study microbial diversity and function has led to a revolution in microbial ecology, enabling us to address fundamental questions about the distribution of microbes and their influence on Earth’s biogeochemical cycles. This article discusses some of the progress that scientists have made with the use of so-called “omic” techniques (metagenomics, metatranscriptomics, and metaproteomics) and the limitations and major challenges these approaches are currently facing. These ‘omic methods have been used to describe the taxonomic structure of microbial communities in different environments and to discover new genes and enzymes of industrial and medical interest. However, microbial community structure varies in different spatial and temporal scales and none of the ‘omic techniques are individually able to elucidate the complex aspects of microbial communities and ecosystems. In this article we highlight the importance of a spatiotemporal sampling design, together with a multilevel ‘omic approach and a community analysis strategy (association networks and modeling) to examine and predict interacting microbial communities and their impact on the environment.
The two principal models of design in methodological circles in architecture—analysis/synthesis and conjecture/analysis—have their roots in philosophy of science, in different conceptions of scientific method. This paper explores the philosophical origins of these models and the reasons for rejecting analysis/synthesis in favour of conjecture/analysis, the latter being derived from Karl Popper’s view of scientific method. I discuss a fundamental problem with Popper’s view, however, and indicate a framework for conjecture/analysis to avoid this problem.
The work of Bertrand Russell had a decisive influence on the emergence of analytic philosophy, and on its subsequent development. The essays collected in this volume, by one of the leading authorities on Russell's philosophy, all aim at recapturing and articulating aspects of Russell's philosophical vision during his most influential and important period, the two decades following his break with Idealism in 1899. One theme of the collection concerns Russell's views about propositions and their analysis, and the relation of those ideas to his rejection of Idealism. Another theme is the development of Russell's logicism, culminating in Whitehead's and Russell's Principia Mathematica, and Hylton offers a revealing view of the conception of logic which underlies it. Here again there is an emphasis on Russell's argument against Idealism, on the idea that his logicism was a crucial part of that argument. A further focus of the volume is Russell's views about functions and propositional functions. This theme is part of a contrast that Hylton draws between Russell's general philosophical position and that of Frege; in particular, there is a close parallel with the quite different views that the two philosophers held about the nature of philosophical analysis. Hylton also sheds valuable light on the much-disputed idea of an operation, which Wittgenstein advances in the Tractatus Logico-Philosophicus.
Do context and context-dependence belong to the research agenda of semantics - and, specifically, of formal semantics? Not so long ago many linguists and philosophers would probably have given a negative answer to the question. However, recent developments in formal semantics have indicated that analyzing natural language semantics without a thorough accommodation of context-dependence is next to impossible. The classification of the ways in which context and context-dependence enter semantic analysis, though, is still a matter of much controversy, and some of these disputes are ventilated in the present collection. This book is not only a collection of papers addressing context-dependence and methods for dealing with it: it also records comments on the papers and the authors' replies to the comments. In this way, the contributions themselves are contextually dependent. In view of the fact that the contributors to the volume are such key figures in contemporary formal semantics as Hans Kamp, Barbara Partee, Reinhard Muskens, Nicholas Asher, Manfred Krifka, Jaroslav Peregrin and many others, the book represents a quite unique inquiry into the current activities on the semantics side of the semantics/pragmatics boundary.
This is the first collection to bring together well-known scholars writing from feminist perspectives within critical discourse analysis. The theoretical structure of CDA is illustrated with empirical research in Eastern and Western Europe, New Zealand, Asia, South America and the US, demonstrating the complex workings of power and ideology in discourse in sustaining particular gender(ed) orders. These studies deal with texts and talk in domains including parliamentary settings, news and advertising media, the classroom, community literacy programs and the workplace.
This fundamental and straightforward text addresses a weakness observed among present-day students, namely a lack of familiarity with formal proof. Beginning with the idea of mathematical proof and the need for it, associated technical and logical skills are developed with care and then brought to bear on the core material of analysis in such a lucid presentation that the development reads naturally and in a straightforward progression. Retaining the core text, the second edition has additional worked examples which users have indicated a need for, in addition to more emphasis on how analysis can be used to assess the accuracy of approximations to the quantities of interest which arise in analytical limits.
A systematic rhetorical analysis may reveal elements of multimodal argumentative discourse that would otherwise remain hidden. In this article, we present both the basics of the method we have developed to integrate theories about different modalities into one parallel processing framework for rhetorical analysis and the results of its application to an intriguing ad.
The explanatory gap. Consciousness is a mystery. No one has ever given an account, even a highly speculative, hypothetical, and incomplete account, of how a physical thing could have phenomenal states (Nagel, 1974; Levine, 1983). Suppose that consciousness is identical to a property of the brain, say activity in the pyramidal cells of layer 5 of the cortex involving reverberatory circuits from cortical layer 6 to the thalamus and back to layers 4 and 6, as Crick and Koch have suggested for visual consciousness (see Crick, 1994). Still, that identity itself calls out for explanation! Proponents of an explanatory gap disagree about whether the gap is permanent. Some (e.g. Nagel, 1974) say that we are like the scientifically naive person who is told that matter = energy, but does not have the concepts required to make sense of the idea. If we can acquire these concepts, the gap is closable. Others say the gap is unclosable because of our cognitive limitations (McGinn, 1991). Still others say that the gap is a consequence of the fundamental nature of consciousness.
My aim here is threefold: (a) to show that conceptual facts play a more significant role in justifying explanatory reductions than most of the contributors to the current debate realize; (b) to furnish an account of that role; and (c) to trace the consequences of this account for conceivability arguments about the mind.
The following is a transcript of the interview I (Yasuko Kitano) conducted with Neil Levy (The Centre for Applied Philosophy and Public Ethics, CAPPE) on 23 July 2009, while he was in Tokyo to give a series of lectures on neuroethics at The University of Tokyo Center for Philosophy. I edited his words for publication with his approval.
After more than thirty-five years of debate and discussion, versions of the functionalist theory of mind originating in the work of Hilary Putnam, Jerry Fodor, and David Lewis still remain the most popular positions among philosophers of mind on the nature of mental states and processes. Functionalism has enjoyed such popularity owing, at least in part, to its claim to offer a plausible and compelling description of the nature of the mental that is also consistent with an underlying physicalist or materialist ontology. Yet despite its continued popularity, many philosophers now think that functionalism leaves something out, in particular that functional explanations and analyses fail to account for consciousness, qualia, or phenomenal states of experience or awareness. If the objection is correct, then functionalism fails in its inability...
During the past three decades, there has been an ongoing debate on the quality of health care. Defining quality is an important part of it. This paper offers a review of definitions and a conceptual analysis in order to understand and explain the differences between them. The analysis results in a semantic rule, expressing the meaning of quality as an optimal balance between possibilities realised and a framework of norms and values. This rule is postulated as a formal criterion of meaning, i.e. when (correctly) applied, people understand each other. The rule suits the abstract nature of the term quality. Quality doesn't exist as such. It is constructed in an interaction between people. This interaction is guided by rules in order to transfer information, e.g. to communicate about quality. The rule improves our ability to discuss the debate on quality and to develop a theory grounding actions such as quality assurance or quality improvement.
Jonardon Ganeri, Paul Noordhof, and Murali Ramachandran (1996) have proposed a new counterfactual analysis of causation. We argue that this – the PCA-analysis – is incorrect. In section 1, we explain David Lewis’s first counterfactual analysis of causation, and a problem that led him to propose a second. In section 2 we explain the PCA-analysis, advertised as an improvement on Lewis’s later account. We then give counterexamples to the necessity (section 3) and sufficiency (section 4) of the PCA-analysis.
This article discusses the latest developments regarding euthanasia and palliative care in the Netherlands. On the one hand, a legally codified practice of euthanasia has been established. On the other hand, there has been a strong development of palliative care. The combination of these simultaneous processes seems to be rather unique. This contribution first focuses on these remarkable developments. Subsequently, the analysis concentrates on the question of how these new developments have influenced the ethical debate.
The work of Bertrand Russell had a decisive influence on the emergence of analytic philosophy, and on its subsequent development. The prize-winning Russell scholar Peter Hylton presents here some of his most celebrated essays from the last two decades, all of which strive to recapture and articulate Russell's monumental vision. Relating his work to that of other philosophers, particularly Frege and Wittgenstein, and featuring a previously unpublished essay and a helpful new introduction, the volume will be essential for anyone engaged with the history of twentieth-century ideas.
A recently published book, 'The Economics of Health Reconsidered' by Tom Rice, provides a strong critique of the role of markets in health care. Many of the issues of 'market failure' raised by Rice, however, have been, to varying extents, recognised previously in the health economics literature (at least outside the U.S.). What perhaps sets Rice's book apart from previous attempts to document such issues is its elegance and the methodical manner in which this critique is delivered. Significantly, the critique is based solely on conventional economic arguments. There has, however, been an emerging strand of the health economics literature, not acknowledged in Rice's book, which has approached some of these issues of market failure from a different perspective. Notably, this research has involved, in part, borrowing from the ideas and methodological traditions of other disciplines. The emphasis in this work has been to expand the scope and the concerns of economic analysis in health care.
Medical criteria rooted in evidence-based medicine are often seen as a value-neutral ‘trump card’ which puts paid to any further debate about setting priorities for treatment. On this argument, doctors should stop providing treatment at the point when it becomes medically futile, and that is also the threshold at which the health purchaser should stop purchasing. This paper offers three kinds of ethical criteria as a counterweight to analysis based solely on medical criteria. The first set of arguments concerns futility, probability and utility; the second, justice and fairness; the third, consent and competence. The argument is illustrated by two recent case studies about futility and priority-setting: the US example of ‘Baby Ryan’ and the UK case of ‘Child B’.
We construct an algebra of generalized functions endowed with a canonical embedding of the space of Schwartz distributions. We offer a solution to the problem of multiplication of Schwartz distributions similar to but different from Colombeau’s solution. We show that the set of scalars of our algebra is an algebraically closed field, unlike its counterpart in Colombeau theory, which is a ring with zero divisors. We prove a Hahn–Banach extension principle which does not hold in Colombeau theory. We establish a connection between our theory and non-standard analysis and thus answer, although indirectly, a question raised by Colombeau. This article provides a bridge between Colombeau theory of generalized functions and non-standard analysis.
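The multiplication problem the abstract refers to can be made vivid by a standard textbook calculation (an illustration of the Schwartz-style impossibility result, not drawn from the article itself): if the Heaviside function $H$, with distributional derivative $\delta$, could be multiplied so that $H^n = H$ while the Leibniz rule survived, a contradiction would follow.

```latex
% H = Heaviside step function, \delta = H' (distributional derivative).
% Assume H^n = H for every n and differentiate by the Leibniz rule:
H^2 = H \;\Rightarrow\; 2H\delta = \delta \;\Rightarrow\; H\delta = \tfrac{1}{2}\delta,
\qquad
H^3 = H \;\Rightarrow\; 3H^2\delta = 3H\delta = \delta \;\Rightarrow\; H\delta = \tfrac{1}{3}\delta,
% whence \tfrac{1}{2}\delta = \tfrac{1}{3}\delta, a contradiction. Any embedding of the
% distributions into a differential algebra must therefore give up one of these identities.
```

Colombeau-type constructions escape the contradiction by weakening strict equality (equality is replaced by the coarser relation of association); the algebra described in the abstract takes a related route via non-standard analysis.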
Building on the modified theory of planned behavior (TPB), this study examined the underlying psychological motives for academic dishonesty in a sample of 250 undergraduates drawn from three selected Malaysian public universities. The results yielded additional support for the usefulness of the modified TPB model in predicting academic misconduct. All components of the model exerted statistically significant effects on intention towards academic misconduct, and intention itself exerted a statistically significant impact on academic dishonesty. This suggests that students’ academic misconduct could be addressed effectively if proper attention is given to the underpinning factors. Further, the findings revealed that the hypothesized relationships among variables of the modified model were all statistically significant. The uniqueness of this study lies in the large amount of variance (69% and 75%) explained by components of the model (in the prediction of intention and academic dishonesty, respectively). Such variance has rarely been accounted for in previous studies. Implications of the findings are discussed and suggestions advanced for future studies.
When assessing the cost effectiveness of health care programmes, health economists typically presume that distant events should be given less weight than present events. This article examines the moral reasonableness of arguments advanced for positive discounting in cost-effectiveness analysis, both from an intergenerational and an intrapersonal perspective, and assesses whether the arguments are equally applicable to health and monetary outcomes. The article concludes that behavioral effects related to time preferences give little or no reason why society at large should favour the present over the future when making intergenerational choices regarding health. The strongest argument for discounting stems from the combined argument of diminishing marginal utility in the presence of growth. However, this hinges on the assumption of actual growth in the relevant good. Moreover, modern democracy may be insufficiently sensitive to the concerns of future generations. The second part of the article categorises preference failures (which justify paternalistic responses) into two distinct groups, myopic and acratic. The existence of these types of preference failures makes elicited time preferences of little normative relevance when making decisions regarding the social discount rate, even in an intrapersonal context. As with intergenerational discounting, the combined arguments of growth and diminishing marginal utility offer the strongest arguments for discounting in the intrapersonal context. However, there is no prima facie reason to assume that this argument should apply equally to health and monetary values. To be sure, selecting an approach towards discounting health is a complex matter. However, the life-or-death implications of any approach require that the discussion not be downplayed to merely a technical matter for economists to settle.
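The discounting practice under discussion is simple to state: an outcome accruing t years from now is divided by (1 + r)^t for some annual rate r. A minimal sketch, in which the 3% rate and 30-year horizon are illustrative assumptions rather than values from the article:

```python
def discounted(value: float, rate: float, years: float) -> float:
    """Standard exponential discounting as used in cost-effectiveness analysis."""
    return value / (1 + rate) ** years

# 100 QALYs accruing 30 years from now, at an illustrative 3% annual rate:
print(round(discounted(100, 0.03, 30), 1))  # → 41.2
```

Even a modest positive rate more than halves the weight of health benefits a generation away, which is why the choice of rate is ethically loaded rather than a merely technical parameter.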
This study proposes a new quadrat method that can be applied to the study of point distributions in a network space. While the conventional planar quadrat method remains one of the most fundamental spatial analytical methods on a two-dimensional plane, its quadrats are usually identified by regular, square grids. However, assuming that they are observed along a network, points in a single quadrat are not necessarily close to each other in terms of their network distance. Using planar quadrats in such cases may distort the representation of the distribution pattern of points on a network. The network-based units used in this article, on the other hand, consist of subsets of the actual network, providing more accurate aggregation of the data points along the network. The performance of the network-based quadrat method is compared with that of the conventional quadrat method through a case study on a point distribution on a network. The χ2 statistic and Moran's I statistic of the two quadrat types indicate that (1) the conventional planar quadrat method tends to overestimate the overall degree of dispersion and (2) the network-based quadrat method derives a more accurate estimate of local similarity. The article also performs sensitivity analysis on network and planar quadrats across different scales and with different spatial arrangements, in which the abovementioned statistical tendencies are also confirmed.
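The Moran's I statistic used in the comparison above can be computed for any set of quadrat counts with an adjacency-based weights matrix. A minimal sketch, in which the counts and the four-quadrat line network are hypothetical, chosen only to show the computation:

```python
import numpy as np

def morans_i(counts, weights):
    """Moran's I for quadrat counts x and spatial weights w:
    I = (n / sum(w)) * sum_ij w_ij (x_i - mean)(x_j - mean) / sum_i (x_i - mean)^2
    """
    x = np.asarray(counts, dtype=float)
    w = np.asarray(weights, dtype=float)
    z = x - x.mean()  # deviations from the mean count
    return (len(x) / w.sum()) * (w * np.outer(z, z)).sum() / (z ** 2).sum()

# Four quadrats along a line network (1-2-3-4); w_ij = 1 for adjacent quadrats.
counts = [8, 6, 2, 1]
w = [[0, 1, 0, 0],
     [1, 0, 1, 0],
     [0, 1, 0, 1],
     [0, 0, 1, 0]]
print(round(morans_i(counts, w), 3))  # → 0.405: neighbouring quadrats hold similar counts
```

A positive value indicates local similarity (clustering) along the network, a negative value dispersion; for network quadrats the weights follow network adjacency rather than grid adjacency, which is the substitution the article's method makes.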
One of the most powerful tools in science is the analytic method, whereby we seek to understand complex systems by studying the simpler sub-systems of which the complex is composed. If this method is to be successful, something about the sub-systems must remain invariant as we move from the relatively isolated conditions in which we study them to the complex conditions in which we want to put our knowledge to use. This paper asks what this invariant could be. The paper shows that the kinds of thing that a Humean might point to – behaviour, laws, and dispositions – cannot play the role required of the invariant in question. Nor, indeed, can non-Humean causal powers of the kind advocated by contemporary metaphysicians such as Ellis and Lierse. The paper suggests that the analytic method presupposes a kind of entity that does not appear in standard ontologies – a metaphysically substantial notion of causal influence. This notion of causal influence is one that Cartwright has also seen the need for, though she does not seem to take the notion as seriously as she should.
John Hospers. By means of our senses, or so we ordinarily believe, we come to know of the existence of physical objects such as tables and trees, rocks and hills, stars and human bodies. But are our senses infallible? How do we know that ...