Defenders of the prosentential theory of truth claim that the English language contains prosentences which function analogously to their better known cousins – pronouns. Statements such as ‘That is true’ or ‘It is true’, they claim, inherit their content from antecedent statements, just as pronouns inherit their reference from antecedent singular terms. Prosentential theorists claim that the content of these prosentences is exhausted by the content of their antecedents. They then use the notion of the inheritance of content from an antecedent to explain the various functions of the predicate ‘. . . is true’. Defenders of the prosentential theory of truth are mistaken, I claim, in thinking that in order to oppose the view that ‘. . . is true’ is used to ascribe a substantive truth property to propositions they need to claim that no uses of ‘. . . is true’ ever attribute any property. I identify an ‘attributive’ use of prosentences in which reliability is implicitly attributed to a subject. I then use the capacity of prosentences to serve as implicit attributions of reliability as a basis for explicating the logical structure of explicit attributions of reliability. The identification of an attributive use of prosentences does not constitute a fundamental change in the prosentential theory.
The principal question asked in this paper is: in the case of attributive usage, is the definite description to be analyzed as Russell said or is it to be treated as a referring expression, functioning semantically as a proper name? It answers by defending the former alternative.
This paper enters the continuing fray over the semantic significance of Donnellan’s referential/attributive distinction. Some hold that the distinction is at bottom a pragmatic one: i.e., that the difference between the referential use and the attributive use arises at the level of speaker’s meaning rather than at the level of sentence- or utterance-meaning. This view has recently been challenged by Marga Reimer and Michael Devitt, both of whom argue that the fact that descriptions are regularly, that is standardly, used to refer defeats the pragmatic approach. The present paper examines a variety of issues bearing on the regularity in question: whether the regularity would arise in a Russellian language, whether the regularity is similar to the standard use of complex demonstratives, and whether the pragmatic approach founders on the problem of dead metaphors. I argue that the pragmatic approach can readily explain all of these facets of the referential use of descriptions.
The origins of these essays -- Introduction -- Presupposition -- A projection problem for speaker presupposition -- Language and linguistic competence -- Linguistics and psychology -- Semantics and psychology -- Semantics and semantic competence -- The necessity argument -- Truth, meaning, and understanding -- Truth and meaning in perspective -- Semantics and pragmatics -- Naming and asserting -- The gap between meaning and assertion : why what we literally say often differs from what our words literally mean -- Drawing the line between meaning and implicature, and relating both to assertion -- Descriptions -- Incomplete definite descriptions -- Donnellan's referential/attributive distinction -- Why incomplete descriptions don't refute Russell's theory of descriptions -- Meaning and use : lessons for legal interpretation -- Interpreting legal texts : what is and what is not special about the law.
Aristotle’s economic thinking in the Nicomachean Ethics 5.5 and Politics 1 provides one of the earliest analyses of the economic nature of exchange. Establishing the significance of Aristotle in this area has often led modern commentators to equate Aristotle’s descriptive analysis of use and exchange with the definitions of use-value and exchange-value as found in Karl Marx. In this article, I show that Aristotle’s understanding of use and exchange is qualitatively different from this interpretation, focusing in particular on the ethical nature of use and how, for Aristotle, exchange is an extension of practical deliberation.
A placebo is a substance or intervention believed to be inactive, but is administered by the healthcare professional as if it were an active medication. Unlike standard treatments, clinical use of placebo usually involves deception and is therefore ethically problematic. Our attitudes toward the clinical use of placebo, which inevitably includes deception or withholding information, have a tremendous effect on our practice regarding truth-telling and informed consent. A casual attitude towards it weakens the current practice based on shared decision-making and mutual trust between patients and healthcare professionals. Issues concerning the clinical use of placebo are thus intimately related to patient-provider relationships, the public's trust in medicine, and medical education. A review of recent survey studies suggests that the clinical use of placebo appears to be fairly well accepted among healthcare professionals and is common in clinical settings in various countries. However, we think that an ethical discussion is urgently needed because of its controversial nature. If judged to be ethically wrong, the practice should end. In the present paper, we discuss the ethicality of the clinical use of placebo with deception and argue against it, concluding that it is unethical and should be banned. We will show that most arguments in favor of the clinical use of placebo can be refuted and are therefore incorrect or weak. These arguments will be presented and examined individually. Finally, we will briefly consider issues relevant to the clinical use of placebo without deception.
The concept of dual-use encapsulates the potential for well-intentioned, beneficial scientific research to also be misused by a third party for malicious ends. The concept of dual-use challenges scientists to look beyond the immediate outcomes of their research and to develop an awareness of possible future (mis)uses of scientific research. Since 2001 much attention has been paid to the possible need to regulate the dual-use potential of the life sciences. Regulation initiatives fall under two broad categories—those that develop the ethical education of scientists and foster an awareness and responsibility of dual-use issues, and those which assess the regulation of information being generated by current research. Both types of initiatives are premised on a cautious, risk-averse philosophy which advocates careful examination of all future endpoints of research endeavors. This caution, advocated within initiatives such as pre-publication review of journal articles, contrasts with the obligation to share that underpins data sharing discussions. As the dual-use debate has yet to make a significant impact on data sharing discussions (and vice versa) it is possible that these two areas of knowledge control may present areas of ethical conflict for scientists, and thus need to be more closely examined. This paper examines the tension between the obligation to share exemplified by data sharing principles and the concerns raised by the risk-cautious culture of the dual-use debates. The paper concludes by reflecting on the issues of responsibility as raised by dual-use as relating to data sharing, such as the chain of custody for shared data.
Kripke made a good case that “…the phi…” is not semantically ambiguous between referential and attributive meanings, and many semanticists agree with Kripke. Russell says that “…the phi…” is always to be analyzed attributively. Agreeing with Kripke that “…the phi…” is not ambiguous, many semanticists have tried to give a Russellian analysis of the referential-attributive distinction: the gross deviations between what is communicated by “…the phi…”, on the one hand, and what Russell’s theory says it literally means, on the other, are chalked up to implicature. This paper shows that, when the phenomenon of implicature is scrutinized, there is overwhelming reason to doubt that a Russellian analysis can succeed. A positive, non-Russellian analysis is proposed: it is shown that, if definite descriptions are treated as referring expressions, it is easy to deal with the referential-attributive distinction. When “…the phi…” is functioning attributively, the definite description is seen as referring to some object described in an understood, antecedent existence claim.
Quotation exhibits characteristics of both use and mention. I argue against the recently popular pragmatic reductions of quotation to mere language use (Recanati 2001), and in favor of a truly hybrid account synthesizing and extending Potts (2007) and Geurts and Maier (2005), using a mention logic and a dynamic semantics with presupposition to establish a context-driven meaning shift. The main advantages are an account of error neutralization and shifted indexicality under quotation. The current paper addresses the problematic data involving quoted non-constituents.
Kaplan, Stalnaker and Wettstein all urge a two-stage theory of language whereon the propositions expressed by sentences are generated prior to being evaluated. A new ambiguity for sentences emerges, propositional rather than syntactic or semantic. Kaplan and Wettstein then propose to explain Donnellan's referential/attributive ambiguity as simply being two-stage propositional ambiguity. This is tacitly seen as further confirmation for two-stage theory. Modal ambiguities are prime motivators for two-stage theory which distinguishes local from exotic evaluation to explain them. But if sentences can be found which exhibit both modal and referential/attributive ambiguity, an apparent paradox arises for a two-stage account. The theory recognizes both singular and general propositions, in Kaplan's senses. But reflecting one sense of such a doubly ambiguous sentence, two-stage theory would seem to need a proposition both singular and general with respect to a definite description attributively used. Since modal operators will come into rendering the problem sentences, an obvious idea is to let scope distinctions rescue two-stage theory from the apparent paradox. But while a rescue based on multiple renderings is proposed, it is not strictly a scope rescue, though different scopes are involved. Readers are asked to trust the author on missing formalities of an intuitively transparent two-sorted modal language that is employed. Two-stage theorists explicitly oppose scope treatments of modal ambiguities seeing them as rivals. Stalnaker, in particular, argues against them. But his arguments are shown not to count against the proposed rescue, on which the anticipated rivalry proves to be minimal.
It has long been recognised that there are referential uses of definite descriptions. It is not as widely recognised that there are attributive uses of indexicals and other such paradigmatically singular terms. I offer an account of the referential/attributive distinction which is intended to give a unified treatment of both sorts of cases. I argue that the best way to account for the referential/attributive distinction is to treat it as semantically underdetermined which sort of proposition is expressed in a context. In certain contexts the proposition expressed will be a descriptive one, and in others it will be an object-dependent one. I appeal to Sperber and Wilson's (1986) idea that the recovery of the content of an utterance involves pragmatic processes of enrichment of a representation of the logical form of the utterance. According to the account I offer, the first-level descriptive meaning associated with an expression (whether this is an indexical or a definite description) is pragmatically enriched and then either used to track an individual in the context or taken to lay down a condition of satisfaction for an individual. The proposition that the listener takes the speaker to have expressed is recovered on the basis of considerations of relevance and contextually available information about the speaker's directive intentions. Although my account has affinities with those of Recanati (1993) and Nunberg (1993), it also differs from theirs in crucial ways. Each of these authors sees asymmetries where I see none. I give reasons for preferring my symmetrical account.
Keith Donnellan (1931 – ) began his studies at the University of Maryland, and earned his Bachelor’s degree from Cornell University. He stayed on at Cornell, earning a Master’s and a PhD in 1961. He also taught there for several years before moving to UCLA in 1970, where he is currently Emeritus Professor of Philosophy. Donnellan’s work is mainly in the philosophy of language, with an emphasis on the connections between semantics and pragmatics. His most influential work was his 1966 paper “Reference and Definite Descriptions”. In this paper, he challenges the canonical view, due to Bertrand Russell, about definite descriptions. Russell had argued that the proper semantic treatment of a definite description such as “the present king of France” was quantificational. Thus, a sentence like “the present king of France is bald” should be analyzed as “There exists one and only one entity x that is the present king of France, and x is bald”. Donnellan argues that in natural languages, there are actually two different kinds of uses of definite descriptions. Russell’s analysis picks out the “attributive” use of definite descriptions. When we use a definite description (“the F”) this way, we mean to make statements about the unique entity x that is F. However, Donnellan notes that we also sometimes use definite descriptions “referentially” to pick out a given entity and say something about it. To see this, imagine you are at a party where virtually everyone is drinking beer. However, you and your friend are observing a man in a corner of the room holding a martini glass. Unbeknownst to you, the man’s glass is filled with water. You turn to your friend and ask, “who is the man drinking a martini?” Suppose further that your friend knows that the man in question is Fred and that Fred’s glass is filled with water. According to the Russellian attributive analysis, such a question would amount to asking for the identity of the one and only one man drinking a martini.
But the presupposition that there is a man drinking a martini is false, and so there should be no answer to the question.
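Russell's quantificational analysis, as glossed in the entry above, can be written out in standard notation (a sketch only; the predicate letters K for "present king of France" and B for "bald" are my abbreviations, not notation from the entry):

```latex
% "The present king of France is bald" on Russell's analysis:
% there is exactly one present king of France, and it is bald.
\exists x \,\bigl( Kx \;\land\; \forall y\,(Ky \rightarrow y = x) \;\land\; Bx \bigr)
```

The martini case runs the same schema with "man drinking a martini" in place of K: since nothing satisfies the description, the existence-and-uniqueness clause fails, which is why the attributive reading leaves the question without an answer.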
This is a welcome opportunity to clarify my approach to referential uses of definite descriptions, as well as to highlight what I take to be the main shortcomings of the view that definite descriptions have referential meanings. Michael Devitt and I have previously debated referential uses in the course of stating our respective views (see our 2004 articles), but here in this issue we both aim to dispel certain misunderstandings and to sharpen our criticisms of the other’s views.1 Devitt recognizes that it was not enough to target the view that referential uses are akin to particularized conversational implicatures. So now he focuses on the view that they are akin to generalized conversational implicatures (GCIs). He argues that although in principle the GCI model could explain referential uses, it does not in fact provide the best explanation of them. He insists that the fact that definite descriptions are standardized for being used referentially is best explained on the supposition that, as a matter of semantic convention, they have referential meanings, in addition to the quantificational meanings given by Russell’s theory of descriptions. He acknowledges that this commits him to the view that the word ‘the’ is semantically ambiguous. Accordingly, recognizing a use as referential (or as attributive, for that matter) is not like recognizing a GCI but is more akin to, indeed is a case of, disambiguation. Devitt devotes a good part of his article to rebutting my account of referential uses. He challenges the GCI model and identifies a number of difficulties with my view, which he construes as based on that model. However, my account does not rely on that model. I do say that referential uses are “akin” to GCIs, but I did not mean that they are, or involve, GCIs. All I meant was that they too are cases of standardized uses, as opposed to…
Our ascriptions of content to utterances in the past attribute to them a level of determinacy that extends beyond what could supervene upon the usage up to the time of those utterances. If one accepts the truth of such ascriptions, one can either (1) argue that future use must be added to the supervenience base that determines meaning, or (2) argue that such cases show that meaning does not supervene upon use at all. The following will argue against authors such as Lance, Hawthorn and Ebbs that the first of these options is the more promising of the two. However, maintaining the supervenience thesis ultimately requires that the doctrine that use determines meaning be understood as 'normative' in two important ways. The first (more familiar) way is that the function from use to meaning must be of a sort that allows us to maintain a robust distinction between correct usage and actual usage. This first type of normativity is accepted by defenders of many more temporally restricted versions of the supervenience thesis, but the second sort of normativity is unique to theories that extend the supervenience base into the future. In particular, if meaning is partially a function of future use, we can understand other commitments we are often taken to have about meaning, particularly the commitment to meaning being 'determinate', as practical commitments that structure our linguistic practices rather than theoretical commitments that merely describe such practices.
It is not well known that in his Göttingen period (1900–1916) Edmund Husserl developed a kind of direct reference theory, anticipating, among other things, the distinction between referential and attributive use of a definite description, which was rediscovered by Keith Donnellan in 1966 and further analysed by Saul Kripke in 1977. This paper defends the claim that Husserl's idea of the mental act given voice to in an utterance sheds new light on that distinction and particularly on cases where semantic referent and speaker's referent diverge. It is argued that when embedded in a dynamic theory of intentionality, the idea of giving-voice-to allows for a pragmatic (as opposed to a purely semantic) analysis of such cases. In Section 1 an example involving a referentially used description is presented, and the view that descriptions that can be used both attributively and referentially are thus ambiguous is criticized. Section 2 is concerned with Husserl's discussion of a case where someone seems to demonstratively refer to something that he mistakes for something else. On the basis of this discussion, a dynamic conception of the intentional content (and referent) of the mental act given voice to in an utterance is developed. Section 3 applies this neo-Husserlian conception to the example described in Section 1. Finally, it is shown how this conception serves to elucidate the referential/attributive distinction.
The article is devoted to the nature of science. To what extent are science and mathematics affected by the society in which they are developed? Philosophy of science has accepted the social influence on science, but limits it only to the context of discovery (a "locational" approach). An opposite "attributive" approach states that any part of science may be so influenced. L. Graham is sure that even the mathematical equations at the core of fundamental physical theories may display social attributes. He has used the investigations of the famous Soviet physicist V. Fock on the General Theory of Relativity which were under the influence of Marxism. The goal of the article is to demonstrate: 1) Why Soviet science is not an appropriate subject-matter for testing the thesis of social constructivism, 2) That different levels of science and different stages in the development of science undergo social influences in different degrees ranging from very significant and unavoidable to absolutely trivial and easily eliminated.
Rights have been criticized as incorporating features that are antithetical to ecofeminism: rights are allegedly inherently adversarial, based on a conception of the person that fails to reflect women’s experience, biased in an illegitimate way toward humans rather than nonhumans, overly formal, and incapable of admitting the importance of emotion in ethics. Such criticisms are founded in misunderstandings of the ways in which rights operate and may be met by an adequate theory of rights. The notions of entitlement and immunity that flow from a conception of rights have great use and potential in environmental ethics. Nonetheless, our understanding of moral rights must be revised in order to realize this potential. The usual attribution of moral rights is structurally arbitrary because obligations arising from others’ rights are unjustifiably distinguished from other sorts of obligations for which the same sorts of justificatory bases obtain. Once this arbitrariness is recognized, there remains little reason not to extend a continuous framework of entitlement toward nonhuman animals and nature more generally. Reassessing moral rights according to a basic principle of respect delivers an integrated account of our moral obligations toward one another, and a satisfactory basis from which to account for our diverse obligations toward nonhuman animals and the environment.
America’s industries and families continue to forsake cities for suburban and rural environs, in the process leaving nonproductive lands (brownfields) and simultaneously removing greenfield land from agriculturally or biologically productive use. In spite of noteworthy exceptions, urban regions which once functioned as vital communities continue in economic and social decline. Discussion or debate about the problem (or, indeed, whether it is a problem at all) invokes systems of values which often are not articulated. Some attribute the urban exodus to departure from personal ethical norms (e.g., substance abuse, violence, welfare addiction) by urban residents, as though ethical decline is driving the phenomenon. Others take the exact opposite stance, that social and economic decline follow the departure of the economic base. There is no consensus on what government should do about the problem, or whether government should be involved at all. I present elements of a land-use ethic which can accommodate the foregoing. I argue that government is already involved in the brownfields problem because urban flight is facilitated by public policies which de facto subsidize the process. I further argue that the debate invokes key—but unexamined—assumptions regarding limits. Where there are few substitutes for resources and the social cost of exploitation is high, government intervention in the market is necessary; “value-free” economic approaches need to be supplemented by values concerning what ought to be, i.e., what is desirable for society.
This study investigates to what extent the amount of variation in a visual scene causes speakers to mention the attribute color in their definite target descriptions, focusing on scenes in which this attribute is not needed for identification of the target. The results of our three experiments show that speakers are more likely to redundantly include a color attribute when the scene variation is high as compared with when this variation is low (even if this leads to overspecified descriptions). We argue that these findings are problematic for existing algorithms that aim to automatically generate psychologically realistic target descriptions, such as the Incremental Algorithm, as these algorithms make use of a fixed preference order per domain and do not take visual scene variation into account.
This paper sketches a network of analogies reaching from linguosemiotics (including theory of reference in analytical philosophy of language) to biosemiotics. It results in the following proportion: attributive use of referring expressions : referential use of referring expressions : ‘generative’ use of referring expressions = signifying : referring : ‘poetic pointing’ = ‘functional’ semiosis : ‘adaptational’ semiosis : semiosis in the narrow sense.
A Russellian theory of (definite) descriptions takes an utterance of the form ‘The F is G’ to express a purely general proposition that affirms the existence of a (contextually) unique F: there is exactly one F [which is C] and it is G. Strawson, by contrast, takes the utterer to presuppose in some sense that there is exactly one salient F, but this is not part of what is asserted; rather, when the presupposition is not met, the utterance simply fails to express a (true or false) proposition. A defender of Strawson’s approach, however, must square up to what appear to be straightforward counterexamples to the presupposition thesis, and must also provide an account of certain linguistic phenomena that supposedly demand treating descriptions as quantifiers, as the Russellian theory does. In this paper I propose fresh considerations in favour of Strawson’s approach. I shift attention from what the utterer presupposes to preconditions for the use of descriptions, and distinguish between referring and predicative uses of descriptions (not to be confused with referential and attributive uses); importantly, the referring and predicative uses have different preconditions, I argue, and these provide some satisfactory responses to the aforementioned challenges facing the Strawsonian.
Morgan's canon can be construed as claiming that an intentional explanation of a behavior should be ruled out if there exists an explanation of this behavior in terms of 'lower' mechanisms. Unfortunately, Morgan's conception of higher and lower faculties is based on dubious evolutionary considerations. I examine alternative interpretations of the terms 'higher' and 'lower', and show that none can turn the canon into a principle that is both correct and useful in drawing the line between thinkers and non-thinkers. In the process, I identify a number of problems that an adequate formulation of the canon should avoid. I then consider two more recent versions of the canon, proposed by Elliott Sober and Jonathan Bennett. Both are found unsatisfactory, but I argue that a version of Bennett's unity condition that is restricted to the attribution of recognitional concepts is on the right track.
Standards of comparison in Greek can be marked either by a preposition or by use of the genitive case. The prepositional standards are compatible with both synthetic and analytic comparative forms, while genitive standards are found only with synthetic comparatives. I show that this follows if genitive case is assigned by the affix to its complement, and that this structure furthermore supports a straightforward semantic composition, both in predicative and attributive uses.
An ancient argument attributed to the philosopher Carneades is presented that raises critical questions about the concept of an all-virtuous Divine being. The argument is based on the premises that virtue involves overcoming pains and dangers, and that only a being that can suffer or be destroyed is one for whom there are pains and dangers. The conclusion is that an all-virtuous Divine (perfect) being cannot exist. After presenting this argument, reconstructed from sources in Sextus Empiricus and Cicero, this paper goes on to model it as a deductively valid sequence of reasoning. The paper also discusses whether the premises are true. Questions about the possibility and value of proving and disproving the existence of God by logical reasoning are raised, as well as ethical questions about how the cardinal ethical virtues should be defined.
The Internet has drastically changed how people interact, communicate, conduct business, seek jobs, find partners, and shop. Millions of people are using social networking sites to connect with others, and employers are using these sites as a source of background information on job applicants. Employers report making decisions not to hire people based on the information posted on social networking sites. Few employers have policies in place to govern when and how these online character checks should be used and how to ensure that the information viewed is accurate. In this article, we explore how these inexpensive, informal online character checks are harmful to society. Guidance is provided to employers on when and how to use these sites in a socially responsible manner.
According to Donnellan the characteristic mark of a referential use of a definite description is the fact that it can be used to pick out an individual that does not satisfy the attributes in the description. Friends and foes of the referential/attributive distinction have equally dismissed that point as obviously wrong or as a sign that Donnellan’s distinction lacks semantic import. I will argue that, on a strict semantic conception of what it is for an expression to be a genuine referential device, Donnellan is right: if a use of a definite description is referential, it has got to be possible for it to refer to an object independently of any attributes associated with the description, including those that constitute its conventional meaning.
There is currently much concern over the use of pharmaceuticals and other biomedical techniques to enhance athletic performance—a practice we might refer to as doping. Many justifications of anti-doping efforts claim that doping involves a serious moral transgression. In this article, I review a number of arguments in support of that claim, but show that they are not conclusive, suggesting that we do not have good reasons for thinking that doping is wrong.
The dictum that meaning is use, that for a word to have a meaning is for it to have a use, is typically presented as placing emphasis on the public nature of linguistic activity, as appropriately situating the notion of meaning in its characteristic context of communication, and more generally as dissuading us from a Cartesian conception of subjects as essentially cut off from one another in private realms. According to its proponents, the appeal to use promises to de-mystify meaning by suitably re-connecting talk of meaning with the familiar and concrete linguistic practices into which we are naturally habituated. Since its first airing, the claim that meaning is use has gained considerable currency. While perhaps not as popular as it once was, it is fair to say that, in one form or another, it is accepted by many prominent philosophers. Indeed, Brian Loar claims that contemporary ‘theory of meaning is divided into two: truth theories, and use theories’ (2006: 85; cf. Borg 2004: 4).2 It seems, then, that the ‘Homeric struggle’ Peter Strawson famously identified battles on (2004: 132).
There is abundant evidence of contextual variation in the use of “S knows p.” Contextualist theories explain this variation in terms of semantic hypotheses that refer to standards of justification determined by “practical” features of either the subject’s context (Hawthorne & Stanley) or the ascriber’s context (Lewis, Cohen, & DeRose). There is extensive linguistic counterevidence to both forms. I maintain that the contextual variation of knowledge claims is better explained by common pragmatic factors. I show here that one is variable strictness. “S knows p” is commonly used loosely to implicate “S is close enough to knowing p for contextually indicated purposes.” A pragmatic account may use a range of semantics, even contextualist. I use an invariant semantics on which knowledge requires complete justification. This combination meets the Moorean constraint as well as any linguistic theory should, and meets the intuition constraint much better than contextualism. There is no need for ad hoc error theories. The variation in conditions of assertability and practical rationality is better explained by variably strict constraints. It will follow that “S knows p” is used loosely to implicate that the conditions for asserting “p” and using it in practical reasoning are satisfied.
Increasingly, research in the field of business and society suggests that ethics and corporate social responsibility can be profitable. Yet this work raises a troubling question: Is it ethical to use ethics and social responsibility in a strategic way? Is it possible to be ethical or socially responsible for the wrong reason? In this article, we define a strategy concept in order to situate the different approaches to the strategic use of ethics and social responsibility found in the current literature. We then analyze the ethics of such approaches using both utilitarianism and deontology and end by defining limits to the strategic use of ethics.
Part I argues that the use theory in Horwich's Meaning does not give sufficient attention to the relation between language and thought. A development of the theory is proposed that gives explanatory priority to the mental. The paper also urges that Horwich's identification of a word's meaning by its role in explaining the cause of sentences should be broadened to include its role in explaining the linguistic and nonlinguistic behavior that sentences cause. Part II argues that Horwich greatly overstates the case for his use theory; that the arguments from ignorance and error against description theories of reference can be adapted against the use theory; and that a tempting development of the use theory would risk both the collapse of the theory into truth referentialism and the difficulties that have plagued truth referentialism. Finally, a consideration of our ordinary thought ascriptions provides evidence against any use theory.
Self-plagiarism requires clear definition within an environment that places integrity at the heart of the research enterprise. This paper explores the whole notion of self-plagiarism by academics and distinguishes between appropriate and inappropriate textual re-use in academic publications, while considering research on other forms of plagiarism such as student plagiarism. Based on the practical experience of the authors in identifying academics’ self-plagiarism using both electronic detection and manual analysis, a simple model is proposed for identifying self-plagiarism by academics.
The current debate in medical ethics on placebos focuses mainly on their use in health research. Whereas this is certainly an important topic, the discussion tends to overlook another longstanding but nevertheless highly relevant question, namely whether and how the placebo effect should be employed in clinical practice. This paper describes the way the placebo effect is perceived in modern medicine and offers some historical reflections on how these perceptions have developed; discusses elements of a definition of the placebo effect; and suggests some conditions under which making use of the therapeutic potential of the placebo effect can be ethically acceptable, if not warranted.
What I hope to achieve in this paper is some rather deeper understanding of the semantic and pragmatic properties of utterances which are said to involve the phenomenon of metalinguistic negation. According to Laurence Horn, who has been primarily responsible for drawing our attention to it, this is a special non-truthfunctional use of the negation operator, which can be glossed as 'I object to U', where U is a linguistic utterance. This is to be distinguished from descriptive truthfunctional negation, which operates over a proposition.
A number of prominent philosophers advance the following ideas: (1) Meaning is use. (2) Meaning is an intrinsically normative notion. Call (1) the use thesis, hereafter UT, and (2) the normativity thesis, hereafter NT. They come together in the view that for a linguistic expression to have meaning is for there to be certain proprieties governing its employment. These ideas are often associated with a third.
In his celebrated 'Good and Evil' (1956) Professor Geach fights a war on two fronts. On the one hand, he wants to establish, as against the nonnaturalists, that the predicative 'good', as used by Moore, is senseless. 'Good' when properly used is attributive. 'There is no such thing as being just good or bad, [that is, no predicative 'good'] there is only being a good or bad so and so'. (GE, page 65) The predicative 'good' is a philosopher's word, and we cannot be 'asked to take it for granted from the outset that a peculiarly philosophical use of words means anything at all'! (GE, page 67.) Attempts to define this phantom have foundered for the simple reason that there is really no such use to be defined. The search for a property for which 'good' stands - a 'way out of the Naturalistic Fallacy' - is a vain one. The idea that 'it' stands for a non-natural property is thus a pseudo-solution to a pseudo-problem. (GE, pages 66-67.) On the other hand, Geach insists, as against non-cognitivists, that good-judgements are entirely 'descriptive'. By a consideration of what it is to be an A, we can determine what it is to be a good A.
Kant's discussion of the feeling of respect presents a puzzle regarding both the precise nature of this feeling and its role in his moral theory as an incentive that motivates us to follow the moral law. If it is a feeling that motivates us to follow the law, this would contradict Kant's view that moral obligation is based on reason alone. I argue that Kant has an account of respect as feeling that is nevertheless not separate from the use of reason, but is intrinsic to willing. I demonstrate this by taking literally Kant's references to force in the second Critique. By referring to Kant's pre-critical essay on Negative Magnitudes (1763), I show that Kant's account of how the moral law effects in us a feeling of respect is underpinned by his view that the will is a kind of negative magnitude, or force. I conclude by noting some of the implications of my discussion for Kant's account of virtue.
We introduce insertion domains that support the placement of new, higher, vertices into finite trees. We prove that every nonincreasing insertion domain has an element with simple structural properties in the style of classical Ramsey theory. This result is proved using standard large cardinal axioms that go well beyond the usual axioms for mathematics. We also establish that this result cannot be proved without these large cardinal axioms. We also introduce insertion rules that specify the placement of new, higher, vertices into finite trees. We prove that every insertion rule greedily generates a tree with these same structural properties; and every decreasing insertion rule generates (or admits) a tree with these same structural properties. It is also necessary and sufficient to use the same large cardinals (in the precise sense of Corollary D.25). The results suggest new areas of research in discrete mathematics called "Ramsey tree theory" and "greedy Ramsey theory" which demonstrably require more than the usual axioms for mathematics.
This paper is on theoretical commitments involved in connecting use and meaning. Wittgenstein maintained, in his Philosophical Investigations, that meaning more or less 'is' use; and he more or less proclaimed that in philosophy, we must 'not advance any kind of theory' (PI § 109). He presented a connection between use and meaning by describing a sequence of language-games where richness of vocabularies and complexity of embedding behaviour grow simultaneously. This presentation is very impressive in the sequence of PI §§ 2, 8, 15, and 21, even if it needs sympathetic touching up. If supplemented, the presentation makes a convincing case for claiming that there is a connection between use and meaning in the following sense: Within everyday, innocent talk about meanings of expressions, all questions and controversies about meanings are ultimately to be answered and to be decided by appeal to correct descriptions of the expressions' use. This may be a very modest statement of the meaning-is-use connection. However, establishing even this modest statement requires that one implicitly relies on controversial, explanatory theories from philosophy of language, as sober analysis of the sequence presented by Wittgenstein will reveal. This is not to say that the modest statement is in any way fishy. Rather, I want to remind readers of how desirable it is to restrict the interpretation of Wittgenstein's famous hostile remarks on theories to the kind of metaphysical misunderstanding of our everyday language which the context of PI § 109 is about. In (1) I characterize, by way of listing examples from the Philosophical Investigations, the area of what I think Wittgenstein regarded as innocent, everyday meaning talk, talk that is not yet infected by bad philosophy. In (2), I argue that what Wittgenstein wanted to show was that such talk is in some sense replaceable by use descriptions, i.e. by descriptions of language-games.
In (3), I argue that not all kinds of language-games are relevant; in particular, those of teaching and explaining words have to be excluded. As I restrict myself to the four remaining 'primitive' language-games in PI §§ 2, 8, 15, and 21, I have to defend my approach, in (4), against Joachim Schulte's case for reading Wittgenstein's comparison of these language-games with real languages as ironical. How the invitation to regard such a language-game as a complete, primitive language should in fact be construed is a question I discuss in (5), defending my interpretation against Richard Raatzsch in particular. How increases of expressive power are brought about by increases of the use repertoires is shown by an analysis of modified versions of the language-games in question, and of alternatives thereof, in (6), (7), (8), and (9) respectively, pointing out the places where theoretical commitments enter. Section (10) sums up commitments that have emerged from a sympathetic defence of a modest reading of the meaning-and-use connection.
Starting with a discussion of what I call Koyré’s paradox of conceptual novelty, I introduce the ideas of Damerow et al. on the establishment of classical mechanics in Galileo’s work. I then argue that although the view of Damerow et al. on the nature of Galileo’s conceptual innovation is convincing, it misses an essential element: Galileo’s use of the experiments described in the first day of the Two New Sciences. I describe these experiments and analyze their function. Central to my analysis is the idea that Galileo’s pendulum experiments serve to secure the reference of his theoretical models in actually occurring cases of free fall. In this way Galileo’s experiments constitute an essential part of the meaning of the new concepts of classical mechanics.
The cognitive developmental theory of ethics suggests that there is a positive relationship between ethical reasoning and ethical behavior. In this study, we trained a sample of accounting and finance students in performing competitive stock trading in our state-of-the-art trading room. The subjects then performed trading of stocks under two experimental conditions, insider information and no-insider information, where significant performance-based financial awards were at stake. We also administered the Defining Issues Test (DIT). Ethical behavior, as the dependent variable, was measured on a binary scale: whether or not the subjects used insider information for trading of stocks. Ethical reasoning, as measured by the DIT P-score, had a statistically significant effect on ethical behavior. The results have important implications for recruitment and training of professionals engaged in the use of financial markets for securities trading.
The philosophical debate over the compatibility between causal determinism and moral responsibility relies heavily on our reactions to examples. Although we believe that there is no alternative to this methodology in this area of philosophy, some examples that feature prominently in the literature are positively misleading. In this vein, we criticize the use that incompatibilists make of the phenomenon of 'brainwashing', as well as the Frankfurt-style examples favored by compatibilists. We provide an instance of the kind of thought experiment that is needed to genuinely test the hypothesis that moral accountability and causal determinism are compatible.
The relatively recent increase in empirical research conducted in business ethics has been accompanied by a growing literature which addresses its present shortcomings and continuing challenges. Particular attention has been focused on the difficulties of obtaining valid and reliable primary data. However, little or no attention has been paid to the use of secondary data. The aim of this paper is to stimulate the interest of business ethics researchers in using secondary data, either as a substitute or complement for primary data, bearing in mind both the benefits and shortcomings of doing so. It is suggested that secondary data not only offer advantages in terms of cost and effort, as conventionally described in research methods books, but also that in certain cases their use may overcome some of the difficulties that particularly afflict business ethics researchers in the gathering of primary data. In order to help business ethicists respond to this call for greater consideration of the potential offered by secondary data, the wide variety of forms that such data may take is indicated and a number of themes regarding their use discussed.
"Semantic Minimalists" hold that there are virtually no semantically context-sensitive expressions in English. In particular, they claim that the semantics for terms like "red", "tall", "ready", "every", or "know" are not (contrary to many popular semantic theories) context sensitive. While minimalism strikes many as obviously false, it will be argued here that the view is more plausible than commonly assumed if one accepts the 'normative' conception of the relation between meaning and use characteristic of the literature on semantic externalism.
There is widespread controversy about the use of intuitions in philosophy. In this paper I will argue that there are legitimate concerns about this use, and that these concerns cannot be fully responded to using the traditional methods of philosophy. We need an understanding of how intuitions are generated and what it is they are based on, and this understanding must be founded on the psychological investigation of the mind. I explore how a psychological understanding of intuitions is likely to impact a range of philosophical projects, from conceptual analysis to the study of (non-conceptual) "things themselves" to experimental philosophy.
Thought experiments have played a pivotal role in many debates within ethics, and in particular within applied ethics, over the past 30 years. Nonetheless, despite their having become a commonly used philosophical tool, there is something odd about the extensive reliance upon thought experiments in areas of philosophy, such as applied ethics, that are so obviously oriented towards practical life. Herein I provide a moderate defence of their use in applied philosophy against three objections. I do not defend all possible uses of thought experiments but suggest that we should distinguish between legitimate and illegitimate uses. Their legitimate uses are determined not so much by the modal content of any actual thought experiment itself, but by the extent to which the argument in which it is nested follows basic tenets of informal logic and respects the fundamental contingency of applied ethical problems. In pursuing these ideas, I do not so much provide a set of criteria for their legitimate use, but more modestly present two significant ways in which their use can go awry.
One of Frege's most characteristic ideas is his conception of truth-values as objects. On his account (from 1891 onwards), concepts are functions that map objects onto one of the two truth-values, the True and the False. These two truth-values are also seen as objects, an implication of Frege's sharp distinction between objects and functions. Crucial to this account is his use of function-argument analysis, and in this paper I explore the relationship between this use and his introduction of truth-values as objects. In the first section I look at Frege's use of function-argument analysis in his first work, the Begriffsschrift, and stress the importance of the idea that such a use permits alternative analyses. In the second section I examine his early notion of conceptual content, and argue that there is a problem in understanding that notion once alternative analyses are allowed. In the third section I turn to his key 1891 paper, 'Function and Concept', where the idea of truth-values as objects first appears, and consider its motivation. In the concluding section I comment on Frege's general philosophical approach, which allowed objects to be readily 'analyzed out' in transforming one sentence into another.
According to Horwich’s use theory of meaning, the meaning of a word W is engendered by the underived acceptance of certain sentences containing W. Horwich applies this theory to provide an account of semantic stipulation: Semantic stipulation proceeds by deciding to accept sentences containing an as yet meaningless word W. Thereby one brings it about that W gets an underived acceptance property. Since a word’s meaning is constituted by its (basic) underived acceptance property, this decision endows the word with a meaning. The use-theoretic account of semantic stipulation contrasts with the standard view that semantic stipulation proceeds by assigning the meaning (reference) to W that makes a certain set of sentences express true propositions. In this paper I will argue that the use-theoretic account does not work. I take Frege to have already made the crucial point: "a definition does not assert anything but lays down something ['etwas festsetzt']" (Frege 1899, 36). A semantic stipulation for W cannot be the decision to accept a sentence containing W or be explained in terms of such an acceptance. Semantic stipulation constitutes a problem for Horwich's use theory of meaning, especially his basic notion of acceptance.
Wittgenstein remarked 'What we do is to bring words back from their metaphysical to their everyday use' (PI §116). On this basis, his 'later philosophy' is generally regarded as a version of 'ordinary language philosophy'. He is taken to criticize philosophers for making ('metaphysical') statements which deviate in different ways from the everyday use of some of their component expressions. I marshal textual evidence for another reading of this remark, and show that he used 'metaphysical' in a traditional way, namely, to describe philosophical attempts to delineate the essence of things by establishing necessities and impossibilities. On his conception, 'everyday' simply means 'non-metaphysical' (in this precise sense). Comparisons of philosophical utterances with non-philosophical uses of words are meant to call attention to this crucial distinction.
In this paper I will address whether the restriction on the creation of human embryos solely for the purpose of research in which they will be used and destroyed in the creation of human stem cell lines is ethically justified. Of course, a cynical but perhaps accurate reading of the new Obama policy is that leaving this restriction in place was done for political, not ethical, reasons, in light of the apparent public opposition to creating embryos for use in this research. But the issue of whether the restriction is ethically justified remains important, even if only for another day in the policy arena.
Making use of capital to develop China’s socialist market economy requires China not only to fully recognize the tendency of capital civilization but also to realize its intrinsic limitations and to seek conditions and a path for overcoming contradictions in the mode of capitalist production. Karl Marx’s theory of capital provides us with a key to understanding and dealing properly with problems of capital. At the same time we should also pay heed to Western research on, experience with, and lessons from capitalist economies developed over the past four centuries summarized in the field of “business ethics”.
The emerging concern about software piracy and illegal or unauthorized use of information technology and software has been evident in the media and open literature for the last few years. In the course of conducting their academic assignments, the authors began to compare observations from classroom experiences related to ethics in the use of software and information technology and systems. Qualitatively and anecdotally, it appeared that many, if not most, students had misconceptions about what represented ethical and unethical behaviors in these realms. Clearly, one can argue that if college students are uncertain about what constitutes appropriate and inappropriate behavior then this uncertainty will be carried forward into their workplaces upon graduation. Furthermore, if their workplaces don't provide ethics training as a component of a new employee orientation program, one can project a potential for unintentional violations and infringements of copyrights and law in the field. This study was conducted among graduate and undergraduate students to gain insight into their attitudes, perceptions and understanding of some of the relevant ethics issues. A questionnaire of 11 statements was employed that described ubiquitous but most likely unethical (or surely dubious) behaviors in the prevailing business and academic environments. Each respondent was asked to evaluate each statement twice (once for “self” and once for “colleague”) on a five-option highly ethical (5) to neutral (3) to highly unethical (1) scale. The statements were worded such that a lower instrument score was associated with higher ethical responses. The questionnaire's two-part structure was designed to solicit honest answers.
The encouraging finding from this study was that the overall sample and its various sub-samples did not consider any of the eleven behaviors to be “ethical” or “highly ethical.” It was also encouraging to note that the overall sample and all sub-samples considered “highly unethical” those behaviors associated with personal privacy or property or outright theft. This indicated that moral judgment and probity prevail. The discouraging finding was that behaviors associated with the use of enterprise property were viewed as “neutral,” i.e., neither “ethical” nor “unethical.” These findings suggested confusion and a lack of clarity and definition around workplace deportment as regards ethics in software and information technology use. The current study suggests that additional research needs to be conducted to define and clarify the issues, which in turn can form the basis for programs to rectify or at least ameliorate the situation.
Although arguments for and against competing theories of vagueness often appeal to claims about the use of vague predicates by ordinary speakers, such claims are rarely tested. An exception is Bonini et al. (1999), who report empirical results on the use of vague predicates by Italian speakers, and take the results to count in favor of epistemicism. Yet several methodological difficulties mar their experiments; we outline these problems and devise revised experiments that do not show the same results. We then describe three additional empirical studies that investigate further claims in the literature on vagueness: the hypothesis that speakers confuse ‘P’ with ‘definitely P’, the relative persuasiveness of different formulations of the inductive premise of the Sorites, and the interaction of vague predicates with three different forms of negation.
The idea that the use of instruments in science is theory‐dependent seems to threaten the extent to which the output of those instruments can act as an independent arbiter of theory. This issue is explored by studying an early use of the electron microscope to observe dislocations in crystals. It is shown that this usage did indeed involve the theory of the electron microscope but that, nevertheless, it was possible to argue strongly for the experimental results, the theory of dislocations being tested, and the theory of the instrument, all at the same time.
In Coming to Our Senses (1996), I argued for a certain truth-referential theory of meaning and against various other theories, both truth-referential and not. In this paper I shall consider some subsequent developments. I shall start by summarizing my theory. I will then consider some of the latest from direct-reference theorists, particularly from Scott Soames. Finally, I will consider the use theory proposed by Paul Horwich.
Is it permissible to use a human embryo in stem cell research, or in general as a means for the benefit of others? Acknowledging each embryo as an object of moral concern, Louis M. Guenin argues that it is morally permissible to decline intrauterine transfer of an embryo formed outside the body, and that from this permission and the duty of beneficence, there follows a consensus justification for using donated embryos in service of humanitarian ends. He then proceeds to show how this justification commands assent even within moral and religious views commonly thought to oppose embryo use. Beneath his moral reasoning lies a carefully constructed metaphysical foundation incorporating accounts of the ontology of development, embryos, and species. He also incisively discusses nonreprocloning, reprocloning, ectogenesis, and related scientific frontiers. This compelling philosophical study will interest all concerned to understand virtue and obligation in the relief of suffering.
Quote marks, I claim, serve to select from the multiple ostensions that are produced whenever any expression is uttered; they act to constrain pragmatic ambiguity or indeterminacy. My argument proceeds by showing that the proffered account fares better than its rivals: the Name, Description, Demonstrative, and Identity Theories. Along the way I shall need to explain and emphasize that quoting is not simply the same thing as mentioning. Quoting, but not mentioning, relies on the use of conventional devices.
Moral issues have been included in the studies of consumer misbehavior research, but little is known about the joint moderating effect of moral intensity and moral judgment on the consumer’s use intention of pirated software. This study aims to understand the consumer’s use intention of pirated software in Taiwan based on the theory of planned behavior (TPB) proposed by Ajzen (Organizational Behavior and Human Decision Processes, 50, 179, 1991). In addition, moral intensity and moral judgment are adopted as a joint moderator to examine their combined influence on the proposed research framework. The results obtained from this Taiwan case reveal that the antecedent constructs proposed in the TPB model–an individual’s attitude and subjective norms toward using pirated software, and perceived behavioral control to use pirated software–indeed have positive impacts on the consumer’s use intention of pirated software. In addition, the joint moderating effect of moral intensity and moral judgment is manifested in the consumer’s use intention of pirated software. The results of this study not only could substantiate the results of consumer misbehavior research, but also could provide some managerial suggestions for Taiwanese government authorities concerned and the related software industries devoted to fighting pirated software.
As software piracy continues to be a threat to the growth of national and global economies, understanding why people continue to use pirated software and learning how to discourage the use of pirated software are urgent and important issues. In addition to applying the theory of planned behavior (TPB) perspective to capture behavioral intention to use pirated software, this paper considers perceived risk as a salient belief influencing attitude and intention toward using pirated software. Four perceived risk components related to the use of pirated software (performance, social, prosecution and psychological risks) have been identified, measured and tested. Data were collected through an online survey of 305 participants. The results indicate that perceived prosecution risk has an impact on intention to use pirated software, and perceived psychological risk is a strong predictor of attitude toward using pirated software. In addition, attitude and perceived behavior control contribute significantly to the intended use of pirated software. However, the proposed direct relationship between subjective norm and intention to use pirated software is not supported. Implications for research and practice are discussed.
At the heart of natural language processing is the understanding of context dependent meanings. This paper presents a preliminary model of formal contexts based on situation theory. It also gives a worked-out example to show the use of contexts in lifting, i.e., how propositions holding in a particular context transform when they are moved to another context. This is useful in NLP applications where preserving meaning is a desideratum.
The contributions of adolescent and parent perspectives to ethical planning of survey research on youth drug use and suicide behaviors are highlighted through an empirical examination of 322 7th-12th graders' and 160 parents' opinions on questions related to 4 ethical dimensions of survey research practice: (a) evaluating research risks and benefits, (b) establishing guardian permission requirements, (c) developing confidentiality and disclosure policies, and (d) using cash incentives for recruitment. Generational and ethnic variation in response to questionnaire items developed from discussions within adolescent and parent focus groups are described. The article concludes with a discussion of the potential contributions and challenges of adolescent and parent perspectives for planning scientifically valid and ethically responsible youth risk survey research.
This paper contains a critical discussion of Paul Horwich’s use theory of meaning. Horwich attempts to dissolve the problem of representation through a combination of his theory of meaning and a deflationism about truth. I argue that the dissolution works only if deflationism makes strong and dubious claims about semantic concepts. Horwich offers a specific version of the use theory of meaning. I argue that this version rests on an unacceptable identification: an identification of principles that are fundamental to an explanation of the acceptance of sentences with principles that are fundamental to meaning.
Goffman makes considerable use of the metaphor of social life as theater. This metaphor has a significant impact on his thought in three areas: 1) it is central to his changing views about cynicism and trust in everyday life; 2) metaphor in general is a method of sociological inquiry; and 3) metaphor suggests a "limit" that his later work attempts to transcend.
Inferentialist accounts of concept possession are often supported by examples in which rejection of some inference seems to amount to rejection of some concept, with the apparently implausible consequence that anyone who rejects the inference cannot so much as understand those who use the concept. This consequence can be avoided by distinguishing conditions necessary for direct uses of a concept (to describe the non-cognitive world) from conditions necessary for content-specifying uses (to specify what someone thinks or says). I consider how this claim about the non-uniformity of concept possession accords with different theories of attitude ascription and with claims about reverse compositionality. Surprisingly little stands in the way of the claim that someone unable to use a concept directly can nevertheless satisfy conditions for using it in a content-specifying thought.
Use theories of meaning (UTMs) seem ill-equipped to accommodate the intuition that ignorant but deferential speakers use natural kind terms (e.g. 'zinc') and technical expressions (e.g. 'credit default swap') with the same meanings as the experts do. After all, their use deviates from the experts', and if use determines meaning, a deviant use ordinarily would determine a deviant meaning. Yet the intuition is plausible and advocates of UTMs believe it can be accommodated. I examine Gilbert Harman's and Paul Horwich's views, and argue that they do not offer a satisfactory reconciliation of the intuition with the theory. I propose an accommodation based on a novel account of semantic deference, and show that it is consistent with UTMs that a speaker may use a word with a certain meaning without fully or adequately knowing it.
Peter Geach’s distinction between logically predicative and logically attributive adjectives has gained a certain currency in philosophy. For all that, no satisfactory explanation of what an attributive adjective is has yet been provided. We argue that Geach’s discussion suggests two different ways of understanding the notion. According to one, an adjective is attributive just in case predications of it in combination with a noun fail to behave in inferences like a logical conjunction of two separate predications. According to the other, an adjective is attributive just in case it cannot be applied in a truth-value-yielding fashion unless combined with a noun. The latter way of understanding the notion has been largely neglected by Geach’s critics, but we argue that taking account of it shows the misguided nature of some of their objections, and also yields a more satisfactory explanation of attributivity than does the other.
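The first, inferential way of understanding the notion can be displayed schematically; the following is a standard textbook illustration of Geach's test (the "big flea" example is Geach's own stock case, though the schematic notation here is not taken from the paper under review):

```latex
% Predicative adjectives allow the predication to split into a conjunction:
%   'x is a red car' entails 'x is red' and 'x is a car'.
x \text{ is a red car} \;\vdash\; (x \text{ is red}) \wedge (x \text{ is a car})

% Attributive adjectives block the split:
%   'x is a big flea' does not entail 'x is big' simpliciter,
%   since a big flea is still a small animal.
x \text{ is a big flea} \;\nvdash\; (x \text{ is big}) \wedge (x \text{ is a flea})
```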
The origin of paraconsistent logic is closely related with the argument 'from the assertion of two mutually contradictory statements, any other statement can be deduced'; this can be referred to as ex contradictione sequitur quodlibet (ECSQ). Despite its medieval origin, only by the 1930s did it become the main reason for the unfeasibility of having contradictions in a deductive system. The purpose of this article is to study what happened earlier: from Principia Mathematica to that time, when it became well established. The two main historical claims that I am going to advance are the following: (1) the first explicit use of ECSQ as the main argument for supporting the necessity of excluding any contradiction from deductive systems is to be found in the first edition of the book Grundzüge der Theoretischen Logik (Hilbert, D. and Ackermann, W. 1928. Grundzüge der Theoretischen Logik. Berlin: Julius Springer Verlag); (2) Łukasiewicz's position regarding the logical constraints against contradictions varies considerably between his studies on the principle of (non-)contradiction in Aristotle, published in 1910, and what is stated in his 'authorized lecture notes' on mathematical logic that appeared in 1929. The two 1910 texts are: 1) a paper in German (Łukasiewicz, J. 1910. 'Über den Satz des Widerspruchs bei Aristoteles'. Bulletin International de l'Académie des Sciences de Cracovie, Classe d'Histoire et de Philosophie, pp. 15-38) [English translation: Łukasiewicz, J. 1971. 'On the principle of contradiction in Aristotle', Review of Metaphysics, XXIV, 485-509]; and 2) a book in Polish (Łukasiewicz, J. 1910. O zasadzie sprzeczności u Arystotelesa. Studium krytyczne. Warsaw: Państwowe Wydawnictwo Naukowe) [German translation: Łukasiewicz, J. 1993. Über den Satz des Widerspruchs bei Aristoteles. Hildesheim: Georg Olms Verlag]. The lecture notes were then published as a book (Łukasiewicz, J. 1958. Elementy Logiki Matematycznej. 
Warszawa: Państwowe Wydawnictwo Naukowe [PWN]) and then translated into English (Łukasiewicz, J. 1963. Elements of Mathematical Logic. Oxford, New York: Pergamon Press/The Macmillan Company). The second half of this article will concentrate on Łukasiewicz's position on ECSQ. This will lead me to propose that to regard him as a forerunner of paraconsistent logic by virtue of those early writings is accurate only if his book published in Polish is considered, but not if the analysis is restricted to the paper originally published in German (as has been the case for the principal reconstructions of the history of paraconsistent logic). Furthermore, I will stress that in the 1929 book he presented one formalization of ECSQ as an axiom for sentential calculus and also used ECSQ to defend the necessity of consistency, apparently independently of Hilbert and Ackermann's book. At the end, I will suggest that the effect of twentieth-century usage of ECSQ was to shift the centuries-long philosophical discussion about contradictions towards a more 'technical' one. But once paraconsistent logic is viewed as a technical solution to this restriction, the philosophical problem revives, now with an improved understanding of it at one's disposal. Finally, Łukasiewicz's two different positions on ECSQ open an interesting question about the history of paraconsistent logic: do we have to attempt a consistent reconstruction of it, or are we prepared to admit inconsistencies within it?
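Schematically, ECSQ says that a contradiction entails any statement whatsoever. A standard natural-deduction derivation runs as follows (a textbook reconstruction, not the particular axiomatic formalization discussed in the article):

```latex
\begin{aligned}
1.\quad & A \wedge \neg A && \text{assumption}\\
2.\quad & A               && \wedge\text{-elimination, 1}\\
3.\quad & \neg A          && \wedge\text{-elimination, 1}\\
4.\quad & A \vee B        && \vee\text{-introduction, 2}\\
5.\quad & B               && \text{disjunctive syllogism, 3, 4}
\end{aligned}
```

Paraconsistent logics block this derivation, typically by rejecting disjunctive syllogism (or, in some systems, unrestricted disjunction introduction), so that a contradiction no longer trivializes the system.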
Introduction -- The common hellenic meaning of "genus" -- The pollachos legomena, or things said in many ways -- Genus in the explanation of change : the subject and substratum principles -- To what is Aristotle's theory of change a response? : the pre-socratic and platonic background -- Change : the principles of nature in physics I -- A first mention of matter and form -- Genus in the explanation of change : the definition of change -- Aristotle's definition of change : physics III -- The circularity objections -- The advantages of Aristotle's theory -- The use of genus in change -- Genus in the explanation of change : generation, "for man begets man" (1032a24) -- Generation -- Change and genus -- The generation of animals as organic substances -- Animal generation and the mule -- Genus in definitions : the Aristotelian and platonic division of a genus -- "What is definition?" -- Platonic division and definition -- Aristotle's use of genus and animal taxonomy -- The use of "genus" in PA I -- Analogy vs. the more and the less -- Taxonomy : the megista gene -- Genus in definitions : why Aristotle was a realist -- Division and definition in Aristotle -- Causal definitions, substantial definitions, and definitions by matter and form -- The unity of definition and division -- The use of genus in definition -- Case study I : the definition of the psyche -- On matter as substratum -- On matter : the domain problem -- On matter : is it substance? -- On matter : potentiality -- The elements : is ontological reduction possible? -- Proper matter and generation revisited -- The indeterminacy vs. nature problem -- On genus as matter -- The analogy interpretation : Aristotle's mention of genus as matter -- The literal interpretation : Aristotle's use of genus as matter -- The principal unity of Aristotle's thought.
Joan Weiner has recently claimed that Frege neither uses, nor has any need to use, a truth-predicate in his justification of the logical laws. She argues that because of the assimilation of sentences to proper names in his system, Frege does not need to make use of the Quinean device of semantic ascent in order to formulate the logical laws, and that the predicate ‘is the True’, which is used in Frege's justification, is not to be considered as a truth-predicate, because it does not apply to true sentences or true thoughts. The present paper aims to show that Frege needs to use, and does use, a truth-predicate in this context. It is argued, first, that Frege needs to use a truth-predicate in order to show that the truth of the logical laws is evident from the senses of the sentences by means of which they are formulated, and second, that the predicate that he actually uses, ‘is the True’, must be considered as a truth-predicate in the relevant sense, because it can be used and is actually used by Frege to explain the truth-conditions of thoughts. To defend this interpretation, it is discussed whether the explanatory use of ‘is the True’ in Frege's system is compatible with his deflationary analysis of ‘true’. The paper's conclusion is that there is indeed a conflict here; but, from Frege's point of view, this conflict is due merely to the logical imperfection of natural language and does not affect the proper system but only its propaedeutic.
While most theoreticians of meaning in the first half of the twentieth century subscribed to a representational theory (viewing meanings as entities stood for by the expressions), the second half of the century was marked by the rise of various versions of use-theories of meaning. The roots of this ‘pragmatist turn’ are detectable in the writings of the later Wittgenstein, the Oxford speech act theorists (Austin, Grice) and the American neopragmatists (Quine, Sellars). Though it is now rather popular (and sometimes even fashionable) to invoke the use-theory of meaning, it is far less popular to inquire what such a theory really is. In this paper we try to give at least a part of the answer, and we find that the usual conception of such a theory is unsatisfactory. We propose that for an improvement we must, together with Wittgenstein and Sellars, conceive of language as a (tool of a) rule-based activity, which enables us to replace the concept of disposition, usually constituting the backbone of the use-theory, with the concept of propriety. The resulting normative version of the use-theory then becomes the investigation of the rules which expressions acquire vis-à-vis the rules of the relevant language games – especially of the rules of inference.
Making use of facilitating payments is a very widespread form of corruption. These consist of small payments or gifts made to a person – generally a public official or an employee of a private company – to obtain a favour, such as expediting an administrative process; obtaining a permit, licence or service; or avoiding an abuse of power. Unlike the worst forms of corruption, facilitating payments do not usually involve an outright injustice on the part of the payer as they are entitled to what they request. This may be why public opinion tends to condone such payments; often they are assumed to be unavoidable and are excused on the grounds of low wages and lack of professionalism among public officials and disorganisation in government offices. Many companies that take the fight against “grand” corruption very seriously are inclined to overlook these “petty” transgressions, which are seen as the grease that makes the wheels of the bureaucratic machine turn more smoothly. Despite this, facilitating payments have a pernicious effect on the working of public and private administrations: all too often they are the slippery slope to more serious forms of corruption; they impose additional costs on companies and citizens; and in the long run they sap the ethical foundations of organisations. Although many articles on corruption mention facilitating payments, there have been no systematic studies from a company’s point of view. This article thus focuses on facilitating payments from the point of view of the company that makes the payment, either as the active partner (when it is the company that takes the initiative) or as the passive partner (when the official or employee is the instigator).
Discussions about biotechnology tend to assume that it is something to do with genetics or manipulating biological processes in some way. However, the field of biometrics – the measurement of physical characteristics – is also biotechnology and is likely to affect the lives of more people more quickly than any other form. The possibility of social exclusion resulting from the use of biometric data for such uses as identity cards has not yet been fully explored. Social exclusion is unethical, as it unfairly discriminates against individuals or classes of people. This article looks at some of the ways in which social exclusion might arise from the use of biometric data, and introduces a model of balancing individual interests with which to analyse whether it is justified to run the risk of excluding some members of society for the benefit of others.
Tool use rivals language as an important domain of cognitive phenomena, and so as a source of insight into the nature of cognition in general. But the favoured current definition of tool use is inadequate because it does not carve the phenomena of interest at the joints. Heidegger's notion of equipment provides a more adequate theoretical framework. But Heidegger's account leads directly to a non-individualist view of the nature of cognition. Thus non-individualism is supported by concrete considerations about the nature of tools and tool use.
That both language and novel life-history stages are unique to humans is an interesting datum. But failure to distinguish between language and language use results in an exaggeration of the language acquisition period, which in turn vitiates claims that new developmental stages were causative factors in language evolution.
The Human Tissue Act 2004 in the United Kingdom clearly represents not a principled approach but instead a compromise, a pragmatic approach which balances several different ethical considerations against each other. With regard to the use of tissue in research, it has left many of the more difficult decisions to be made by research ethics committees on a case-by-case basis. In particular, it is now the role of research ethics committees to decide whether research can be carried out using human tissue where no consent was given for the use of this tissue in research. Likewise, research ethics committees are now charged with approving human tissue banks, which then need no further ethical approval to carry out research solely using tissue from that bank. There has, however, been little guidance regarding the decisions these committees must make. This paper aims to delineate these decisions and offer some philosophical guidance to research ethics committees in making them.
The criminal law depends upon 'commonsense' or 'folk' psychology, a seemingly innate theory used by all normal human beings as a means to understand and predict other humans' behavior. This paper discusses two major types of arguments that commonsense psychology is not a true theory of human behavior, and thus should be eliminated and replaced. The paper argues that eliminativist projects fail to provide evidence that commonsense psychology is a false theory, and argues that there is no need to seek a replacement theory of behavior for use in the criminal law.
In a recent article in this journal, Federica Russo and Jon Williamson argue that an analysis of causality in terms of probabilistic relationships does not do justice to the use of mechanistic evidence to support causal claims. I will present Ronald Giere's theory of probabilistic causation, and show that it can account for the use of mechanistic evidence (both in the health sciences—on which Russo and Williamson focus—and elsewhere). I also review some other probabilistic theories of causation (of Suppes, Eells, and Humphreys) and show that they cannot account for the use of mechanistic evidence. I argue that these theories are also inferior to Giere's theory in other respects.
The intention here is to give a formal underpinning to the idea of ‘meaning-is-use’ which, even if based on proofs, is rather different from proof-theoretic semantics in the Dummett–Prawitz tradition. Instead, it is based on the idea that the meanings of logical constants are given by the explanation of immediate consequences, which in formalistic terms means the effect of elimination rules on the result of introduction rules, i.e. the so-called reduction rules. For that we suggest an extension to the Curry–Howard interpretation which draws on the idea of labelled deduction, and brings back Frege’s device of variable-abstraction to operate on the labels (i.e., proof-terms) alongside formulas of predicate logic.
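The "effect of elimination rules on the result of introduction rules" can be illustrated with the familiar β-reduction for implication, written with proof-term labels alongside formulas (a standard Curry–Howard presentation; the notation below is generic rather than the authors' own labelled system):

```latex
% ->-introduction builds an abstraction; ->-elimination applies it.
% When an elimination is applied directly to the result of an
% introduction, the detour reduces away: this reduction is what
% 'explains the immediate consequences' of the connective.
\frac{\lambda x.\,b(x) : A \to B \qquad a : A}
     {\mathit{APP}(\lambda x.\,b(x),\, a) : B}
\qquad\qquad
\mathit{APP}(\lambda x.\,b(x),\, a) \;\triangleright_{\beta}\; b(a) : B
```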
A new view of the functional role of the left anterior cortex in language use is proposed. The experimental record indicates that most human linguistic abilities are not localized in this region. In particular, most of syntax (long thought to be there) is not located in Broca's area and its vicinity (operculum, insula, and subjacent white matter). This cerebral region, implicated in Broca's aphasia, does have a role in syntactic processing, but a highly specific one: It is the neural home to receptive mechanisms involved in the computation of the relation between transformationally moved phrasal constituents and their extraction sites (in line with the Trace-Deletion Hypothesis). It is also involved in the construction of higher parts of the syntactic tree in speech production. By contrast, basic combinatorial capacities necessary for language processing – for example, structure-building operations, lexical insertion – are not supported by the neural tissue of this cerebral region, nor is lexical or combinatorial semantics. The dense body of empirical evidence supporting this restrictive view comes mainly from several angles on lesion studies of syntax in agrammatic Broca's aphasia. Five empirical arguments are presented: experiments in sentence comprehension, cross-linguistic considerations (where aphasia findings from several language types are pooled and scrutinized comparatively), grammaticality and plausibility judgments, real-time processing of complex sentences, and rehabilitation. Also discussed are recent results from functional neuroimaging and from structured observations on speech production of Broca's aphasics. Syntactic abilities are nonetheless distinct from other cognitive skills and are represented entirely and exclusively in the left cerebral hemisphere. Although more widespread in the left hemisphere than previously thought, they are clearly distinct from other human combinatorial and intellectual abilities.
The neurological record (based on functional imaging, split-brain and right-hemisphere-damaged patients, as well as patients suffering from a breakdown of mathematical skills) indicates that language is a distinct, modularly organized neurological entity. Combinatorial aspects of the language faculty reside in the human left cerebral hemisphere, but only the transformational component (or algorithms that implement it in use) is located in and around Broca's area. Key Words: agrammatism; aphasia; Broca's area; cerebral localization; dyscalculia; functional neuroanatomy; grammatical transformation; modularity; neuroimaging; syntax; trace deletion.
Like nuclear energy, most technologies could have dual use – for health and well-being, and for disaster and terror. Some research publications have brought to the forefront the tragic consequences of the latter potential through their possible use. Monitoring life science research and development (R&D) to prevent possible misuse is a challenging task globally, more so in developing economies like India, which are emerging as major biotech hubs. As a signatory to the Biological and Toxin Weapons Convention, India has put in motion a process of evolving a series of measures to manage dual-use technology. The Indian Council of Medical Research (ICMR) has taken a lead in drafting model codes of conduct, ethics and practice for use by other S&T agencies to tailor them as per their requirements. Taking a cue from the discussions held by the editors of various medical and science journals in the developed world, the Indian Journal of Medical Research, the official publication of the ICMR, is working on a policy and uniform practice for the publication of dual-use research results. The Government of India too has promulgated legal provisions to minimize the risks of misuse of technology, such as the Weapons of Mass Destruction Act. Clearly, no single agency would be able to manage the dual use of technology effectively. Multiple agencies have to come together to work in tandem for effective implementation of the various measures and, like Janus, ensure that they are neither too restrictive nor too intrusive, so as not to discourage the development of science.
As computer-based information systems start to have a great impact on people, organizations, and society as a whole, there is much debate about information technology in relation to social control and privacy, security and reliability, and ethics and professional responsibilities. However, more often than not, these debates reveal some fundamental disagreements, sometimes about first principles. In this article the authors suggest that a fruitful and interesting way to conceptualize some of these moral and ethical issues associated with the use of information technology is to apply the principles of Aristotle's ethics to this topic. They argue that framing the moral and ethical choices associated with information technology in Aristotelian terms draws attention to the fact that there are fundamental dilemmas to be addressed. These dilemmas are discussed in relation to the four areas suggested by Dejoie, Fowler, and Paradice (1991): (a) privacy, (b) information accuracy, (c) access to information, and (d) intellectual property rights. The dilemmas associated with all four areas are illustrated with references to recent legal developments in Australia and New Zealand.
Using medical literature citations, Congressional hearings, and declassified documents, this paper examines the uses of pharmaceuticals in the interrogation of vulnerable populations. From the use of IV relaxants on criminal suspects during the 1920s to the Global War on Terror, the nexus of drugs, testing, and interrogations will be explored in both the domestic and international contexts.
Social exclusion and legal marginalization are important determinants of health outcomes for people who use illicit drugs, sex workers, and persons who face criminal penalties because of homosexuality or transgenderism. Incarceration may add to the health risks associated with police repression and discrimination for these persons. Access to legal services may be essential to positive health outcomes in these populations. Through concrete examples, this paper explores types of legal problems and legal services linked to health outcomes for drug users, sex workers, and sexual minorities and makes recommendations for donors, legal service providers, and civil society organizations.
Starting with a discussion of what I call `Koyré's paradox of conceptual novelty', I introduce the ideas of Damerow et al. on the establishment of classical mechanics in Galileo's work. I then argue that although their view on the nature of Galileo's conceptual innovation is convincing, it misses an essential element: Galileo's use of the experiments described in the first day of the Two New Sciences. I describe these experiments and analyze their function. Central to my analysis is the idea that Galileo's pendulum experiments serve to secure the reference of his theoretical models in actually occurring cases of free fall. In this way, Galileo's experiments constitute an essential part of the meaning of the new concepts of classical mechanics.
This paper discusses archaeological, historical, and contemporary ethnographic evidence for the use of the San Pedro cactus in northern Peru as a vehicle for traveling between worlds and for imparting the “vista” (magical sight) necessary for shamanic healers to divine the cause of their patients' ailments. Using iconographic, ethnohistorical, and ethnographic evidence for the uninterrupted use of this sacred plant as a means of access to the Divine and as a tool for healing, it describes the relationship between San Pedro, ancestor worship, and water/fertility cults, and also the common symbolic associations between San Pedro and wind-spirits. It closes by suggesting that the more than 2000-year time-depth of using this plant as a means for accessing the realms of Spirit and as a tool for healing should serve to challenge the unfortunate tendency in the contemporary United States to consider this plant a “recreational drug.”
Improvements in production methods over the last two decades have resulted in aquaculture becoming a significant contributor to food production in many countries. Increased efficiency and production levels are off-setting unsustainable capture fishing practices and contributing to food security, particularly in a number of developing countries. The challenge for the rapidly growing aquaculture industry is to develop and apply technologies that ensure sustainable production methods that will reduce environmental damage, increase productivity across the sector, and respect the diverse social and cultural dimensions of fish farming that are observed globally. The aquaculture industry currently faces a number of technology trajectories, which include the option to commercially produce genetically modified (GM) fish. The use of genetic modification in aquaculture has the potential to contribute to increased food security and is claimed to be the next logical step for the industry. However, the potential use of these technologies raises a number of important ethical questions. Using an ethical framework, the Ethical Matrix, this paper explores a number of the ethical issues potentially raised by the use of GM technologies in aquaculture. Several key issues have been identified. These include aspects of distributive justice for producers; use of a precautionary approach in the management of environmental risk and food safety; and impacts on the welfare and intrinsic value of the fish. There is a need to conduct a comparative analysis of the full economic cycle of the use of GM fish in aquaculture production for developing countries. There is also a need to initiate an informed dialogue between stakeholders and strenuous efforts should be made to ensure the participation of producers and their representatives from developing nations.
An additional concern is that any national licensing of the first generation of GM fish, i.e., in the USA, may initiate and frame an assessment cycle, mediated by the WTO, which could dominate the conditions under which the technology will be applied and regulated globally. Therefore, an integrated analysis of the technology development trajectories, in terms of international policy, IPR, and operational implications, as well as an analysis of a broader range of ethical concerns, is needed.
Race is a prominent category in medicine. Epidemiologists describe how rates of morbidity and mortality vary with race, and doctors consider the race of their patients when deciding whether to test them for sickle‐cell anemia or what drug to use to treat their hypertension. At the same time, critics of racial classification say that race is not real but only an illusion or that race is scientifically meaningless. In this paper, I explain how race is used in medicine as a proxy for genes that encode drug metabolizing enzymes and how a proper understanding of race calls into doubt the practice of treating race as a marker of any medically relevant genetic trait.
Abstract: Sperber, Cara, and Girotto (1995) argued that, in Wason’s selection task, relevance-guided comprehension processes tend to determine participants’ performance and pre-empt the use of other inferential capacities. Because of this, the value of the selection task as a tool for studying human inference has been grossly overestimated. Fiddick, Cosmides, and Tooby (2000) argued against Sperber et al. that specialized inferential mechanisms, in particular the “social contract algorithm” hypothesized by Cosmides (1989), pre-empt more general comprehension abilities, making the selection task a useful tool after all. We rebut this argument. We argue, and illustrate with two new experiments, that Fiddick et al. mix the true Wason selection task with a trivially simple categorization task superficially similar to the Wason task, yielding methodologically flawed evidence. We conclude that the extensive use of various kinds of selection tasks in the psychology of reasoning has been quite counterproductive and should be discontinued.
In the vast literature on human rights and natural law one finds arguments that draw on science or mathematics to support claims to universality and objectivity. Here are two such arguments: 1) Human rights are as universal (i.e., valid independently of their specific historical and cultural Western origin) as the laws and theories of science; and 2) principles of natural law have the same objective (metahistorical) validity as mathematical principles. In what follows I will examine these arguments in some detail and argue that both are misplaced. A section of the paper will be devoted to a discussion of arguments relying on the historical and cultural specificity (and intrinsic superiority) of Western science. The conclusion is that both science and mathematics offer little help to anyone wanting to make use of them as paradigms of universality, objectivity, and rationality. Finally, I will draw some consequences for the idea of human rights.
The epistemological status of semantic components of ethnosemantics is investigated with reference to Wittgenstein's definition of the meaning of a word as its use in language. Semantic components, like the intension of words in logistic philosophy, constitute the conditions which must pertain to objects in order that they are denoted by particular words. "Componential meaning" is determined to be another form of "unitary meaning" and hence subject to the same critical arguments made by Wittgenstein against the latter's three fundamental types: (1) meanings are objects, (2) meanings are images, and (3) meanings are feelings and mental experiences. A rebuttal to D'Andrade's labeling rule objection to the usage theory of meaning is presented.
This paper examines the relevance and importance of the large number of examples which Aristotle uses in his "Prior Analytics." In the first part of the paper three preliminary issues are raised: First, it investigates what counts as an example in Aristotle's syllogistic, and especially whether only examples expressed in concrete terms should count as examples, or also propositions and arguments formulated with letters of the alphabet. The second issue concerns the kinds of examples Aristotle actually uses, from everyday life as well as from various scientific and philosophical forms of discourse; among these, it seems that biological examples, rather than mathematical ones, have a predominant place. Finally, I discuss what Aristotle himself has to say about the use of examples, and in particular about the similarity between the use of an example and the use of induction. The second part of the paper focusses on the functions of Aristotle's logical examples. It is of course obvious that some of the examples in the Prior Analytics are used to illustrate, and thus to clarify, a definition, a logical rule, or a type of argument. However, I think that Aristotle's logical examples have another function, which is philosophically more interesting, namely as integral parts of the procedure of proving something. To support this claim, I analyse three passages from the "Prior Analytics" in which examples are used either in order to prove that something is not the case, i.e. as counter-examples, or in order to prove positively that it is possible for something to be the case. At the end, I argue that for such uses of examples Aristotle employs the notion of 'ekthesis', which seems to have a wider sense than usually suggested; that is to say, it is used to refer to any proof by means of an example, and not only to the procedure by which Aristotle reduces imperfect to perfect syllogisms.