In my commentary on Herman Cappelen and Ernie Lepore’s aptly titled book, Insensitive Semantics, I stake out a middle ground between their version of Semantic Minimalism and Contextualism. My kind of Semantic Minimalism does without the “minimal propositions” posited by C&L. It allows that some sentences do not express propositions, even relative to contexts. Instead, they are semantically incomplete. It is not a form of contextualism, since being semantically incomplete is not a way of being context-sensitive. In their reply to my commentary, C&L seem to miss this point. Exaggerating the force of their slippery slope argument, they continue to suppose that contextualism is the only alternative to their version of minimalism. They contend that I haven’t replied to their central criticism, but in fact they haven’t replied to mine.
The central issue of this essay is whether contextualism in epistemology is genuinely in conflict with recent claims that ‘know’ is not in fact a context-sensitive word. To address this question, I will first rehearse three key aims of contextualists and the broad strategy they adopt for achieving them. I then introduce two linguistic arguments to the effect that the lexical item ‘know’ is not context sensitive, one from Herman Cappelen and Ernie Lepore, one from Jason Stanley. I find these and related arguments quite compelling. In particular, I think Cappelen and Lepore (2003, 2005a) show pretty definitively that ‘know’ is not like ‘I’/‘here’/‘now’, and Stanley (2004) shows that ‘know’ is not like ‘tall’/‘rich’. One could try to find another model for ‘know’. Instead, I consider whether one can rescue ‘‘the spirit of contextualism in epistemology’’—that is, achieve its aims by deploying a strategy of appealing to speaker context—even while granting that ‘know’ isn’t a context-sensitive word at all. My conclusion, in a nutshell, is this: If there are pragmatic determinants of what is asserted/stated, and contextualism can overcome independent problems not having to do specifically with the context-sensitivity of the word ‘know’, then the spirit of contextualism can be salvaged, even though, for reasons sketched by the aforementioned authors, ‘know’ doesn’t actually belong in the class of context-sensitive words.
Despite all the attention given to Kant’s universalizability tests, one crucial aspect of Kant’s thought is often overlooked. Attention to this issue, I will argue, helps us resolve two serious problems for Kant’s ethics. Put briefly, the first problem is this: Kant, despite his stated intent to the contrary, doesn’t seem to use universalization in arguing for duties to oneself, and, anyway, it is not at all clear why duties to oneself should be grounded on a procedure that envisions a world in which everyone wills the contrary of those duties. The second, more global problem is that if we follow Barbara Herman in holding that Kantian ethics can provide a structure for moral deliberation, we need an interpretation of the universalization procedure that unproblematically allows it to generate something like prima facie duties to guide that deliberation; but it is not at all clear that we have such an interpretation. I argue here that if we expand our limited way of thinking about universalization, we can solve the first problem and work towards a solution to the second. We can begin by recalling that Kant’s Law of Nature formulation (FLN) of the Categorical Imperative obligates us to “act as if the maxim of your action were to become by your will a universal law of nature” (G, 421).
Privacy concerns involving data mining are examined in terms of four questions: (1) What exactly is data mining? (2) How does data mining raise concerns for personal privacy? (3) How do privacy concerns raised by data mining differ from those concerns introduced by traditional information-retrieval techniques in computer databases? (4) How do privacy concerns raised by mining personal data from the Internet differ from those concerns introduced by mining such data from data warehouses? It is argued that the practice of using data-mining techniques, whether on the Internet or in data warehouses, to gain information about persons raises privacy concerns that (a) go beyond concerns introduced in traditional information-retrieval techniques in computer databases and (b) are not covered by present data-protection guidelines and privacy laws.
This paper examines the question whether, and to what extent, John Locke’s classic theory of property can be applied to the current debate involving intellectual property rights (IPRs) and the information commons. The paper is organized into four main sections. Section 1 includes a brief exposition of Locke’s arguments for the just appropriation of physical objects and tangible property. In Section 2, I consider some challenges involved in extending Locke’s labor theory of property to the debate about IPRs and digital information. In Section 3, it is argued that even if the labor analogy breaks down, we should not necessarily infer that Locke’s theory has no relevance for the contemporary debate involving IPRs and the information commons. Alternatively, I argue that much of what Locke has to say about the kinds of considerations that ought to be accorded to the physical commons when appropriating objects from it – especially his proviso requiring that “enough and as good” be left for others – can also be applied to appropriations involving the information commons. Based on my reading of Locke’s proviso, I further argue that Locke would presume in favor of the information commons when competing interests (involving the rights of individual appropriators and the preservation of the commons) are at stake. In this sense, I believe that Locke offers us an adjudicative principle for evaluating the claims advanced by rival interests in the contemporary debate about IPRs and the information commons. In Section 4, I apply Locke’s proviso in my analysis of two recent copyright laws: the Copyright Term Extension Act (CTEA), and the Digital Millennium Copyright Act (DMCA). I then argue that both laws violate the spirit of Locke’s proviso because they unfairly restrict the access that ordinary individuals have previously had to resources that comprise the information commons.
Noting that Locke would not altogether reject copyright protection for IPRs, I conclude that Locke’s classic property theory provides a useful mechanism for adjudicating between claims about how best to ensure that individuals will be able to continue to access information in digitized form, while at the same time also allowing for that information to enjoy some form of legal protection.
This essay examines some ethical aspects of stalking incidents in cyberspace. Particular attention is focused on the Amy Boyer/Liam Youens case of cyberstalking, which has raised a number of controversial ethical questions. We limit our analysis to three issues involving this particular case. First, we suggest that the privacy of stalking victims is threatened because of the unrestricted access to on-line personal information, including on-line public records, currently available to stalkers. Second, we consider issues involving moral responsibility and legal liability for Internet service providers (ISPs) when stalking crimes occur in their `space' on the Internet. Finally, we examine issues of moral responsibility for ordinary Internet users to determine whether they are obligated to inform persons whom they discover to be the targets of cyberstalkers.
The purpose of this essay is to determine what exactly is meant by the claim that computer ethics is unique, a position that will henceforth be referred to as the CEIU thesis. A brief sketch of the CEIU debate is provided, and an empirical case involving a recent incident of cyberstalking is briefly considered in order to illustrate some controversial points of contention in that debate. To gain a clearer understanding of what exactly is asserted in the various claims about the uniqueness of computer ethics, and to avoid many of the confusions currently associated with the term ``unique'', a precise definition of that term is proposed. We then differentiate two distinct and radically different interpretations of the CEIU thesis, based on arguments that can be found in the relevant computer ethics literature. The two interpretations are critically analyzed and both are shown to be inadequate in establishing the CEIU thesis. We then examine and reject two assumptions implicit in arguments advanced both by CEIU advocates and their opponents. In exposing and rejecting these assumptions, we see why it is not necessary to accept the conclusions reached by either side in this debate. Finally, we defend the view that computer ethics issues are both philosophically interesting and deserving of our attention, regardless of whether those issues might also happen to be unique ethical issues.
The present study examines certain challenges that KDD (Knowledge Discovery in Databases) in general and data mining in particular pose for normative privacy and public policy. In an earlier work (see Tavani, 1999), I argued that certain applications of data-mining technology involving the manipulation of personal data raise special privacy concerns. Whereas the main purpose of the earlier essay was to show what those specific privacy concerns are and to describe how exactly those concerns have been introduced by the use of certain KDD and data-mining techniques, the present study questions whether the use of those techniques necessarily violates the privacy of individuals. This question is considered vis-à-vis a recent theory of privacy advanced by James Moor (1997). The implications of that privacy theory for a data-mining policy are also considered.
In _Insensitive Semantics_, Herman Cappelen and Ernie Lepore (C&L) defend Semantic Minimalism and Speech Act Pluralism. Semantic Minimalism concerns the effect of utterance context on _semantic_ content. It holds, in contrast to the views of a wide variety of linguists and philosophers of language, that this effect is limited to fixing the semantic value of the small number of expressions they argue are genuinely context-sensitive: uncontroversial indexicals, demonstratives, tense markers, and perhaps a few others. What’s more, according to C&L, once this context-sensitivity has been accounted for, a (disambiguated) sentence expresses a truth-evaluable proposition. Speech Act Pluralism concerns _speech act_ content: what a speaker says (asserts, claims, etc.) by a particular utterance of a sentence. Among its central claims are: first, that speech act content typically includes an indefinite range of propositions, as evidenced by the indefinite range of accurate indirect speech reports concerning a particular utterance (call this Basic Pluralism); and, second, that speakers do not have privileged access to what they say, nor must they believe what they sincerely say (call this the Controversial Aspect).
The present article focuses upon three aspects of computer ethics as a philosophical field: contemporary perspectives, future projections, and current resources. Several topics are covered, including various computer ethics methodologies, the `uniqueness' of computer ethics questions, and speculations about the impact of globalization and the internet. Also examined is the suggestion that computer ethics may `disappear' in the future. Finally, there is a brief description of computer ethics resources, such as journals, textbooks, conferences, and associations.
In this paper, we examine some ethical implications of a controversial court decision in the United States involving Verizon (an Internet Service Provider or ISP) and the Recording Industry Association of America (RIAA). In particular, we analyze the impacts this decision has for personal privacy and intellectual property. We begin with a brief description of the controversies and rulings in this case. This is followed by a look at some of the challenges that peer-to-peer (P2P) systems, used to share digital information, pose for our legal and moral systems. We then examine the concept of privacy to better understand how the privacy of Internet users participating in P2P file-sharing practices is threatened under certain interpretations of the Digital Millennium Copyright Act (DMCA) in the United States. In particular, we examine the implications of this act for a new form of “panoptic surveillance” that can be carried out by organizations such as the RIAA. We next consider the tension between privacy and property-right interests that emerges in the Verizon case, and we examine a model proposed by Jessica Litman for distributing information over the Internet in a way that respects both privacy and property rights. We conclude by arguing that in the Verizon case, we should presume in favor of privacy as the default position, and we defend the view that a presumption should be made in favor of sharing (rather than hoarding) digital information. We also conclude that in the Verizon case, a presumption in favor of property would have undesirable effects and would further legitimize the commodification of digital information – a recent trend that is reinforced by certain interpretations of the DMCA on the part of lawmakers and by aggressive tactics used by the RIAA.
This essay examines issues involving personal privacy and informed consent that arise at the intersection of information and communication technology (ICT) and population genomics research. I begin by briefly examining the ethical, legal, and social implications (ELSI) program requirements that were established to guide researchers working on the Human Genome Project (HGP). Next I consider a case illustration involving deCODE Genetics, a privately owned genetics company in Iceland, which raises some ethical concerns that are not clearly addressed in the current ELSI guidelines. The deCODE case also illustrates some ways in which an ICT technique known as data mining has both aided and posed special challenges for researchers working in the field of population genomics. On the one hand, data-mining tools have greatly assisted researchers in mapping the human genome and in identifying certain disease genes common in specific populations (which, in turn, has accelerated the process of finding cures for diseases that affect those populations). On the other hand, this technology has significantly threatened the privacy of research subjects participating in population genomics studies, who may, unwittingly, contribute to the construction of new groups (based on arbitrary and non-obvious patterns and statistical correlations) that put those subjects at risk for discrimination and stigmatization. In the final section of this paper I examine some ways in which the use of data mining in the context of population genomics research poses a critical challenge for the principle of informed consent, which traditionally has played a central role in protecting the privacy interests of research subjects participating in epidemiological studies.
This is a lively, provocative book and many of its arguments are convincing. In this critical study I summarize the book, then discuss some of the authors’ claims, dwelling on three issues: their objections to the view of François Recanati on “pre-semantic” effects; the relation between their theory of quotation and the Tarskian “Proper Name Theory,” which they reject; and their treatment of mixed quotation, which rests on the claim that quotation expressions are “syntactic chameleons.” I argue that the objections to Recanati don’t expose any problem with his view, and that the “Proper Name Theory” has all the virtues of their own proposal. Finally I raise some queries about the technical apparatus of syntactic chameleonism.
Cognition is readily seen to be connected to evolution through plots of the ratio of cranial capacity to body size of hominids which show two regions of sharply increasing ratios beginning at 2.5 and 0.5 million years ago – precisely the critical times inferred by the author from his study of tools. A similar correlation exists between current human brain growth spurts and the onsets of the Piagetian stages of reasoning development. The first goal of the author's target article is stated to be “to make a case for the relevance of archeological contributions to studies of the evolution of cognition” (sect. 1, Introduction). His analysis focuses on spatial cognition.
Consider Dretske's measles example (from page 74 of his Knowledge and the Flow of Information (MIT/Bradford: 1981)): since the question of whether Alice's being one of Herman's children carries the information that she has the measles is a question about conditional probabilities, we must be careful about our specification of the condition, the antecedent. Although we are to suppose that it is a true generalization that all of Herman's children have the measles, since that is a coincidence, we can just as well suppose that Alice is an only child with the measles. It is of course true that the conditional probability of Alice's having measles given that she has the measles is 1; but that is not relevant to the question Dretske raises. In Dretske's example, the question is whether Alice's being Herman's child carries the information that she has measles. And so the relevant condition in this example is simply Alice's being Herman's child. While it is in fact true that Alice has the measles, that isn't part of the condition: for the question is, "how probable is the one state of affairs given some other state of affairs?"
Pulvermüller's discussion needs more explanation of how the proposed assemblies remain assembled after formation and how they can be accessed later among all the possible assemblies, many of which involve many of the same neurons. Alternative Hebbian strengthening mechanisms may provide additional information, and developmental studies of the assemblies might provide insights into their evolution.
Philosophers on Education provides the most comprehensive history of philosophers' views and impacts on the direction of education, from Plato to Dewey. As Amelie Oksenberg Rorty explains, in describing a history of education we are essentially describing and gaining the clearest understanding of the issues that presently concern and divide us. Philosophical reflection on education has usually been directed to the education of rulers, to those who are presumed to preserve and transmit--or to redirect and transform--the culture of society, its knowledge and values. Every historical era is marked by a struggle among claimants to that power. It is only late in the history of liberal democracies that educational policy was formulated for and directed toward autonomous individuals who structure their own lives. The contributors to this collection recognize that history remains actively embedded and expressed in society's beliefs and practices, and that the study of the history of philosophy mandates reflection on its implications for education. The all new essays are written by some of the finest contemporary philosophers: Elizabeth Anderson, Annette C. Baier, Frederick B. Beiser, Eva T. H. Brann, M.F. Burnyeat, William Galston, Daniel Garber, Peter Gay, Alvin I. Goldman, Moshe Halbertal, Tova Hartman Halbertal, Simon Harrison, Barbara Herman, Genevieve Lloyd, Alasdair MacIntyre, Richard W. Miller, Roy P. Mottahedeh, Adam Phillips, Philip L. Quinn, C.D.C. Reeve, Patrick Riley, Amelie Oksenberg Rorty, Emma Rothschild, Alan Ryan, Richard Schacht, Josef Stern, Richard Tuck, Thomas E. Uebel, Jeremy Waldron, Allen Wood, Paul Woodruff, Jean S. Yolton, John W. Yolton, Zhang LoShan (pseudonym).
The view defended in this paper - I call it the No-Assertion view - rejects the assumption that it is theoretically useful to single out a subset of sayings as assertions: (v) Sayings are governed by variable norms, come with variable commitments and have variable causes and effects. What philosophers have tried to capture by the term 'assertion' is largely a philosophers' invention. It fails to pick out an act-type that we engage in and it is not a category we need in order to explain any significant component of our linguistic practice. Timothy Williamson (2000) defends a theory of type (i). He says that a theory of assertion has as its goal "[…] that of articulating for the first time the rules of a traditional game that we play" (p. 240). Among those who think we play the game of assertion, there's disagreement about what the rules are. Some think it's a single rule and disagree about what that rule is. Others think the rules change across contexts. According to the No-Assertion view we don’t play the assertion game. The game might exist as an abstract object, but it is not a game you need to learn and play to become a speaker of a natural language.
The claim that contemporary analytic philosophers rely extensively on intuitions as evidence is almost universally accepted in current meta-philosophical debates and it figures prominently in our self-understanding as analytic philosophers. No matter what area you happen to work in and what views you happen to hold in those areas, you are likely to think that philosophizing requires constructing cases and making intuitive judgments about those cases. This assumption also underlies the entire experimental philosophy movement: only if philosophers rely on intuitions as evidence are data about non-philosophers' intuitions of any interest to us. Our alleged reliance on the intuitive makes many philosophers who don't work on meta-philosophy concerned about their own discipline: they are unsure what intuitions are and whether they can carry the evidential weight we allegedly assign to them. The goal of this book is to argue that this concern is unwarranted since the claim is false: it is not true that philosophers rely extensively (or even a little bit) on intuitions as evidence. At worst, analytic philosophers are guilty of engaging in somewhat irresponsible use of 'intuition'-vocabulary. While this irresponsibility has had little effect on first order philosophy, it has fundamentally misled meta-philosophers: it has encouraged meta-philosophical pseudo-problems and misleading pictures of what philosophy is.
A semantic theory T for a language L should assign content to utterances of sentences of L. One common assumption is that T will assign p to some S of L just in case in uttering S a speaker A says that p. We will argue that this assumption is mistaken.
One of Weatherson's main goals is to drive home a methodological point: We shouldn't be looking for deductive arguments for or against relativism – we should instead be evaluating inductive arguments designed to show that either relativism or some alternative offers the best explanation of some data. Our focus in Chapter Two on diagnostics for shared content allegedly encourages the search for deductive arguments and so does more harm than good. We have no methodological slogan of our own to offer. Part of what we were trying to do was to clearly articulate what the relevant issues even are. Often relativism is characterized in a way that is offhand and sloppy. The relativist, we are told, accepts 'disquotational truth' for various kinds of claims but denies that they are 'true simpliciter'. What exactly is going on here? Do the relevant distinctions even make sense? Before engaging in various abductive maneuvers we need to get much clearer about what it is that we are trying to argue for and against. That said, we are perfectly happy with the kind of inductive enterprise that Weatherson sketches. For our part, we were fully aware (and indeed explicit) that the 'agreement' diagnostic does not ‘deductively’ settle all of the relevant disputes. A significant part of Chapter Four is dedicated to something in the vicinity of Weatherson's project. Note, indeed, that our diagnostics are even stated using the ideology of 'providing evidence' – hardly the basis for a straightforwardly deductive argument for or against relativism. Finally, though, we should point out that we are not hostile to deductive arguments against relativism. A philosopher's evidence is theory-laden and in part owes itself to epistemic powers that his or her opponents may not acknowledge. In short, their evidence may not always have the hallmarks of 'evidence neutrality' – evidence that their opponents would recognize as such.
We are perfectly open to there being compelling deductive arguments against relativism from such evidence.
This paper evaluates arguments presented by John Perry (and Ken Taylor) in favor of the presence of an unarticulated constituent in the proposition expressed by an utterance of, for example, (1): (1) It's raining (at t). We contend that these arguments are, at best, inconclusive. That's the critical part of our paper. On the positive side, we argue that (1) has as its semantic content the proposition that it is raining (at t) and that this is a location-neutral proposition. According to the view we propose, an audience typically looks for a location when they hear utterances of (1) because their interests in rain are location-focused: it is the location of rain that determines whether we get wet, carrots grow, and roads become slippery. These are, however, contingent facts about rain, wetness, people, carrots, and roads – they are not built into the semantics for the verb 'rain'.
According to Kent Bach (forthcoming), our book, Insensitive Semantics (IS), suffers from its 'implicit endorsement' of (1): (1) Every complete sentence expresses a proposition (this is Propositionalism, a fancy version of the old grammar school dictum that every complete sentence expresses a complete thought) (Bach (ms.)) In response (C&L, forthcoming), we claim to be unaware of endorsing (1). No argument in IS depends on (1), we say. We don't claim to have shown that there couldn't be grammatical sentences the semantic contents of which are not propositional.
Vol. 1. The transcendental deduction before the Critique de la raison pure.--Vol. 2. The transcendental deduction from 1781 to the second edition of the Critique de la raison pure (1787).--Vol. 3. The transcendental deduction from 1787 to the Opus postumum.