Although arguments for and against competing theories of vagueness often appeal to claims about the use of vague predicates by ordinary speakers, such claims are rarely tested. An exception is Bonini et al. (1999), who report empirical results on the use of vague predicates by Italian speakers, and take the results to count in favor of epistemicism. Yet several methodological difficulties mar their experiments; we outline these problems and devise revised experiments that do not show the same results. We then describe three additional empirical studies that investigate further claims in the literature on vagueness: the hypothesis that speakers confuse ‘P’ with ‘definitely P’, the relative persuasiveness of different formulations of the inductive premise of the Sorites, and the interaction of vague predicates with three different forms of negation.
This book, published in 2000, is a clear account of causation based firmly in contemporary science. Dowe discusses, in a systematic way, a positive account of causation: the conserved quantities account of causal processes which he has been developing over the last ten years. The book describes causal processes and interactions in terms of conserved quantities: a causal process is the worldline of an object which possesses a conserved quantity, and a causal interaction involves the exchange of conserved quantities. Further, things that are properly called cause and effect are appropriately connected by a set of causal processes and interactions. The distinction between cause and effect is explained in terms of a version of the fork theory: the direction of a certain kind of ordered pattern of events in the world. This particular version has the virtue that it allows for the possibility of backwards causation, and therefore time travel.
In this article we argue that nanotechnology represents an extraordinary opportunity to build in a robust role for the social sciences in a technology that remains at an early, and hence undetermined, stage of development. We examine policy dynamics in both the United States and United Kingdom aimed at both opening up, and closing down, the role of the social sciences in nanotechnologies. We then set out a prospective agenda for the social sciences and their potential in the future shaping of nanotechnology research and innovation processes. The emergent, undetermined nature of nanotechnologies calls for an open, experimental, and interdisciplinary model of social science research.
Aristotle holds that individual substances are ontologically independent from nonsubstances and universal substances but that non-substances and universal substances are ontologically dependent on substances. There is then an asymmetry between individual substances and other kinds of beings with respect to ontological dependence. Under what could plausibly be called the standard interpretation, the ontological independence ascribed to individual substances and denied of non-substances and universal substances is a capacity for independent existence. There is, however, a tension between this interpretation and the asymmetry between individual substances and the other kinds of entities with respect to ontological independence. I will propose an alternative interpretation: to weaken the relevant notion of ontological independence from a capacity for independent existence to the independent possession of a certain ontological status.
The relation of ontological dependence or grounding, expressed by the terminology of separation and priority in substance, plays a central role in Aristotle’s Categories, Metaphysics, De Anima and elsewhere. The article discusses three current interpretations of this terminology. These are drawn along the lines of, respectively, modal-existential ontological dependence, essential ontological dependence, and grounding or metaphysical explanation. I provide an opinionated introduction to the topic, raising the main interpretative questions, laying out a few of the exegetical and philosophical options that influence one’s reading, and locating questions of Aristotle scholarship within the discussion of ontological dependence and grounding in contemporary metaphysics.
This paper examines Wesley Salmon's "process" theory of causality, arguing in particular that there are four areas of inadequacy. These are that the theory is circular, that it is too vague at a crucial point, that statistical forks do not serve their intended purpose, and that Salmon has not adequately demonstrated that the theory avoids Hume's strictures about "hidden powers". A new theory is suggested, based on "conserved quantities", which fulfills Salmon's broad objectives, and which avoids the problems discussed.
A predicate logic typically has a heterogeneous semantic theory. Subjects and predicates have distinct semantic roles: subjects refer; predicates characterize. A sentence expresses a truth if the object to which the subject refers is correctly characterized by the predicate. Traditional term logic, by contrast, has a homogeneous theory: both subjects and predicates refer; and a sentence is true if the subject and predicate name one and the same thing. In this paper, I will examine evidence for ascribing to Aristotle the view that subjects and predicates refer. If this is correct, then it seems that Aristotle, like the traditional term logician, problematically conflates predication and identity claims. I will argue that we can ascribe to Aristotle the view that both subjects and predicates refer, while holding that he would deny that a sentence is true just in case the subject and predicate name one and the same thing. In particular, I will argue that Aristotle's core semantic notion is not identity but the weaker relation of constitution. For example, the predication ‘All men are mortal’ expresses a true thought, in Aristotle's view, just in case the mereological sum of humans is a part of the mereological sum of mortals.
This paper contributes towards a lay ethics of nanotechnology through an analysis of talk from focus groups designed to examine how laypeople grapple with the meaning of a technology ‘in-the-making’. We describe the content of lay ethical concerns before suggesting that this content can be understood as being structured by five archetypal narratives which underpin talk. These we term: ‘the rich get richer and the poor get poorer’; ‘kept in the dark’; ‘opening Pandora’s box’; ‘messing with nature’; and ‘be careful what you wish for’. We further suggest that these narratives can be understood as sharing an emphasis on the ‘giftedness’ of life, and that together they are used to resist dominant technoscientific and Enlightenment narratives of control and mastery which are encapsulated by nanotechnology.
There is, no doubt, a temptation to treat preventions, such as ‘the father’s grabbing the child prevented the accident’, and cases of ‘causation’ by omission, such as ‘the father’s inattention was the cause of the child’s accident’, as cases of genuine causation. I think they are not, and in this paper I defend a theory of what they are. More specifically, the counterfactual theory defended here is that a claim about prevention or ‘causation’ by omission should be understood not as being directly about actual genuine causation but primarily as a counterfactual claim about genuine causation. The relation between actual causation and the mere possibility of causation allows my theory to explain both the difference and the similarity between the two notions (causation and prevention/omission). Further, the difference explains certain intuitions we have and the similarity justifies and explains the fact that for practical purposes we usually treat preventions and omissions as if they were genuine causation. Finally, the fact that this counterfactual theory of prevention and omission takes causation as primitive suggests that it is consistent with any theory of causation. This allows us to construct two arguments against what I will call genuinism, the view that cases of prevention and ‘causation’ by omission really are cases of genuine causation. In section II I show that genuinism does not account for the so-called intuition of difference. In section III I outline a number of problems that various theories of causation have with preventions and ‘causation’ by omission. These problems are ipso facto problems for genuinism, whereas I show in section V how the counterfactual theory solves those problems. Further, I answer a genuinist argument based on another type of intuition—the genuinist intuition—by showing why we have that intuition and how it should be handled (section VI).
Despite much discussion over the existence of moral facts, metaethicists have largely ignored the related question of their possibility. This paper addresses the issue from the moral error theorist’s perspective, and shows how the arguments that error theorists have produced against the existence of moral facts at this world, if sound, also show that moral facts are impossible, at least at worlds non-morally identical to our own and, on some versions of the error theory, at any world. So error theorists’ arguments warrant a stronger conclusion than has previously been noticed. This may appear to make them vulnerable to counterarguments that take the possibility of moral facts as a premise. However, I show that any such arguments would be question-begging.
This chapter argues that dual-use emerging technologies are distributing unprecedented offensive capabilities to nonstate actors. To counteract this trend, some scholars have proposed that states become a little “less liberal” by implementing large-scale surveillance policies to monitor the actions of citizens. This is problematic, though, because the distribution of offensive capabilities is also undermining states’ capacity to enforce the rule of law. I will suggest that the only plausible escape from this conundrum, at least from our present vantage point, is the creation of a “supersingleton” run by a friendly superintelligence, founded upon a “post-singularity social contract.” In making this argument, the present chapter offers a novel reason for prioritizing the “control problem,” i.e., the problem of ensuring that a greater-than-human-level AI will positively enhance human well-being.
Experimental methods and conceptual confusion : philosophy, science, and what emotions really are -- To 'make our voices resonate' or 'to be silent'? : shame as fundamental ontology -- Emotion, cognition, and world -- Shame and world.
Individual substances are the ground of Aristotle’s ontology. Taking a liberal approach to existence, Aristotle accepts among existents entities in such categories other than substance as quality, quantity and relation; and, within each category, individuals and universals. As I will argue, individual substances are ontologically independent from all these other entities, while all other entities are ontologically dependent on individual substances. The association of substance with independence has a long history and several contemporary metaphysicians have pursued the connection. In this chapter, I will discuss the intersection of these notions of substance and ontological dependence in Aristotle. I will canvass a few contemporary formulations of ontological dependence and discuss some of the interpretative difficulties in ascribing any of these formulations to Aristotle’s characterization of individual substances as ontologically independent. My aim is not to resolve fully these difficulties but to locate the topics of substance and independence relative to certain other controversies in Aristotle studies. However, I will sketch a position. In particular, elsewhere I have speculated that Aristotle is both a primitivist and a pluralist with respect to ontological dependence, and I will develop this line of interpretation a bit further later in the chapter.
Both literalism, the view that mathematical objects simply exist in the empirical world, and fictionalism, the view that mathematical objects do not exist but are rather harmless fictions, have both been ascribed to Aristotle. The ascription of literalism to Aristotle, however, commits Aristotle to the unattractive view that mathematics studies but a small fragment of the physical world; and there is evidence that Aristotle would deny the literalist position that mathematical objects are perceivable. The ascription of fictionalism also faces a difficult challenge: there is evidence that Aristotle would deny the fictionalist position that mathematics is false. I argue that, in Aristotle's view, the fiction of mathematics is not to treat what does not exist as if existing but to treat mathematical objects as having an ontological status they lack. This form of fictionalism is consistent with holding that mathematics is true.
As a first stab, call a property recurrent if it can be possessed by more than one object, and nonrecurrent if it can be possessed by at most one object. The question whether Aristotle holds that there are nonrecurrent properties has spawned a lively and ongoing debate among commentators over the last forty-five years. One source of textual evidence in the Categories, drawn on in this debate, is Aristotle’s claim that certain properties are inseparable from what they are in. Here the point of contention is whether this commits Aristotle to holding that these properties are inseparable from individuals, since it is commonly held that a property is nonrecurrent if it is inseparable from an individual. I argue that this evidence is neutral on the question whether there are nonrecurrent properties in Aristotle. One of my aims here is to disentangle the question of recurrence from local issues of individuality and universality in the Categories. But another aim is to turn from the textual considerations, which have dominated the debate, to broader methodological considerations. It is a shared assumption among all those who look to textual evidence from the Categories, in order to decide whether Aristotle believes there are nonrecurrent properties, that in this work Aristotle is engaged in a project where the question of recurrence is relevant. I argue that Aristotle’s concerns in the Categories are disjoint from the question of recurrence, and so this shared assumption is false.
Philosophers have long been fascinated by the connection between cause and effect: are 'causes' things we can experience, or are they concepts provided by our minds? The study of causation goes back to Aristotle, but resurged with David Hume and Immanuel Kant, and is now one of the most important topics in metaphysics. Most of the recent work done in this area has attempted to place causation in a deterministic, scientific, worldview. But what about the unpredictable and chancy world we actually live in: can one theory of causation cover all instances of cause and effect? _Cause and Chance: Causation in an Indeterministic World_ is a collection of specially written papers by world-class metaphysicians. Its focus is the problem facing the 'reductionist' approach to causation: the attempt to cover all types of causation, deterministic and indeterministic, with one basic theory. Contributors: Stephen Barker, Helen Beebee, Phil Dowe, Dorothy Edgington, Doug Ehring, Chris Hitchcock, Igal Kvart, Paul Noordhof, Murali Ramachandran and Michael Tooley.
In a recent paper (1994) Wesley Salmon has replied to criticisms (e.g., Dowe 1992c, Kitcher 1989) of his (1984) theory of causality, and has offered a revised theory which, he argues, is not open to those criticisms. The key change concerns the characterization of causal processes, where Salmon has traded "the capacity for mark transmission" for "the transmission of an invariant quantity." Salmon argues against the view presented in Dowe (1992c), namely that the concept of "possession of a conserved quantity" is sufficient to account for the difference between causal and pseudo processes. Here that view is defended, and important questions are raised about the notion of transmission and about gerrymandered aggregates.
Changing patterns of political participation observed by political scientists over the past half-century undermine traditional democratic theory and practice. The vast majority of democratic theory, and deliberative democratic theory in particular, either implicitly or explicitly assumes the need for widespread citizen participation. It requires that all citizens possess the opportunity to participate and also that they take up this opportunity. But empirical evidence gathered over the past half-century strongly suggests that many citizens do not have a meaningful opportunity to participate in the ways that many democratic theorists require, and do not participate in anything like the numbers that they believe is necessary. This paper outlines some of the profound changes that have been experienced by liberal democratic states in the 20th and early 21st centuries, changes which are still ongoing, and which have resulted in declines in citizen participation and trust, the marginalisation of citizens from democratic life, and the entrenchment of social and economic inequalities which have damaged democracy. The paper challenges the conventional wisdom in rejecting the idea that the future of democracy lies in encouraging more widespread participation. The paper takes seriously the failure of the strategies adopted by many states to increase participation, especially among the poor, and suggests that instead of requiring more of citizens, we should in fact be requiring less of them. Instead of seeking to encourage more citizen participation, we should acknowledge that citizens will probably not participate in the volume, or in the ways, many democratic theorists would like, and that therefore we need an alternative approach: a regime which can continue to produce democratic outcomes, and which satisfies the requirements of political equality, in the absence of widespread participation by citizens.
Presentists face a challenge from truthmaker theory: if you hold both that the only existing objects are presently existing and that truth supervenes on being, then you will be hard pressed to identify some existent on which a given true but traceless claim about the past supervenes. One reconciliation strategy, advocated by Cameron (2011), is to appeal to distributional properties to serve as presently existing truthmakers for past truths. I argue that a presentist ought to deny that distributional properties can serve as truthmakers.
The greatest existential threats to humanity stem from increasingly powerful advanced technologies. Yet the “risk potential” of such tools can only be realized when coupled with a suitable agent who, through error or terror, could use the tool to bring about an existential catastrophe. While the existential risk literature has provided many accounts of how advanced technologies might be misused and abused to cause unprecedented harm, no scholar has yet explored the other half of the agent-tool coupling, namely the agent. This paper aims to correct this failure by offering a comprehensive overview of what we could call “agential riskology.” Only by studying the unique properties of different agential risk types can one acquire an accurate picture of the existential danger before us.
This article discusses the perpetual debate on the Florentine Niccolo Machiavelli's ethical values and leadership ideas and the consequent creation of the mythical reputation and negative epithet 'Machiavellian'. This article proposes recommendations on how Machiavelli's thought and the study of it can best be applied to bring genuine clarity and value to organisations in these interesting and turbulent times, providing a viable compass for a changing landscape.
Gordon Baker in his last decade published a series of papers (now collected in Baker 2004), which are revolutionary in their proposals for understanding the later Wittgenstein. Taking our lead from the first of those papers, on "perspicuous presentations," we offer new criticisms of 'elucidatory' readers of later Wittgenstein, such as Peter Hacker: we argue that their readings fail to connect with the radically therapeutic intent of the 'perspicuous presentation' concept, as an achievement-term, rather than a kind of 'objective' mapping of a 'conceptual landscape.' Baker's Wittgenstein, far from being a 'language policeman' of the kind that often fails to influence mainstream philosophy, offers an alternative to the latent scientism of Wittgenstein's influential 'elucidatory' readers.
This research applies the impression management theory of exemplification in an accounting study by identifying and measuring differences in both auditor and public perceptions of exemplary behaviors. The auditors were divided into two groups, one of which reported self-perceptions (A-S) while the other group reported their perceptions of a typical auditor (A-O). There were two separate public groups, which gave their perceptions of a typical auditor and were divided based on their levels of accounting sophistication. The more sophisticated public group was comprised of bank loan officers (LO) while the less sophisticated public group consisted of investment club members (IC). Comparisons were made on 30 behaviors contained in the AICPA Code of Professional Conduct, which served as the basis for the research instrument. Profile analysis, a special form of MANOVA technique, was used to analyze the results. A-S perceptions were the highest of the four treatment levels and were significantly higher (i.e., more exemplary) than the perceptions of both the A-O and LO groups. The more sophisticated user group (LO) provided the lowest perceptions of the four treatment levels. For at least four of the six measures, the LO treatment group perceived the typical auditor to be less exemplary than both the IC and A-O treatments. There were no differences in perceptions between the A-O group and IC. Additional analysis revealed that auditors overrated the degree to which the public relied on financial statements. However, both public groups reported a reasonably high level of reliance on financial statements when making decisions.
Jeffrey Stout claims that John Rawls's idea of public reason (IPR) has contributed to a Christian backlash against liberalism. This essay argues that those whom Stout calls “antiliberal traditionalists” have misunderstood Rawls in important ways, and goes on to consider Stout's own critiques of the IPR. While Rawls's idea is often interpreted as a blanket prohibition on religious reasoning outside church and home, the essay will show that the very viability of the IPR depends upon a rich culture of deliberation in which all forms of reasoning can be put forth for consideration. This clarification addresses the perception that the IPR imposes an “asymmetrical burden” upon believers. In fact, the essay suggests that there are good reasons why believers, qua believers, might endorse the IPR.
'It is a safe bet that Key Thinkers will emerge as something of a "hit" within the undergraduate community and will rise to prominence as a "must buy"' - Environment and Planning 'Key Thinkers on Space and Place is an engagingly written, well-researched and very accessible book. It will surely prove an invaluable tool for students, whom I would strongly encourage to purchase this edited collection as one of the best guides to recent geographical thought' - Claudio Minca, University of Newcastle 'Key Thinkers is the best encyclopedic tool for human geographers since the Dictionary of Human Geography. It takes into its orbit discussions of the lives and work of the last three decades' major thinkers on space and place. It is hugely useful for students who want an easy way to access the roots of some major themes and debates in contemporary geography. It is organized so that each chapter details the scholar's biography, their contribution to spatial and place-based theory and the controversies that arise through their work' - Stuart Aitken, San Diego State University Key Thinkers on Space and Place is a comprehensive guide to the latest work on space.
Each entry is a short interpretative essay of 2,500 words, outlining the contributions made by the key theorists, and comprises: · a concise biography, indicating disciplinary background, career trajectory and collaboration with others · an outline of the key theoretical, conceptual and methodological ideas each has introduced to human geography · an explanation of the reaction to, and uptake of, these ideas, and how they have changed and evolved over time · an explanation of how these theories have been used and critiqued by human geographers · a selective bibliography of each thinker's key publications (and key secondary publications) The text is introduced by a contextual essay which outlines in general terms the shifting ways in which space and place have been theorised and which explains how Key Thinkers on Space and Place can be used. A glossary that defines key traditions, with cross-links to key theorists and a timeline of key article/book publication dates is also included.
“Scaling-up” is the next hurdle facing the local food movement. In order to effect broader systemic impacts, local food systems (LFS) will have to grow, and engage either more or larger consumers and producers. Encouraging the involvement of mid-sized farms looks to be an elegant solution, by broadening the accessibility of local food while providing alternative revenue streams for troubled family farms. Logistical, structural and regulatory barriers to increased scale in LFS are well known. Less is understood about the way in which scale developments affect the perception and legitimacy of LFS. This value-added opportunity begs the question: Is the value that adheres to local food scalable? Many familiar with local food discourse might suggest that important pieces of added value within LFS are generated by the reconnection of producer and consumer, the direct exchange through which this occurs, and the shared goals and values that provide the basis for reconnection. However, these assertions are based on tenuous assumptions about how interactions within the direct exchange produce value, and how LFS are governed. Examination shows that existing assumptions do not properly acknowledge the hybridity, diversity, and flexibility inherent in LFS. A clear analysis of the potential of scale in LFS will depend on understanding both how value is determined within LFS, and the processes through which these systems are governed. Such an analysis shows that, while scaled-up LFS will be challenged to maintain legitimacy and an identity as “alternative”, the establishment of an open governance process—based on a “negotiation of accommodations”—is likely to enhance their viability.
Although diagnosis is integral to the theory and practice of psychiatry, social scientists have not developed a comprehensive approach to diagnosis. This paper presents a preliminary outline of the issues which a sociology of diagnosis should integrate. These include bias and social control in psychiatric diagnosis, diagnosis as part of a new extension of the biopsychiatric medical model, and flaws in contemporary diagnostic categorization. These issues are then viewed in terms of professional practice styles, diagnostic biases, psychiatry's professional dominance over the mental health field, and psychiatric hegemony over the clinical interaction with patients.
Goffman makes considerable use of the metaphor of social life as theater. This metaphor has a significant impact on his thought in three areas: 1) it is central to his changing views about cynicism and trust in everyday life; 2) metaphor in general is a method of sociological inquiry; and 3) metaphor suggests a "limit" that his later work attempts to transcend.
In Unfit for the Future, Ingmar Persson and Julian Savulescu argue that our collective existential predicament is unprecedentedly dangerous due to climate change and terrorism. Given these global risks to human prosperity and survival, Persson and Savulescu argue that we should explore the radical possibility of moral bioenhancement in addition to cognitive enhancement. In this article, I argue that moral bioenhancements could nontrivially exacerbate the threat posed by certain kinds of malicious agents, while reducing the threat of other kinds. This introduces a previously undiscussed complication to Persson and Savulescu's proposal. In the final section, I present a novel argument for why moral bioenhancement should either be compulsory or not be made available to the public at all.
The idea of time travel has long given philosophers difficulties. Most recently, in his paper ‘Troubles with Time Travel’ William Grey presents a number of objections to time travel, some well known in the philosophical literature, others quite novel. In particular Grey's ‘no destinations’ and ‘double occupation’ objections I take to be original, while what I will call the ‘times paradox’ and the ‘possibility restriction argument’ are versions of well known objections. I show how each of these can be answered, thereby defending the plausibility of time travel.