A relation between two secrets, known in the literature as nondeducibility, was originally introduced by Sutherland. We extend it to a relation between sets of secrets that we call independence. This paper proposes a formal logical system for the independence relation, proves the completeness of the system with respect to a semantics of secrets, and shows that all axioms of the system are logically independent.
Truth in the Making represents a sophisticated effort to map the complex relations between human knowledge and creative power, as reflected across more than half a millennium of philosophical enquiry. Showing the intimacy of this problematic to the work of Nicholas of Cusa, Bacon, Galileo, Descartes, Hobbes, Leibniz, Vico and David Lachterman, the book reveals how questions about creation apparently diluted by secularism in fact retain much of their potency today. If science could counterfeit or synthesize nature precisely from its smallest nuts and bolts, as Descartes and Hobbes implied and as modern science increasingly suggests, would it create a world identical to the one we live in now? Robert C. Miner offers a precise genealogy of modern thought on truth and creation: from medieval theology's identification of human creativity with divine initiative to the radical Leibnizian contention that human ideas are 'not little copies of God's', and may at once exceed mimesis and produce things new, unpredictable and unseen. He considers how the theological importance given to creation interacts historically with the secularisation and instrumentalisation of modes of discovery and method, and asks how knowledge is understood between different disciplines, from the allegorical discipline of poetry to the constructible field of mathematics. The book is an eloquent reminder of the ways in which theology continues to fling a wild card at philosophical understandings of reality, countering theories of metaphysical equivalence of the 'real' and 'artificial' with theologies in which human making is always fallible, and strives only for approximate participation in divine truth.
As a strenuous and informative breakdown of leading theories of knowledge, Truth in the Making shows the continuing influence of theological questions upon philosophical, scientific and aesthetic disciplines, whilst raising topical questions about the ultimate nature of our reality and our freedom to modify and define it.
This paper discusses the materialist views of Margaret Cavendish, focusing on the relationships between her views and those of two of her contemporaries, Thomas Hobbes and Henry More. It argues for two main claims. First, Cavendish's views sit, often rather neatly, between those of Hobbes and More. She agreed with Hobbes on some issues and More on others, while carving out a distinctive alternative view. Secondly, the exchange between Hobbes, More, and Cavendish illustrates a more general puzzle about just what divided materialists from their opponents. Seemingly straightforward disagreements about whether incorporeal substances exist turn out to be more complex ones in which the nature of those things is disputed at the same time as their existence.
According to intellectualism, what a person knows is solely a function of the evidential features of the person's situation. Anti-intellectualism is the view that what a person knows is more than simply a function of the evidential features of the person's situation. Jason Stanley (2005) argues that, in addition to “traditional factors,” our ordinary practice of knowledge ascription is sensitive to the practical facts of a subject's situation. In this paper, we investigate this question empirically. Our results indicate that Stanley's assumptions about knowledge ascriptions do not reflect our ordinary practices in some paradigmatic cases. If our data generalize, then arguments for anti-intellectualism that rely on ordinary knowledge ascriptions fail: the case for anti-intellectualism cannot depend on our ordinary practices of knowledge ascription.
This paper is an engagement with Equality by John Baker, Kathleen Lynch, Judy Walsh and Sara Cantillon. It identifies a dilemma for educational egalitarians, which arises within their theory of equality, arguing that sometimes there may be a conflict between advancing equality of opportunity and providing equality of respect and recognition, and equality of love, care and solidarity. It argues that the latter values may have more weight in deciding what to do than traditional educational egalitarians have usually thought.
Kant’s example of lying to the murderer at the door has been a cherished source of scorn for thinkers with little sympathy for Kant’s philosophy and a source of deep puzzlement for those more favorably inclined. The problem is that Kant seems to say that it’s always wrong to lie – even if necessary to prevent a murderer from reaching his victim – and that if one does lie, one becomes partially responsible for the killing of the victim. If this is correct, then Kant’s account seems not only to require us to respect the murderer more than the victim, but also that we somehow can become responsible for the consequences that ultimately result from someone else’s wrongdoing. After World War II our spontaneous negative reaction to this apparently absurd line of argument is brought out even more starkly by making the murderer at the door a Nazi officer looking for Jews hidden in people’s homes. This paper argues that Kant’s discussion of lying to the murderer at the door has been seriously misinterpreted. The suggested root of the problem is that the Doctrine of Right has been given insufficient attention in Kant interpretation. It is in this work we find many of the arguments needed to understand Kant’s analysis of lying to the murderer in “On a Supposed Right to Lie from Philanthropy”. When we interpret this essay in light of Kant’s discussion in the Doctrine of Right, we can make sense of why lying to the murderer isn’t to wrong the murderer, why we nevertheless become responsible for the consequences of the lie, and why choosing to lie is to do wrong ‘in the highest degree’. Finally, the Doctrine of Right account of rightful relations makes it possible for us to analyze the example when we make the murderer at the door a Nazi officer.
Postmodernists claim that there is no truth. However, the statement 'there is no truth' is self-contradictory. This essay shows the following: one cannot state the idea 'there is no truth' universally without creating a paradox. In contrast, the statement 'there is truth' produces no such paradox. It is therefore more coherent to hold that truth exists.
P. J. E. Kail's Projection and Realism in Hume's Philosophy is an excellent book, consisting—like Hume's Treatise itself—of three excellent parts. I will comment on one central aspect of its second part: its explanation of the source of the second thoughts that Hume famously expressed, with a frustrating lack of specificity, about his own initial discussion of personal identity in the Treatise. As is well known, Hume holds in the section "Of personal identity" (T 1.4.6) that a self, mind, or person is "nothing but a bundle or collection of different perceptions" (T 1.4.6.4; SBN 252) and, more specifically, a "system of different perceptions or different existences link'd together by the relation of cause and ...
Reprinted in God and the Problem of Evil (Blackwell, 2001), ed. William Rowe. Many people deny that evil makes belief in atheism more reasonable for us than belief in theism. After all, they say, the grounds for belief in God are much better than the evidence for atheism, including the evidence provided by evil. We will not join their ranks on this occasion. Rather, we wish to consider the proposition that, setting aside grounds for belief in God and relying only on the background knowledge shared in common by nontheists and theists, evil makes belief in atheism more reasonable for us than belief in theism. Our aim is to argue against this proposition. We recognize that in doing so, we face a formidable challenge. It’s one thing to say that evil presents a reason for atheism that is, ultimately, overridden by arguments for theism. It’s another to say that it doesn’t so much as provide us with a reason for atheism in the first place. In order to make this latter claim seem initially more plausible, consider the apparent design of the mammalian eye or the apparent fine-tuning of the universe to support life. These are often proposed as reasons to believe in theism. Critics commonly argue not merely that these supposed reasons for theism are overridden by arguments for atheism but rather that they aren’t good reasons for theism in the first place. Our parallel proposal with respect to evil and atheism is, initially at least, no less plausible than this proposal with respect to apparent design and theism.
This paper discusses an important puzzle about the semantics of indicative conditionals and deontic necessity modals (should, ought, etc.): the Miner Puzzle (Parfit, ms; Kolodny and MacFarlane, J Philos 107:115–143, 2010). Rejecting modus ponens for the indicative conditional, as others have proposed, seems to solve a version of the puzzle, but is actually orthogonal to the puzzle itself. In fact, I prove that the puzzle arises for a variety of sophisticated analyses of the truth-conditions of indicative conditionals. A comprehensive solution requires rethinking the relationship between relevant information (what we know) and practical rankings of possibilities and actions (what to do). I argue that (i) relevant information determines whether considerations of value may be treated as reasons for actions that realize them and against actions that don’t, (ii) incorporating this normative fact requires a revision of the standard ordering semantics for weak (but not for strong) deontic necessity modals, and (iii) an off-the-shelf semantics for weak deontic necessity modals, due to von Fintel and Iatridou, which distinguishes “basic” and “higher-order” ordering sources, and interprets weak deontic necessity modals relative to both, is well-suited to this task. The prominence of normative considerations in our proposal suggests a more general methodological lesson: formal semantic analysis of natural language modals expressing normative concepts demands that close attention be paid to the nature of the underlying normative phenomena.
In response to the difficulty of teaching an increasingly large number of students who are ill prepared for the sort of abstract thinking and well-structured essay writing that are essential to the field of Philosophy, I have discovered a five-step method for teaching students in my Philosophy and Social Ethics course how to examine any ethical issue and write well-structured essays discussing the issue. Just as important, students are now required to take more responsibility for the learning process which, I believe, is an appropriate goal for a course in Ethics.
Cappelen and Hawthorne’s Relativism and Monadic Truth (2009) offers an extended defense of a thesis they call simplicity, which, in brief, holds that propositions are true or false simpliciter. Propositions are cast in their traditional roles as the contents of assertions, and as the semantic values of declarative sentences in contexts. Simplicity stands in sharp contrast to forms of relativism including, for instance, a form that holds that our claims are true or false only relative to a judge. This applies especially to claims of taste, which come out true or false only relative to the judge who finds things tasty (e.g. Glanzberg 2007, Lasersohn 2005). But simplicity also rejects the more widespread temporalist view that propositions are true or false only relative to a time, and it rejects the even more widely held view that propositions are true or false only relative to a world. One reason that has been advanced for temporalism, e.g. by Kaplan (1989), is that our languages seem to contain non-trivial temporal operators. Hence, the argument goes, the semantic values of sentences need to be temporally neutral, i.e. vary for truth or falsehood with time. The same goes for possible worlds and modal operators. Hence, Kaplan and many others think of the semantic values of sentences as sets of world-time pairs. It has been tempting to apply this sort of argument much more widely, to see the semantic values of sentences as varying not just with world and time, but perhaps with location and other parameters as well. Kaplan...
In The Right and the Good, W. D. Ross commits himself to the view that, in addition to being distinct and defeasible, some prima facie duties are more binding than others. David McNaughton has argued that there appears to be no way of making sense of this claim that is both coherent and consistent with Ross's overall picture. I offer an alternative way of understanding Ross's remarks about the comparative stringency of prima facie duties, which, in addition to being compatible with his view as presented in the text, provides us with a coherent, and indeed plausible, account of what it means for some duties to be more binding than others.
T. M. Scanlon has cited the value of friendship in arguing against a ‘teleological’ view of value which says that value inheres only in states of affairs and demands only that we promote it. This article argues that, whatever the teleological view's final merits, the case against it cannot be made on the basis of friendship. The view can capture Scanlon's claims about friendship if it holds, as it can consistently with its basic ideas, that (i) friendship is a higher-level good consisting in appropriate attitudes to other goods and evils in a friend's life, (ii) these goods and evils have agent-relative value, i.e. more value than similar states of strangers, and (iii) the attitudes constituting friendship have less value than their objects. Given these independently plausible claims, the teleological view can agree with Scanlon that, e.g., it is wrong to betray a friend in order to promote more friendships among other people. (Published Online August 21 2006).
I have argued that to say qualia are epiphenomenal is to say a world without qualia would be physically identical to a world with qualia. Dan Cavedon-Taylor has offered an alternative interpretation of the commitments of qualia epiphenomenalism according to which qualia cause beliefs and those beliefs can and do cause changes to the physical world. I argue that neither of these options works for the qualia epiphenomenalist and thus that theory faces far more serious difficulties than has previously been recognized.
Recently, a number of philosophers have advanced a surprising conclusion: people's judgments about whether an agent brought about an outcome intentionally are pervasively influenced by normative considerations. In this paper, we investigate the ‘Chairman case’, an influential case from this literature, and disagree with this conclusion. Using a statistical method called structural path modeling, we show that people's attributions of intentional action to an agent are driven not by normative assessments, but rather by attributions of underlying values and characterological dispositions to the agent. In a second study, we examined people's judgments about what they think drives asymmetric intuitions in the Chairman case and found that people are highly inaccurate in identifying which features of the case their intuitions track. In the final part of the paper, we discuss how the statistical methods used in this study can help philosophers with the critical features problem, the problem of figuring out which among the myriad features present in hypothetical cases are the critical ones that our intuitions are responsive to. We show how the methods used in this study have some advantages over both armchair methods used by traditional philosophers and survey methods used by experimental philosophers.
In assessing the veridicality of utterances, we normally seem to assess the satisfaction of conditions that the speaker had been concerned to get right in making the utterance. However, the debate about assessor-relativism about epistemic modals, predicates of taste, gradable adjectives and conditionals has been largely driven by cases in which seemingly felicitous assessments of utterances are insensitive to aspects of the context of utterance that were highly relevant to the speaker’s choice of words. In this paper, we offer an explanation of why certain locutions invite insensitive assessments, focusing primarily on ‘tasty’ and ‘might’. We spell out some reasons why felicitous insensitive assessments are puzzling and argue briefly that recent attempts to accommodate such assessments (including attempts by John MacFarlane, Kai von Fintel and Anthony Gillies) all fail to provide more than hints at a solution to the puzzle. In the main part of the paper, we develop an account of felicitous insensitive assessments by identifying a number of pragmatic factors that influence the felicity of assessments. Before closing, we argue that the role of these factors extends beyond cases considered in the debate about assessor-relativism and fits comfortably with standard contextualist analyses of the relevant locutions.
This paper traces a rather peculiar debate between William Ockham, Walter Chatton, and Robert Holcot over whether it is possible for God to know more than he knows. Although the debate specifically addresses a theological question about divine knowledge, the central issue at stake in it is a purely philosophical question about the nature and ontological status of propositions. The theories of propositions that emerge from the discussion appear deeply puzzling, however. My aim in this paper is to show that there is a way of making sense of these views (and, by implication, of much of what is puzzling about medieval theories of propositions). The key, I argue, lies in getting clear about the precise theoretical roles these thinkers assign to propositions in their accounts of propositional attitudes.
This article further explains and develops a recent, comprehensive semantic naturalization theory, namely the interactive indexing (II) theory as described in my 2008 Minds and Machines article Semantic Naturalization via Interactive Perceptual Causality (Vol. 18, pp. 527–546). Folk views postulate a concrete intentional relation between cognitive states and the worldly states they are about. The II theory eliminates any such concrete intentionality, replacing it with purely causal relations based on the interactive theory of perception. But intentionality is preserved via purely abstract propositions about the world that index, or correlate with, appropriate cognitive states. Further reasons as to why intentionality must be abstract are provided, along with more details of an II-style account of representation, language use and propositional attitudes. All cognitive representation is explained in terms of classification or sorting dispositions indexed by appropriate propositions. The theory is also related to Fodor’s representational theory of mind, with some surprisingly close parallels being found in spite of the purely dispositional basis of the II theory. In particular, Fodor’s insistence that thinking about an item cannot be reduced to sorting dispositions is supported via a novel two-level account of cognition—upper level propositional attitudes involve significant intermediate processing of a broadly normative epistemic kind prior to the formation of sorting dispositions. To conclude, the weak intentional realism of the II theory—which makes intentional descriptions of the world dispensable—is related to Dennett’s ‘intentional stance’ view, and distinguished from strong (indispensable) intentional realist views. II-style dispositions are also defended.
Ned Markosian argues (Australasian Journal of Philosophy 76:213-228, 1998a; Australasian Journal of Philosophy 82:332-340, 2004a; The Monist 87:405-428, 2004b) that simples are ‘maximally continuous’ entities. This leads him to conclude that there could be non-particular ‘stuff’ in addition to things. I first show how an ensuing debate on this issue (McDaniel, Australasian Journal of Philosophy 81(2):265-275, 2003; Markosian, Australasian Journal of Philosophy 82:332-340, 2004a) ended in deadlock. I attempt to break the deadlock. Markosian’s view entails stuff-thing coincidence, which I show is just as problematic as the more oft-discussed thing-thing coincidence. Also, the view entails that every particular is only contingently so. If there is a world W like our own, but with ether, then there would be only one object in W. But, since merely adding ether to a world does not destroy the entities in it, W contains counterparts of all the entities in the actual world—they just are not things. Hence, if simples are maximally continuous, then every actual particular is only contingently so. This in turn entails the following disjunction: (i) identity is contingent or intransitive, or (ii) there are no things at all in the actual world, or (iii) the distinction between stuff and things is one without a difference. I recommend that we reject this stuff-thing dualism.
For many of the authors in this volume, this is the second attempt to explore what McCarthy and Hayes (1969) first called the “Frame Problem”. Since the first compendium (Pylyshyn, 1987), nicely summarized here by Ronald Loui, there have been several conferences and books on the topic. Their goals range from providing a clarification of the problem by breaking it down into subproblems (and sometimes declaring the hard subproblems to not be the real Frame Problem), to providing formal “solutions” to certain aspects of the problem. But more often the message has been that the problem is not solvable except in a piecemeal way in special circumstances by some sort of heuristic approximations. It has sometimes also been said that solving the Frame Problem is not only an unachievable goal, but it is also an unnecessary one, since humans do not solve it either; we simply get along as best we can and deal with the problem of planning in ways that, to use Dennett’s phrase, are “good enough for government work”.
This paper examines the libertarian account of free choice advanced by Robert Kane in his recent book, The Significance of Free Will. First a rather simple libertarian view is considered, and an objection is raised against it: the view fails to provide for any greater degree of agent-control than what could be available in a deterministic world. The basic differences between this simple view and Kane's account are the requirements, on the latter, of efforts of will and of an agent's wanting more to do a certain thing than he wants to do anything else. It is argued here that neither of these features yields any improvement over the simple libertarian view; neither helps to meet the objection that was raised against the simple view. Finally, it is suggested that a modest defense of that view might be available.
Telerobotically operated and semiautonomous machines have become a major component in the arsenals of industrial nations around the world. By the year 2015 the United States military plans to have one-third of their combat aircraft and ground vehicles robotically controlled. Although there are many reasons for the use of robots on the battlefield, perhaps one of the most interesting assertions is that these machines, if properly designed and used, will result in a more just and ethical implementation of warfare. This paper will focus on these claims by looking at what has been discovered about the capability of humans to behave ethically on the battlefield, and then comparing those findings with the claims made by robotics researchers that their machines are able to behave more ethically on the battlefield than human soldiers. Throughout the paper we will explore the philosophical critique of this claim and also look at how the robots of today are impacting our ability to fight wars in a just manner.
In this paper I highlight certain logical and metaphysical issues which arise in the characterisation of functionalism, in particular its ready coherence with a physicalist ontology, its structuralism and the impredicativity of functionalist specifications. I then utilise these points in an attempt to demonstrate fatal flaws in the functionalist programme. I argue that the brand of functionalism inspired by David Lewis fails to accommodate multiple realisability, though such accommodation was vaunted as a key improvement over the identity theory. More standard accounts of functionalism allow, by contrast, for far too much multiple realisability. Specifically, functionalist structures will be massively reduplicated in the human brain; so functionalism yields the absurd consequence that each human harbours large numbers of minds and exemplifies virtually all mental states.
That individualism leads to more consumerism seems to be a bit of a truism in the media nowadays. The USA is particularly indicted for being too individualistic and consumerist. Past research has mostly indicated a positive relationship between the two, and has not suggested a negative association between individualism and consumerism. This paper offers support for such a negative relationship by showing that an individual’s ethical values can temper the consumerist nature of individualists. Data were collected in the USA and Taiwan. Structural equation models demonstrate that our hypothesized model fits our data well. A key result over the global sample is the stability of the linear path from individualism to work ethic to consumer ethic to consumerism. The two-nation comparison also reveals differences between Taiwanese and Americans in their belief that consumption benefits society.
The creation-evolution “controversy” has been with us for more than a century. Here I argue that merely teaching more science will probably not improve the situation; we need to understand the controversy as part of a broader problem with public acceptance of pseudoscience, and respond by teaching how science works as a method. Critical thinking is difficult to teach, but educators can rely on increasing evidence from neurobiology about how the brain learns, or fails to.
Among the most well-known accounts of events is Jaegwon Kim’s exemplification theory, which identifies each event with a property exemplification (often modeled as an “ordered triple” of an entity, property type, and time). Two of the most influential rival event theorists (Lawrence Lombard and Jonathan Bennett) have urged rejecting exemplificationism on the basis of the charge that it ultimately conflates events with facts [Lombard (1986): Events: A Metaphysical Study. Routledge & Kegan Paul; Bennett (1988): Events and their Names. Hackett Publishing Company]. In response, I offer a detailed examination of Lombard and Bennett’s arguments that exemplificationism undermines the event/fact distinction. I then develop and defend a modified version of Kim’s account that overcomes this objection, and so constitutes a more plausible exemplification theory of events.
No one has explored the implications of cognitive theories and findings about religion for understanding its history with any more enthusiasm or insight than Luther Martin. Although my focus here is not historical, I assume that I will be employing cognitive tools in ways that he finds congenial. In the paper’s first section, I will make some general comments about standard comparisons of science and religion and criticize one strategy for making peace between them. In the second section of the paper, I will delineate two cognitive criteria for comparing science, religion, theology, and commonsense explanations. Finally, in the third section, I will suggest that such a comparison supplies grounds for thinking that our longstanding interest in the comparison of science and religion is, oddly, somewhat misbegotten from a cognitive perspective.
This article addresses the question of whether God's existence would be obvious to everyone if God performed more miracles. I conclude that it would not be so. I look at cases where people have been confronted with what they believe to be miracles and have either not come to believe in God, or have come to intellectual belief in God but declined to follow him. God's existence could be made undeniable not by spectacular signs, but only by God impressing his existence upon us in a direct, non-propositional way.
Millican (Mind 113(451):437–476, 2004) claims to have detected ‘the one fatal flaw in Anselm’s ontological argument.’ I argue that there is more than one important flaw in the position defended in Millican (Mind 113(451):437–476, 2004). First, Millican’s reconstruction of Anselm’s argument does serious violence to the original text. Second, Millican’s generalised objection fails to diagnose any flaw in a vast range of ontological arguments. Third, there are independent reasons for thinking that Millican’s generalised objection is unpersuasive.
The so-called evolution wars (Futuyma 1995; Pigliucci 2002) between the scientific understanding of the history of life on earth and various religiously inspired forms of creationism are more than ever at the forefront of the broader “science wars,” themselves a part of the even more encompassing “cultural wars.” With all these conflicts going on, and at a time when a potentially historical case on the teaching of Intelligent Design (ID) in public schools is being debated in Pennsylvania, it may be useful to consider a number of books that have come out recently to help scientists and the public at large to understand what all the fuss is about.
In this paper I try to establish a relation between some fundamental concepts of Gadamerian philosophy—namely, the concepts of play, of transmutation into form, and of increase in being—and the concept of truth. The concept of play allows one to conceive the extra-methodical character of truth as an objectivity radically different from that of science: the objectivity of what happens and is thus unrepeatable, absolutely independent of any methodical mastery; the concept of transmutation into form is a theorization of the effectual character of truth; the concept of increase in being shows its nonredundant character, i.e., the idea that truth is more than reality. Truth is eventually conceived as a “transformational concept,” in which ontology, knowledge, and ethics are indissolubly interconnected.
The derivation of the generally held Principle of Alternative Possibilities (PAP), roughly that you are morally responsible only if you could do otherwise, from an even more generally held moral principle, K (for Kant), that roughly speaking ought implies can, has recently been the focus of significant debate. In this paper I shall argue that by focusing on PAP interpreted in terms of commissions alone, an alternative derivation of PAP interpreted in terms of omissions is being overlooked. The advantage of the new derivation is that it avoids many of the criticisms directed at the original derivation. Key words: alternative possibilities, blameworthiness, moral responsibility, omissions.
The debate over free will has pitted libertarian insistence on open alternatives against the compatibilist view that authentic commitments can preserve free will in a determined world. A second schism in the free will debate sets rationalist belief in the centrality of reason against nonrationalists who regard reason as inessential or even an impediment to free will. By looking deeper into what motivates each of these perspectives it is possible to find common ground that accommodates insights from all those competing views. The resulting metacompatibilist view of free will bridges some of the differences between compatibilists and incompatibilists as well as between rationalists and nonrationalists, and results in a free will theory that is both more philosophically inclusive and more firmly connected to contemporary research in psychology and biology.
At the crux of Descartes's general metaphysics and epistemology are his accounts of substances, attributes and ideas of substances and attributes. In spite of the centrality of these theories, there is wide disagreement among scholars about how to interpret them. I approach these debates by focusing on Descartes's theory of the infinite substance, God. I argue that God's attributes are neither individual, inseparable properties that inhere in God (contra Kenny, Wilson, Curley, Hoffman) nor deductions from God (contra Lennon), but attributions that can consistently be made to God. On this account, the diversity of God's attributes is due to how meditators refer to the various cognitive routes they take to clear and distinct perceptions of God; what makes a meditator's clear and distinct perception of God more distinct is that it becomes more stable: the meditator can more easily retain and regain the perception. Other virtues of this interpretation include accounts of the following: the puzzling remarks about essences that Descartes makes to Gassendi; what founds conceptual distinctions in reality; and why the Cartesian meditator 'proves' the existence of God several times in the Meditations.
The first main topic of this paper is a weak second-order theory that sits between first-order Peano Arithmetic PA1 and axiomatized second-order Peano Arithmetic PA2 – namely, that much-investigated theory known in the trade as ACA0. What I’m going to argue is that ACA0, in its standard form, lacks a cogent conceptual motivation. Now, that claim – when the wraps are off – will turn out to be rather less exciting than it sounds. It isn’t that all the work that has been done on ACA0 has been hopelessly misplaced: that would be a quite absurd suggestion. The mistake, if that’s what it is, has been a relatively small one. Still, we really ought to try to put things into conceptual good order here. That’s part of what philosophers are for. Here’s the structure of my main claim. On the one hand, interesting work on ACA0 actually only uses part of the strength of the theory: or as we might put it, the interesting work is actually carried on in a cut-down theory I’ll call ACA!. This theory, I’ll be claiming, does have a good conceptual motivation – it is in fact the theory that the putative conceptual grounding for ACA0 actually underpins. On the other hand, I’ll be arguing that original-strength ACA0 inductively inflates. I mean, to put it more carefully, that anyone who accepts ACA0 as a cogent theory can have no reason not to accept a certain significantly stronger theory, with a stronger induction principle. This stronger theory is standardly known as plain ACA. So, my claim comes to this: you can either go for the cut-down theory ACA!; or you can go for the much richer theory ACA. What you can’t do – I mean, what you can’t have a stable conceptual motivation for doing – is rest content with the intermediate strength ACA0 in its standard presentation. Yet in much of the literature, in particular in Simpson’s encyclopedic book Subsystems of Second-Order Arithmetic (1991), neither ACA!
nor full ACA gets so much as a mention, and the conceptually unstable theory ACA0 gets all the glory. Why is my claim at all interesting? For at least two reasons.
What follows is an exercise in hunter-gatherer ontology. More precisely, the region of space and of spatial objects will be adopted as a happy hunting ground for the purposes of Meinongian metaphysics. Meinong, notoriously, struggled against the prejudice in favour of the actual and fought on behalf of the ontological rights of incomplete, impossible, and indeterminate objects. A parallel struggle, as we shall see, can be waged in the domain of spatial objects. Meinong's ideas can in this way be seen to have relevance for studies of the philosophical foundations of the theories of land-surveying and of international law.
The empirical findings in Collins and Porras's study of visionary companies, Built to Last, and the normative claims about the purpose of the business firm in Centesimus Annus are found to be complementary in understanding the purpose of the business firm. A summary of the methodology and findings of Built to Last and a short overview of Catholic Social Teaching are provided. It is shown that Centesimus Annus's claim that the purpose of the firm is broader than just profit is consistent with Collins and Porras's empirical finding that firms which set a broader objective tend to be more successful than those which pursue only the maximization of profits. It is noted, however, that a related finding in Collins and Porras, namely that the content of the firm's objective is not as important as internalizing some objective beyond just profit maximization, can lead to ethical myopia. Two examples are provided of this: the Walt Disney Company and Philip Morris. Centesimus Annus offers a way to expose such myopia, by providing guidance as to what the purpose of the firm is, and therefore as to what kinds of objectives are appropriate to the firm.
Population axiology concerns how to evaluate populations in regard to their goodness, that is, how to order populations by the relations “is better than” and “is as good as”. This field has been riddled with impossibility results which seem to show that our considered beliefs are inconsistent in cases where the number of people and their welfare varies.1 All of these results have one thing in common, however. They all involve an adequacy condition that rules out Derek Parfit’s Repugnant Conclusion:

The Repugnant Conclusion: For any perfectly equal population with very high positive welfare, there is a population with very low positive welfare which is better, other things being equal.2

1. The informal Mere Addition Paradox in Parfit (1984), pp. 419ff is the locus classicus. For an informal proof of a similar result with stronger assumptions, see Ng (1989), p. 240. A formal proof with slightly stronger assumptions than Ng’s can be found in Blackorby and Donaldson (1991). For theorems with much weaker assumptions, see my (1999), (2000b), and especially (2000a), (2001), and (2009).
2. See Parfit (1984), p. 388. My formulation is more general than Parfit’s, except that he doesn’t demand that the people with very high welfare are equally well off. Expressions such as “a population with very high positive welfare”, “a population with very low positive welfare”, etc., are elliptical for the more cumbersome phrases “a population consisting only of lives with..
Following Wallace’s suggestion, Darwin framed his theory using Spencer’s expression “survival of the fittest”. Since then, fitness has occupied a significant place in the conventional understanding of Darwinism, even though the explicit meaning of the term ‘fitness’ is rarely stated. In this paper I examine some of the different roles that fitness has played in the development of the theory. Whereas the meaning of fitness was originally understood in ecological terms, it took a statistical turn in terms of reproductive success throughout the 20th century. This has led to the ever-increasing importance of sexually reproducing organisms, and the populations they compose, in evolutionary explanations. I will argue that, moving forward, evolutionary theory should look back at its ecological roots in order to be more inclusive in the type of systems it examines. Many biological systems (e.g. clonal species, colonial species, multi-species communities) can only be satisfactorily accounted for by offering a non-reproductive account of fitness. This argument will be made by examining biological systems with very small or transient population structures. I argue this has significant consequences for how we define Darwinism, increasing the significance of survival (or persistence) over that of reproduction.
The context of economic globalization has contributed to the emergence of a new form of social action which has spread into the economic sphere in the form of the new social economic movements. The emblematic figure of this new generation of social movements is fair trade, which influences the economy towards political or social ends. Having emerged from multiple alternative trade practices, fair trade has gradually become institutionalized since the professionalization of World Shops, the arrival of fair trade products in the food industry, and the establishment of an official "fair trade" label. With the strength that this institutionalization has generated, fair trade can now be considered a real trade system that questions, as much as it renews, the traditional economic system. In parallel, this transformation has exacerbated the tensions within the movement, which can be characterized as a clash between a "radical, militant" pole and a "softer, more commercial" one. However, it is not the actual institutionalization of fair trade which is being debated among fair trade actors on either side of the fence, but rather the challenges inherent in finding an economic institutionalization acceptable to social economic movements. Therefore the institutionalization process of fair trade should not be seen as mere degradation of social action, but rather as typical of the institutionalization process of new social economic movements. If we need to worry about the hijacking and alteration of the fair trade movement by the dominant economic system, the opposite is no less likely, as new social economic movements contribute to an ethical restructuring of markets.
Nanotechnology is a swiftly developing field of technology that is believed to have the potential for great upsides and grave downsides. In the ethical debate there has been a strong tendency to focus on either one or the other. As a consequence, ethical assessments of nanotechnology tend to diverge radically. Optimistic visionaries predict truly utopian states of affairs. Pessimistic thinkers present all manner of apocalyptic visions. Whereas the utopian views follow from one-sidedly focusing on the potential benefits of nanotechnology, the apocalyptic perspectives result from giving exclusive attention to possible worst-case scenarios. These radically opposing evaluations hold the risk of conflicts and unwanted backlashes. Furthermore, many of these drastic views are based on simplified and outdated visions of a nanotechnology dominated by self-replicating assemblers and nanomachines. Hence, the present state of the ethical debate on nanotechnology calls for the development of more balanced and better-informed assessments. As a first step in this direction, this contribution presents a new method of framing the ethical debate on nanotechnology. Thus, the focus of this paper is on methodology, not on normative analysis.
A person required to risk money on a remote digit of π would, in order to comply fully with the theory [of personal probability], have to compute that digit, though this would really be wasteful if the cost of computation were more than the prize involved. For the postulates of the theory imply that you should behave in accordance with the logical implications of all that you know. Is it possible to improve the theory in this respect, making allowance within it for the cost of thinking, or would that entail paradox?
Voles are attracting attention because genetic variation at a single locus appears to have a profound impact on a complex social behavior, namely monogamy. After briefly reviewing the state of the most relevant scientific literature, I examine the way that this research gets taken up by the popular media, by scientists, and by the notable philosopher of neuroscience Patricia Churchland and interpreted as having deeply revisionary implications for how we ordinarily understand ourselves as persons. We have all these big questions we would like to resolve about free will, consciousness, our understanding of persons, and the nature of morality, and there is a tendency to ask more of neuroscience than it can yet answer. I do not deny that advances in neuroscience may eventually bear on important philosophical issues. However, it is not at all clear that this research has many of the sweeping implications being claimed for it and, in communicating science responsibly to the public, there is reason to be cautious about suggesting that it does.
Philosophers of biology, along with everyone else, generally perceive life to fall into two broad categories, the microbes and macrobes, and then pay most of their attention to the latter. ‘Macrobe’ is the word we propose for larger life forms, and we use it as part of an argument for microbial equality. We suggest that taking more notice of microbes – the dominant life form on the planet, both now and throughout evolutionary history – will transform some of the philosophy of biology’s standard ideas on ontology, evolution, taxonomy and biodiversity. We set out a number of recent developments in microbiology – including biofilm formation, chemotaxis, quorum sensing and gene transfer – that highlight microbial capacities for cooperation and communication and break down conventional thinking that microbes are solely or primarily single-celled organisms. These insights also bring new perspectives to the levels of selection debate, as well as to discussions of the evolution and nature of multicellularity, and to neo-Darwinian understandings of evolutionary mechanisms. We show how these revisions lead to further complications for microbial classification and the philosophies of systematics and biodiversity. Incorporating microbial insights into the philosophy of biology will challenge many of its assumptions, but also give greater scope and depth to its investigations.
The target article proposes that “counterintuitive beliefs in supernatural agents” are shaped by cognitive factors and survive because they foster empathic concern and counteract existential dread. I argue that they are shaped by motivational forces similar to those that shape our beliefs about other people; that empathic concern is rewarded in a more elementary fashion; and that a major function of these supernatural beliefs may be to provide a more flexible alternative to autonomous willpower in controlling not only dread but also many other unwelcome urges.
The changing world of health care finance has led to a paradigm shift in which health care is viewed more and more as a commodity. Many have argued that such a paradigm shift is incompatible with the very nature of medicine and health care. But such arguments raise more questions than they answer. There are important assumptions about basic concepts of health care and markets that frame such arguments.
Although in modern times and clinical settings, we rarely see the old characteristics of tribal shamanism such as deep trances, out-of-body experiences, and soul retrieval, the archetypal dreams, waking visions and active imagination of modern depth psychology represent a liminal zone where ancient and modern shamanism overlaps with analytical psychology. These essays explore the contributors' excursions as healers and therapists into this zone. The contributors describe the many facets shamanism and depth psychology have in common: animal symbolism; recognition of the reality of the collective unconscious; and healing rituals that put therapist and patient in touch with transpersonal powers. By reintroducing the core of shamanism in contemporary form, these essays shape a powerful means of healing that combines the direct contact with the inner psyche one finds in shamanism with the self-reflection and critical awareness of modern consciousness. The essays draw from the contributors' experiences both inside and outside the consulting room, and with cultures that include the Lakota Sioux, and those of the Peruvian Andes and the Hawaiian Islands. The focus is on those aspects of shamanism most useful and relevant to the modern practice of depth psychology. As a result, these explorations bring the young practice of analytical psychology into perspective as part of a much more ancient heritage of shamanistic healing. Contributors: Margaret Laurel Allen, Norma Churchill, Arthur Colman, Lori Cromer, Patricia Damery, C. Jess Groesbeck, Pansy Hawk Wing, June Kounin, Carol McRae, Pilar Montero, Jeffrey A. Raff, Janet S. Robinson, Meredith Sabini, Dyane N. Sherwood, Sara Spaulding-Phillips, Bradley A. Te Paske and Louis M. Vuksinick.
More Precisely is a mathematics book that's designed to meet the needs of philosophers. It's not a philosophy of math book and it's not a logic book - it's a math book for philosophers. More Precisely shows how to apply mathematical tools in various branches of philosophy. It provides many classical and recent philosophical examples. The topics presented by More Precisely include: basic set theory; relations and functions; machines; probability; formal semantics (including possible worlds semantics); utilitarianism; and infinity (both countable and uncountable). More Precisely is designed both as a textbook and a reference book to meet the needs of upper-level undergraduates and graduate students. It will also be useful as a reference book for any philosopher working today.
Warrant is that, whatever it is, which makes the difference between knowledge and mere true belief. In "Warrant Entails Truth" (PPR, December 1995), I argued that it is impossible that a false belief be warranted. Sharon Ryan attacked the argument of that paper in her "Does Warrant Entail Truth?" (PPR, March 1996). In "More on Warrant's Entailing Truth" I present arguments for the claim that warrant entails truth that are, I think, significantly more compelling than the arguments of my original "Warrant Entails Truth." This paper responds to Ryan's objections, but it is not merely a reply to Ryan's article. It is, rather, a free-standing defense of warrant's entailing truth that is the product of discussion and argument for over two years with many philosophers, including Ryan, over the arguments contained in my original paper.
Official Dutch food information apparently tries to avoid images but is implicitly shaped by the metaphor that food is fuel. The image of food as fuel and its accompanying view of the body as a machine are not maximally helpful for integrating two important human desires: health and pleasure. At the basis of the split between health and pleasure is the traditional mind–body dichotomy, in which the body is an important source of evil and bodily pleasure is sinful and dangerous. In the search for alternatives, new metaphors are proposed that integrate mind and body as well as pleasure and health. The relevance of metaphors for ethics is at least twofold. (1) Moral thought and theory are at least partly shaped by metaphors. In the light of this growing recognition, the analysis of morality needs innovation. (2) With regard to food, new metaphors, such as slow food, or the image of enjoyment as an art, enable a new search for morally responsible forms of hedonism, based on more love and respect for human as well as animal bodies. But new metaphors are specific and selective, just like old ones. I argue that a search for the best overall metaphor would be misguided, but that more diverse forms of attention to bodily aspects of life, including experiences related to food, will result in richer vocabularies of the body, the mind, and body–mind relations. This holds a promise of moral progress.
This paper focuses much-needed attention on the ethical nature of customer relationship management (CRM) strategies in organisations. The research uses an in-depth case study to reflect on the design, implementation and use of ‘best practice’ associated with CRM. We argue that conventional CRM philosophy is based on a fairly narrow construct that fails to consider ethical issues appropriately. We highlight why ethical considerations are important when organisations use CRM and how a more holistic approach incorporating some of Alasdair MacIntyre's ideas on virtue ethics could be relevant.
We contrast Bonanno’s ‘Belief Revision in a Temporal Framework’ with preference change and belief revision from the perspective of dynamic epistemic logic (DEL). For that, we extend the logic of communication and change with relational substitutions for preference change, and show that this does not alter its properties. Next we move to a more constrained context where belief and knowledge can be defined from preferences [29; 14; 5; 7], prove completeness of a very expressive logic of belief revision, and define a mechanism for updating belief revision models using a combination of action priority update and preference substitution.
We review and discuss A. H. Louie’s book “More than Life Itself: A Reflexion on Formal Systems and Biology” from an interdisciplinary viewpoint, involving both biology and mathematics, taking into account new developments and related theories.
Since its founding in the nineteenth century, social anthropology has been seen as the study of exotic peoples in faraway places. But today more and more anthropologists are dedicating themselves not just to observing but to understanding and helping solve social problems wherever they occur--in international aid organizations, British TV studios, American hospitals, or racist enclaves in Eastern Europe, for example. In Exotic No More, an initiative of the Royal Anthropological Institute, some of today's most respected anthropologists demonstrate, in clear, unpretentious prose, the tremendous contributions that anthropology can make to contemporary society. They cover issues ranging from fundamentalism to forced migration, child labor to crack dealing, human rights to hunger, ethnicity to environmentalism, intellectual property rights to international capitalisms. But Exotic No More is more than a litany of gloom and doom: the essays also explore topics usually associated with leisure or "high" culture, including the media, visual arts, tourism, and music. Each author uses specific examples from their fieldwork to illustrate their discussions, and 62 photographs enliven the text. Throughout the book, the contributors highlight anthropology's commitment to taking people seriously on their own terms, paying close attention to what they are saying and doing, and trying to understand how they see the world and why. Sometimes this bottom-up perspective makes the strange familiar, but it can also make the familiar strange, exposing the cultural basis of seemingly "natural" behaviors and challenging us to rethink some of our most cherished ideas--about gender, "free" markets, "race," and "refugees," among many others. Contributors: William O. Beeman, Philippe Bourgois, John Chernoff, E. Valentine Daniel, Alex de Waal, Judith Ennew, James Fairhead, Sarah Franklin, Michael Gilsenan, Faye Ginsburg, Alma Gottlieb, Christopher Hann, Faye V. Harrison, Richard Jenkins, Melissa Leach, Margaret Lock, Jeremy MacClancy, Jonathan Mazower, Ellen Messer, A. David Napier, Nancy Scheper-Hughes, Jane Schneider, Parker Shipton, Christopher B. Steiner.
Sara Ruddick's contemporary philosophical account of mothering reconsiders the maternal arguments used in the women's peace movements of the earlier part of this century. The culmination of this project is her 1989 book, Maternal Thinking: Toward a Politics of Peace. Ruddick's project is ground-breaking work in both academic philosophy and feminist theory.

In this chapter, I first look at the relationship between the two basic components of Ruddick's argument in Maternal Thinking: the "practicalist conception of truth" (PCT) and feminist standpoint theory (FST). I argue that Ruddick is never clear about the exact relation between the two components. These tensions point to a deeper problem in Ruddick's discussion of the critical power of maternal thinking.

The diversity of maternal practices presents a genuine challenge to Ruddick's account. I argue that neither of the components she explores can adequately ground a feminist peace politics without first answering the question of who speaks for mothers. While I can suggest ways to make Ruddick's argument consistent, she still faces, despite her claims of universality, the deeper problem of reconciling her account of maternal practice with the genuine diversity of actual maternal practices.
In 1907, Einstein set out to fully relativize all motion, no matter whether uniform or accelerated. After five failed attempts between 1907 and 1918, he finally threw in the towel around 1920, setting himself a new goal. For the rest of his life he searched for a classical field theory unifying gravity and electromagnetism. As he struggled to relativize motion, Einstein had to readjust both his approach and his objectives at almost every step along the way; he got himself hopelessly confused at times; he fooled himself with fallacious arguments and sloppy calculations; and he committed what he later allegedly called the biggest blunder of his career: he introduced the cosmological constant. There is a very uplifting moral to this somber tale. Although Einstein never reached his original destination, the harvest of his thirteen-year odyssey is quite impressive. First of all, what is left of absolute motion in general relativity is far more palatable than absolute motion in special relativity or Newtonian theory. And general relativity does seem to eliminate absolute space. More importantly, from a modern physics point of view, Einstein produced a spectacular new theory of gravity based on what he called the equivalence principle. This principle says that inertial and gravitational effects are due to one and the same structure, the inertio-gravitational field, which in Einstein’s theory is represented by a metric tensor field. In addition to laying the foundations of this theory, Einstein, among other things, launched relativistic cosmology, suggested the possibility of gravitational waves, gave the first sensible definition of a space-time singularity, and caught on to the intimate connection between general covariance and energy-momentum conservation, an example of the general connection between symmetries and conservation laws of Noether’s theorems.
These results more than make up for the—at least by the standards of modern philosophy of science—rather opportunistic way in which they were obtained.
(1) It is not clear from Gold and Stoljar’s definition of biological neuroscience whether it includes computational and representational concepts. If so, then their evaluation of Kandel’s theory is problematic. If not, then a more direct refutation of the radical neuron doctrine is available. (2) Objections to the psychological sciences might derive not just from the conflation of the radical and the trivial neuron doctrine. There might also be the implicit belief that for many mental phenomena, adequate theories must invoke neurophysiological concepts and cannot be purely psychological.
A number of authors have pointed to “convergent evolution” as evidence for the central role of natural selection in shaping predictable trajectories of macroevolution. However, there are numerous conceptual and empirical difficulties that arise in broadly appealing to the frequency of homoplasy as evidence for a non-contingently constrained adaptational design space. Most important is the need to distinguish between convergent (externally constrained) and parallel (internally constrained) evolution, and to consider how the respective frequencies of these significantly different sources of homoplasy affect a strong adaptationist view of life. In this paper, I critically evaluate Simon Conway Morris’s use of the homoplasy literature to support his argument for a non-contingent, counterfactually stable account of macroevolutionary pattern. In so doing, I offer a conception of parallelism which avoids the charge that it differs from convergence merely in degree and not in kind. I argue that although organisms sharing a homoplastic trait will also share varying degrees of homology, it is the underlying developmental homology with respect to the generators directly causally responsible for the homoplastic event that defines parallel evolution and non-arbitrarily distinguishes it from convergence. The notion of “screening-off” is used to distinguish the proximal generators of a homoplastic trait from its more distal genetic causes (such as a master control gene).
In the Wednesday Logic Reading Group, where we are working through Sara Negri and Jan von Plato’s Structural Proof Theory – henceforth ‘NvP’ – I today introduced Chapter 6, ‘Structural Proof Analysis of Axiomatic Theories’. In their commendable efforts to be brief, the authors are sometimes a bit brisk about motivation. So I thought it was worth trying to stand back a bit from the details of this action-packed chapter as far as I understood it in the few hours I had to prepare, and to try to give an overall sense of the project. These are the notes I wrote for myself. As often with such middle-of-term efforts dashed off in a couple of hours, I both would have liked to do better and do more justice to what we are reading, but I also just don’t have time to do more now than make a few corrections to the first version. So the usual warning applies: caveat lector.
Many futurists, technologists, and democratic theorists have asserted that the Internet and modern information technology are enabling the realization of an authentic direct democracy, or at least a more participatory democracy. Conversely, critics contend that advances in technology are only automating the existing democracy. This article explores the potential of modern information technology to enable the emergence of a more participatory democratic system. In particular, the key foundations of modern direct democracy are analyzed with respect to promising technological developments.
In Better Never to Have Been: The Harm of Coming into Existence, I argued that coming into existence is always a harm and that procreation is wrong. In this paper, I respond to those of my critics to whom I have not previously responded. More specifically, I engage the objections of Tim Bayne, Ben Bradley, Campbell Brown, David DeGrazia, Elizabeth Harman, Chris Kaposy, Joseph Packer and Saul Smilansky.
This article presents the results of a study that investigates the way in which carriers of a mutation on the BRCA1 or the BRCA2 gene, associated with a high risk of breast and ovarian cancer, make their reproductive decisions. Using semi-structured interviews, the study explored the way in which these persons reflected on the acceptability of taking the risk of transmitting this mutation to the next generation, the arguments they used for or against taking that risk, and in the light of these arguments, their opinion on the acceptability of preimplantation genetic diagnosis (PGD) as a reproductive option. The findings suggest that when carriers are planning to have a(nother) child, they are mainly concerned by the risk of transmitting ‘much more than a gene’: essentially painful experiences not only with respect to health, such as undergoing cancer surveillance or combatting one’s own illness, but also with regards to family life, such as witnessing the illness and death of a close relative, encountering difficulties in finding a partner or reconsidering one’s plans to have a family. As for the acceptability of PGD as a reproductive option, opinions about personal recourse were varied, but all expressed the understanding that PGD should be made available to those persons who consider it their best option.
Aristotle begins On Interpretation with an analysis of the existence of linguistic entities as both physical and meaningful. Two things have been lacking for a full appreciation of this analysis: a more literal translation of the passage and an ample understanding of the distinction between symbols and signs. In this article, therefore, I first offer a translation of this opening passage (16a1-9) that allows the import of Aristotle's thinking to strike the reader. Then I articulate the distinction between symbol and sign so crucial to understanding this passage. Aristotle employs this distinction, I argue, in order to show how the linguistic entities he defines later in On Interpretation (that is, name, verb, denial, affirmation, declaration, and articulation) are both conventional and natural, owing to their being both symbols and signs, respectively. Finally, I suggest why Aristotle's analysis of how linguistic entities exist as both physical and meaningful is fitting, since man himself, "the animal that has speech," lives at the boundary between nature and intelligence.
In this article, I discuss Charles Taylor's reading of Nietzsche. Taylor argues that Nietzsche presents a challenge on the 'deepest level' because, on Taylor's reading, Nietzsche forces us to consider whether or not our 'continuing allegiance to standards of justice and benevolence' goes against our inner nature. I argue that this purported Nietzschean challenge is more self-revealing of Taylor than it is foreboding, as it brings to light the tension between the open and pluralistic content of Taylor's faith and the epistemological grounding of it, which a more well-rounded appreciation of Nietzsche could help to alleviate. Key words: Charles Taylor, genealogy, ontology, moral reasoning, Nietzsche, theism.
If explicit cognition about morality promotes moral behavior, then one might expect ethics professors to behave particularly well. However, professional ethicists' behavior has never been empirically studied. The present research examined the rates at which ethics books are missing from leading academic libraries, compared to other philosophy books similar in age and popularity. Study 1 found that relatively obscure, contemporary ethics books of the sort likely to be borrowed mainly by professors and advanced students of philosophy were actually about 50% more likely to be missing than non-ethics books. Study 2 found that classic (pre-1900) ethics books were about twice as likely to be missing.
Aspects of the history of behavioural science are reviewed, pointing to its fragmented and faction-ridden nature. The emergence of evolutionary psychology (EP) is viewed in this context. With the help of a dual-layered model of behavioural control, the case is made for a more integrative perspective towards EP. The model's application to both behaviour and complex human information processing is described. Similarities in their control are noted. It is suggested that one layer of control (‘on-line’) corresponds to the encapsulated modules of EP whereas the off-line controls provide the plasticity and flexibility suggested by its critics.
A model positing that perception of another's affective state automatically generates matching emotional and instrumental responses predicts more than has ever been observed. Reflexive empathy would produce emotional exhaustion and inhibitory strain, and would debilitate everyday functioning. Self-regulation of empathic responses involves not only reactive inhibition but agentic proactive control. Pervasive inhumanities involve selective disengagement of empathic restraints through dissociative psychosocial mechanisms.
Abstract This call to think, to feel, to read about the title subject and to act first lists five hurdles on the way to a more peaceful and sustainable human society. A number of successful solutions are then presented, such as the UN Convention on the Law of the Sea. There follow sections on potential contributions by religion and by collaboration between science and religion. My plea is for a widespread participation at all levels of society and an attitude of cautious, critical yet determined and creative advancement toward said society.
This paper charts the gradual development of a theory of real space, underlying the created world and constituted by the extension of God Himself, in the writings of the Cambridge Platonist, Henry More. It identifies two impediments to More's embracing such a theory in the earlier part of his career, namely his initial commitment to the principles that (a) space was not real and (b) God was not extended, and it shows how he finally came to renounce these principles in order to devise the theory so closely associated with him.
In this article, we focus on the concept of leadership ethics and make observations about transformational, transactional and servant leadership. We consider differences in how each definition of leadership outlines what the leader is supposed to achieve, and how the leader treats people in the organization while striving to achieve the organization's goals. We also consider which leadership styles are likely to be more popular in organizations that strive to maximize short-run profits. Our paper does not tout or degrade any of these leadership theories. Instead, it points out which theories allow reason to play more than a minimal role in ethical decision-making, as well as those that are most consistent with a firm's desire to achieve efficiency in the short run. We explain our view that the way leadership is practiced in large, bureaucratic organizations suggests that ethics is often absent from the leader's decision-making process. Consequently, we suggest that before we engage in a meaningful dialogue about what kind of leaders we might really want in business, we must consider how much short-run profit we are willing to forego in exchange for more ethical corporate cultures.
It is commonly assumed that persons who hold abortions to be generally impermissible must, for the same reasons, be opposed to embryonic stem cell research [ESR]. Yet a settled position against abortion does not necessarily direct one to reject that research. The difference in potentiality between the embryos used in ESR and embryos discussed in the abortion debate can make ESR acceptable even if one holds that abortion is impermissible. With regard to their potentiality, in vitro embryos are here argued to be more morally similar to clonable somatic cells than they are to in vivo embryos. This creates an important moral distinction between embryos in vivo and in vitro. Attempts to refute this moral distinction, raised in the recent debate in this journal between Alfonso Gómez-Lobo and Mary Mahowald, are also addressed.
It has been suggested that there may exist languages that contain only feature-placing sentences, and hence the conceptual scheme implied by such languages is radically different from the one with which we are more familiar. Contrary to what some philosophers believe, I argue that in such languages we may not be able to say things having approximately the force of the things we actually say, that is, to express so-called ordinary matters merely at the expense of simplicity. For one thing, in such languages not only can we not speak of change in something, e.g., that Theaetetus grows older, but the sense of change in the expression of change in something cannot be preserved in the feature-placing translation.
There could be more than one God (defined by the normal divine predicates) only if a first God brings about (from eternity) a second God, and the first two bring about a third God. In order to evince the goodness of sharing and cooperating in sharing, they will do this necessarily. But they do not have to produce a fourth God; and since a God must exist necessarily if at all, there will be and can be only three Gods. But since they mutually sustain each other, they form a Trinity.
Solomon has made the case, in Social Empiricism (2001), for a socially naturalized analysis of the dynamics of scientific inquiry that takes seriously two critical insights: that scientific rationality is contingent, disunified, and socially emergent; and that scientific progress is often fostered by factors traditionally regarded as compromising sources of bias. While elements of this framework are widely shared, Solomon intends it to be more resolutely social, more thoroughly naturalizing, and more ambitiously normative than other contextualizing epistemologies currently on offer. Four focal issues are addressed in the commentaries that follow: Solomon's characterization of empirical success as a goal of science (Clough); her distinction between empirical and non-empirical decision vectors and the viability of the multivariate analysis she proposes for assessing epistemic fairness in their distribution (Clough; Richardson); the plausibility of her thesis that normatively appropriate consensus is a (rare) limiting case rather than an intrinsically desirable outcome of inquiry (Oreskes; Richardson); and her conviction that a socially naturalized analysis of science can ground norms of scientific rationality (Longino; Oreskes).
Although many people believe that more people would be better, arguments intended to show this are unconvincing. I consider one of Parfit's arguments for a related conclusion, that even when both are worth living, we ought to prefer the better of two lives. Were this argument successful, I claim, it would follow that more people would be better. But there aren't reasons for preferring the better of two lives. Nor is an attempted rejoinder effective. We can agree that there aren't reasons for preferring the better of two lives, and yet still maintain there are reasons for improving lives.
So far as philosophizing is concerned with another's thoughts, it is criticizing. No true philosopher, as such, merely accepts or merely reproduces another's thoughts. The true philosopher starts anew — independently, solitarily. But his understanding of the work done by others can make his own philosophizing surer and quicker. He must see for himself. But sometimes he can more easily avoid the pitfalls that others have discovered and marked. Thus, in the social history of philosophizing there can be not only mere repetition and mere novelty, but also progress. Even the less able philosopher may improve on the results of a more able predecessor.
In continuing news, there is a growing debate on whether current laws and regulations, both in the US and abroad, need to be strengthened as they relate to nanotechnology. On one side, experts argue that nanomaterials, which are making their way into the marketplace today, are possibly harmful to consumers and the environment, so stronger and new laws are needed to ensure they are safe. On the other side, different experts argue that more regulation will slow down the pace of business and innovation in nanotechnology, or that self-regulation is the answer, or other opposing positions. This paper will draw out the core issues behind the debate and explain that there is more at stake than merely environmental, health and safety (EHS) worries or business interests, as it first appears. We will also suggest an alternative solution to stricter laws, since stricter laws would face formidable practical challenges, even if they are warranted.
Revonsuo's evolutionary theory of dream function is extremely interesting. However, although threat avoidance theory is well grounded in experimental data, it does not take other significant dream research data into account. The theory can be integrated into a more general hypothesis which takes these data into consideration. [Revonsuo].
If philosophical moral reflection improves moral behavior, one might expect ethics professors to behave morally better than socially similar non-ethicists. Under the assumption that forms of political engagement such as voting have moral worth, we looked at the rate at which a sample of professional ethicists—and political philosophers as a subgroup of ethicists—voted in eight years’ worth of elections. We compared ethicists’ and political philosophers’ voting rates with the voting rates of three other groups: philosophers not specializing in ethics, political scientists, and a comparison group of professors specializing in neither philosophy nor political science. All groups voted at about the same rate, except for the political scientists, who voted about 10–15% more often. On the face of it, this finding conflicts with the expectation that ethicists will behave more responsibly than non-ethicists.
Helping more than “a little”: recent books on Kierkegaard and philosophy of religion. Book review by J. Aaron Simmons (Department of Philosophy, Furman University, 3300 Poinsett Hwy, Greenville, SC 29613, USA). International Journal for Philosophy of Religion, pp. 1-16. DOI 10.1007/s11153-012-9345-6. Online ISSN 1572-8684; Print ISSN 0020-7047.
There is a kinship between Owen Flanagan's The Really Hard Problem and William James's The Varieties of Religious Experience that not only can help us to understand Flanagan's book but also can help scholars, particularly scholars of religion, to be attentive to an important development in the realm of the "spiritual but not religious." Specifically, Flanagan's book continues a tradition in philosophy, exemplified by James, that addresses questions of religious or spiritual meaning in terms accessible to a broad audience outside the context of organized religions. Both James and Flanagan are concerned to refute the popular perception that the sciences of the mind pose a threat to meaning and particularly to meaningful processes of human growth and transformation. Where James used the subconscious to bridge between science and religion and persuade his readers of the reality of the More, Flanagan uses a scientifically grounded understanding of transcendence to enchant his readers into believing in Less. Although I think that Flanagan's attempt to link the psychological and sociocultural levels of analysis via the concept of transcendence is scientifically premature, his attempt at a naturalistic spirituality raises questions of definition that scholars of religion need to take seriously.
In this report I provide an introduction to the burgeoning field of hypercomputation – the study of machines that can compute more than Turing machines. I undertake an extensive survey of many of the key concepts in the field, tying together the disparate ideas and presenting them in a structure which allows comparisons of the many approaches and results. To this I add several new results and draw out some interesting consequences of hypercomputation for several different disciplines.
When Dewey scholars and educational theorists appeal to the value of educative growth, what exactly do they mean? Is an individual's growth contingent on receiving a formal education? Is growth too abstract a goal for educators to pursue? Richard Rorty contended that the request for a “criterion of growth” is a mistake made by John Dewey's “conservative critics,” for it unnecessarily restricts the future “down to the size of the present.” Nonetheless, educational practitioners inspired by Dewey's educational writings may ask Dewey scholars and educational theorists, “How do I facilitate growth in my classroom?” Here Shane Ralston asserts, in spite of Rorty's argument, that searching for a more concrete standard of Deweyan growth is perfectly legitimate. In this essay, Ralston reviews four recent books on Dewey's educational philosophy—Naoko Saito's The Gleam of Light: Moral Perfectionism and Education in Dewey and Emerson, Stephen Fishman and Lucille McCarthy's John Dewey and the Philosophy and Practice of Hope, and James Scott Johnston's Inquiry and Education: John Dewey and the Quest for Democracy and Deweyan Inquiry: From Educational Theory to Practice—and through his analysis identifies some possible ways for Dewey-inspired educators to make growth a more practical pedagogical ideal.
Many people hold this truth to be self-evident that universities should enroll more female students in science and engineering; the main question then being how. Typical arguments include possible benefits to women, possible benefits to the economy, and the unfairness of the current female under-representation. However, when clearly stated and scrutinized, these arguments in fact lead to the conclusion that there should be more women in scientific disciplines in higher education in the sense that we should expect more women (which various kinds of discrimination may prevent), not that we should actively enroll more women. Outreach programs towards high school students may therefore be logically incompatible with the arguments supposed to justify them. They should purport to allow each woman to graduate in a field congruent with her abilities and desires, rather than try to draw as many of them to scientific disciplines as possible: one cannot try to ‘recruit’ as many female students as possible while claiming to help them choose more freely.
Half of the 33.2 million people living with HIV today are women. Yet, responses to the epidemic are not adequately meeting the needs of women. This article critically evaluates how prevention of mother-to-child transmission (PMTCT) programs, the principal framework under which women's health is currently addressed in the global response to AIDS, have tended to focus on the prevention of HIV transmission from HIV-positive women to their infants. This paper concludes that more than ten years after their inception, PMTCT programs still do not successfully ensure the adequate treatment, care and support of HIV-infected women. Of particular concern is the continued widespread use of single-dose nevirapine despite World Health Organization recommendations to employ more effective combination therapies that do not potentially jeopardize women's future treatment outcomes. In response, the article calls for a more comprehensive approach that places women's health needs at the centre of AIDS responses. This is critical in settings where the pandemic is generalized and there is a push to greatly expand PMTCT programs, as a more effective and equitable way of meeting the needs of women in the context of HIV. Without such a comprehensive approach, women will continue to be impacted disproportionately by the pandemic, and current strategies for prevention, including PMTCT, and treatment will not be as effective and responsive as they need to be.
The evolution of sexual reproduction is a case of explanatory pluralism, meaning that there is more than one explanation for this phenomenon. I use the concept of a domain to more clearly explicate the various explananda that can be found in this case. I argue that although pluralism with respect to some types of domains can be decreased using van Fraassen’s pragmatics of explanation, there remains an important class of domain, an orthogonal domain, for which this is not the case.
When a distinction is drawn between “total” knowledge and “problem-specific” knowledge, it is seen that successful users of the recognition heuristic have more problem-specific knowledge than people unable to exploit this heuristic. So it is not ignorance that makes them smart, but knowledge.
This article argues that Agamben's ‘paradigmatic method’ leads to particular choices in his depiction of the figure of the homo sacer. Reviewing this project also suggests that there is more to history (the example given is the story of homo sacer) than Agamben's method would ever leave us to say. In other words, there are still resources in the tradition for something new, and thus there is much more left to say about its legacies.
Although there are many different moral arguments concerning the use of Best Interests in neonatal decision-making, there seems in practice to be a firm commitment to application of the concept. And yet, there is still little reflection by practitioners on what employing a Best Interest determination means in infant care. The following lays out a comprehensive taxonomy of interest-sources in order to provide for more robust considerations of what constitutes best interests of/for neonates.