G.E. Moore, more than either Bertrand Russell or Ludwig Wittgenstein, was chiefly responsible for the rise of the analytic method in twentieth-century philosophy. This selection of his writings shows Moore at his very best. The classic essays are crucial to major philosophical debates that still resonate today. Amongst those included are 'A Defence of Common Sense', 'Certainty', 'Sense-Data', 'External and Internal Relations', 'Hume's Theory Explained', 'Is Existence a Predicate?', and 'Proof of an External World'. In addition, this collection also contains the key early papers in which Moore signals his break with idealism, and three important previously unpublished papers from his later work which illustrate his relationship with Wittgenstein.
An important contribution to the foundations of probability theory, statistics and statistical physics has been made by E. T. Jaynes. The recent publication of his collected works provides an appropriate opportunity to attempt an assessment of this contribution.
In this philosophy classic, which was first published in 1951, E. R. Dodds takes on the traditional view of Greek culture as a triumph of rationalism. Using the analytical tools of modern anthropology and psychology, Dodds asks, "Why should we attribute to the ancient Greeks an immunity from 'primitive' modes of thought which we do not find in any society open to our direct observation?" Praised by reviewers as "an event in modern Greek scholarship" and "a book which it would be difficult to over-praise," _The Greeks and the Irrational_ was Volume 25 of the Sather Classical Lectures series.
Is God's foreknowledge compatible with human freedom? One of the most attractive attempts to reconcile the two is the Ockhamistic view, which not only subscribes to human freedom and divine omniscience, but also retains our most fundamental intuitions concerning God and time: that the past is immutable, that God exists and acts in time, and that there is no backward causation. In order to achieve all that, Ockhamists distinguish ‘hard facts’ about the past, which cannot possibly be altered, from ‘soft facts’ about the past, which are alterable, and argue that God's prior beliefs about human actions are soft facts about the past.
What is a natural kind? As we shall see, the concept of a natural kind has a long history. Many of the interesting doctrines can be detected in Aristotle, were revived by Locke and Leibniz, and have again become fashionable in recent years. Equally, there has been agreement about certain paradigm examples: the kinds oak, stickleback and gold are natural kinds, and the kinds table, nation and banknote are not. Sadly, agreement does not extend much further. It is impossible to discover a single consistent doctrine in the literature, and different discussions focus on different doctrines without writers or readers being aware of the fact. In this paper I shall attempt to find a defensible distinction between natural and non-natural kinds.
If one is an egalitarian, what should one want to equalize? Opportunities or outcomes? Resources or welfare? These positions are usually conceived to be very different. I argue in this paper that the distinction is misconceived: the only coherent conception of resource equality implies welfare equality, in an appropriately abstract description of the problem. In this section, I motivate the program which the rest of the paper carries out.
E-Z Reader 7 is a processing model of eye-movement control. One constraint imposed on the model is that high-level cognitive processes do not influence eye movements unless normal reading processes are disturbed. I suggest that this constraint is unnecessary, and that the model provides a sensible architecture for explaining how both low- and high-level processes influence eye movements.
This text traces some aspects of Sartre's reconstruction of the relations between the individual and history, attempting to show that phenomenology and dialectical materialism are both present in this project of knowledge, and that it is the convergence of the two perspectives that makes it possible, by adequately attending to both universality and singularity, to describe and dialectically comprehend the historical mode of production of individual identity.
In this paper I shall venture into an area with which I am not very familiar and in which I feel far from confident; namely, phenomenology. My main motive is not a desire to get away from standard, boring, methodological questions like those of induction and demarcation, but the conviction that a phenomenological account of the empirical basis forms a necessary complement to Popper's falsificationism. According to the latter, a scientific theory is a synthetic and universal, hence unverifiable, proposition. In fact, in order to be technologically useful, a scientific hypothesis must refer to future states of affairs; it ought therefore to remain unverified. But in order to be empirical, a theory must bear some kind of relation to factual statements. According to Popper, such a relation can only be one of potential conflict. Thus a theory T will be termed scientific if and only if T is logically incompatible with a so-called basic statement b, where b is both empirically verifiable and empirically falsifiable. In other words: T is scientific if it entails ¬b, where b, hence also ¬b, is an empirically decidable proposition.
How could the self be a substance? There are various ways in which it could be, some familiar from the history of philosophy. I shall be rejecting these more familiar substantivalist approaches, but also the non-substantival theories traditionally opposed to them. I believe that the self is indeed a substance—in fact, that it is a simple or noncomposite substance—and, perhaps more remarkably still, that selves are, in a sense, self-creating substances. Of course, if one thinks of the notion of substance as an outmoded relic of prescientific metaphysics—as the notion of some kind of basic and perhaps ineffable stuff—then the suggestion that the self is a substance may appear derisory. Even what we ordinarily call ‘stuffs’—gold and water and butter and the like—are, it seems, more properly conceived of as aggregates of molecules or atoms, while the latter are not appropriately to be thought of as being ‘made’ of any kind of ‘stuff’ at all. But this only goes to show that we need to think in terms of a more sophisticated notion of substance—one which may ultimately be traced back to Aristotle's conception of a ‘primary substance’ in the Categories, and whose heir in modern times is W. E. Johnson's notion of the ‘continuant’. It is the notion, that is, of a concrete individual capable of persisting identically through qualitative change, a subject of alterable predicates that is not itself predicable of any further subject.
Ancient moral philosophers, especially Aristotle and his followers, typically shared the assumption that ethics is primarily concerned with how to achieve the final end for human beings, a life of “happiness” or “human flourishing.” This final end was not a subjective condition, such as contentment or the satisfaction of our preferences, but a life that could be objectively determined to be appropriate to our nature as human beings. Character traits were treated as moral virtues because they contributed well toward this ideal life, either as means to it or as constitutive aspects of it. Traits that tended to prevent a “happy” life were considered vices, even if they contributed to a life that was pleasant and what a person most wanted. The idea of “happiness” was central, then, in philosophical efforts to specify what we ought to do, what sort of persons we should try to become, and what sort of life a wise person would hope for.
The strong weak truth table (sw) reducibility was suggested by Downey, Hirschfeldt, and LaForte as a measure of relative randomness, alternative to the Solovay reducibility. It also occurs naturally in proofs in classical computability theory as well as in the recent work of Soare, Nabutovsky, and Weinberger on applications of computability to differential geometry. We study the sw-degrees of c.e. reals and construct a c.e. real which has no random c.e. real (i.e., Ω number) sw-above it.
Characterizations of philosophy abound. It is ‘the queen of the sciences’, a grand and sweeping metaphysical endeavour; or, less regally, it is a sort of deep anthropology or ‘descriptive metaphysics’, uncovering the general presuppositions or conceptual schemes that lurk beneath our words and thoughts. A different set of images portrays philosophy as a type of therapy, or as a spiritual exercise, a way of life to be followed, or even as a special branch of poetry or politics. Then there is a group of characterizations that includes philosophy as linguistic analysis, as phenomenological description, as conceptual geography, or as genealogy in the sense proposed by Nietzsche and later taken up by Foucault.
‘Reactionary modernism’ is a term happily coined by the historian and sociologist Jeffrey Herf to refer to a current of German thought during the interwar years. It indicates the attempt to ‘reconcil[e] the antimodernist, romantic and irrationalist ideas present in German nationalism’ with that ‘most obvious manifestation of means–ends rationality … modern technology’. Herf's paradigm examples of this current of thought are two best-selling writers of the period: Oswald Spengler, author of the massive doomsday scenario The Decline of the West in 1917 and, fifteen years later, of Man and Technics, and Ernst Jünger, the now centenarian chronicler of the war in which he was a much-decorated hero, whose main theoretical work was Der Arbeiter in 1932. The label is also applied by Herf to such intellectual luminaries as the legal theorist and apologist for the Third Reich, Carl Schmitt, and, more contentiously, Martin Heidegger. At a less elevated level, reactionary modernism also permeated the writings of countless, now forgotten engineers, who were inspired at once by the new technology, Nietzschean images of Promethean Übermenschen, and an ethos of völkisch nationalism.
Affirmative action programs remain controversial, I suspect, partly because the familiar arguments for and against them start from significantly different moral perspectives. Thus I want to step back for a while from the details of debate about particular programs and give attention to the moral viewpoints presupposed in different types of argument. My aim, more specifically, is to compare the “messages” expressed when affirmative action is defended from different moral perspectives. Exclusively forward-looking arguments, I suggest, tend to express the wrong message, but this is also true of exclusively backward-looking arguments. However, a moral outlook that focuses on cross-temporal narrative values suggests a more appropriate account of what affirmative action should try to express. Assessment of the message, admittedly, is only one aspect of a complex issue, but it is a relatively neglected one. My discussion takes for granted some common-sense ideas about the communicative function of action, and so I begin with these. Actions, as the saying goes, often speak louder than words. There are times, too, when only actions can effectively communicate the message we want to convey and times when giving a message is a central part of the purpose of action. What our actions say to others depends largely, though not entirely, upon our avowed reasons for acting; and this is a matter for reflective decision, not something we discover later by looking back at what we did and its effects. The decision is important because “the same act” can have very different consequences, depending upon how we choose to justify it.
I. Introduction Two kinds of remedies have traditionally been employed for breach of contract: legal relief and equitable relief. Legal relief normally takes the form of money damages. Equitable relief normally consists either of specific performance or an injunction – that is, the party in breach may be ordered to perform an act or to refrain from performing an act. In this article I will use a “consent theory of contract” to assess the choice between money damages and specific performance. According to such a theory, contractual obligation is dependent on more fundamental entitlements of the parties and arises as a result of the parties' consent to transfer alienable rights. My thesis will be that the normal rule favoring money damages should be replaced with one that presumptively favors specific performance unless the parties have consented to money damages instead. The principal obstacle to such an approach is the reluctance of courts to specifically enforce contracts for personal services. The philosophical distinction between alienable and inalienable rights bolsters this historical reticence, since a right to personal services may be seen as inalienable. I will then explain why, if the subject matter of a contract for personal services is properly confined to an alienable right to money damages for failure to perform, specific enforcement of such contracts is no longer problematic. Finally, I shall consider whether the subject matter of contracts for corporate services is properly confined to money damages like contracts for personal services, or whether performance of corporate services can be made the subject of a valid rights transfer and judicially compelled in the same manner as contracts for external resources.
Why does the problem of free will seem so intractable? I surmise that in large measure it does so because the free will debate, at least in its modern form, is conducted in terms of a mistaken approach to causality in general. At the heart of this approach is the assumption that all causation is fundamentally event causation. Of course, it is well-known that some philosophers of action want to invoke in addition an irreducible notion of agent causation, applicable only in the sphere of intelligent agency. But such a view is generally dismissed as incompatible with the naturalism that has now become orthodoxy amongst mainstream analytical philosophers of mind. What I want to argue is that substances, not events, are the primary relata of causal relations and that agent causation should properly be conceived of as a species of substance causation. I shall try to show that by thus reconceiving the nature of causation and of agency, the problem of free will can be made more tractable. I shall also argue for a contention that may seem even less plausible at first sight, namely, that such a view of agency is perfectly compatible with a volitionist theory of action.
G. E. Moore's ‘A Defence of Common Sense’ has generated the kind of interest and contrariety which often accompany what is new, provocative, and even important in philosophy. Moore himself reportedly agreed with Wittgenstein's estimate that this was his best article, while C. D. Broad has lamented its very great but largely unfortunate influence. Although the essay inspired Wittgenstein to explore the basis of Moore's claim to know many propositions of common sense to be true, A. J. Ayer judges its enduring value to lie in provoking a more sophisticated conception of the very type of metaphysics which disputes any such unqualified claim of certainty.
According to the distinguished philosopher Richard Wollheim, an emotion is an extended mental episode that originates when events in the world frustrate or satisfy a pre-existing desire. This leads the subject to form an attitude to the world which colours their future experience, leading them to attend to one aspect of things rather than another, and to view the things they attend to in one light rather than another. The idea that emotions arise from the satisfaction or frustration of desires—the ‘match-mismatch’ view of emotion aetiology—has had several earlier incarnations in the psychology of emotion. Early versions of this proposal were associated with the attempt to replace the typology of emotion found in ordinary language with a simpler theory of drives and to define new emotion types in terms of general properties such as the frustration of a drive. The match-mismatch view survived the demise of that revisionist project and is found today in theories that accept a folk-psychological-style taxonomy of emotion types based on the meaning ascribed by the subject to the stimulus situation. For example, the match-mismatch view forms part of the subtle and complex model of emotion episodes developed over many years by Nico Frijda. According to Frijda, information about the ‘situational antecedents’ of an emotion—the stimulus in its context, including the ongoing goals of the organism—is evaluated for its relevance to the multiple concerns of the organism. Evaluation of match-mismatch—the degree of compatibility between the situation and the subject's goals—forms part of this process.
This essay first distinguishes different questions regarding moral objectivity and relativism and then sketches a broadly Kantian position on two of these questions. First, how, if at all, can we derive, justify, or support specific moral principles and judgments from more basic moral standards and values? Second, how, if at all, can the basic standards, such as my broadly Kantian perspective, be defended? Regarding the first question, the broadly Kantian position is that from ideas in Kant's later formulations of the Categorical Imperative, especially human dignity and rational autonomous law-making, we can develop an appropriate moral perspective for identifying and supporting more specific principles. Both the deliberative perspective and the derivative principles can be viewed as “constructed,” but in different senses. In response to the second question, the essay examines two of Kant's strategies for defending his basic perspective and the important background of his arguments against previous moral theories.
Philosophers have debated for millennia about whether moral requirements are always rational to follow. The background for these debates is often what I shall call “the self-interest model.” The guiding assumption here is that the basic demand of reason, to each person, is that one must, above all, advance one's self-interest. Alternatively, debate may be framed by a related, but significantly different, assumption: the idea that the basic rational requirement is to develop and pursue a set of personal ends in an informed, efficient, and coherent way, whether one's choice of ends is based on self-interested desires or not. For brevity I refer to this as “the coherence-and-efficiency model.” Advocates of both models tend to think that, while it is sufficiently clear in principle what the rational thing to do is, what remains in doubt is whether it is always rational to be moral. They typically assume that morality is concerned, entirely or primarily, with our relations to others, especially with obligations that appear to require some sacrifice or compromise with the pursuit of self-interest.
Epistemology, as I understand it, is a branch of philosophy especially concerned with general questions about how we can know various things or at least justify our beliefs about them. It questions what counts as evidence and what are reasonable sources of doubt. Traditionally, epistemology focuses on pervasive and apparently basic assumptions covering a wide range of claims to knowledge or justified belief rather than very specific, practical puzzles. For example, traditional epistemologists ask “How do we know there are material objects?” and not “How do you know which are the female beetles?” Similarly, moral epistemology, as I understand it, is concerned with general questions about how we can know or justify our beliefs about moral matters. Its focus, again, is on quite general, pervasive, and apparently basic assumptions about what counts as evidence, what are reasonable sources of doubt, and what are the appropriate procedures for justifying particular moral claims.