Is God's foreknowledge compatible with human freedom? One of the most attractive attempts to reconcile the two is the Ockhamistic view, which not only subscribes to human freedom and divine omniscience but also retains our most fundamental intuitions concerning God and time: that the past is immutable, that God exists and acts in time, and that there is no backward causation. In order to achieve all that, Ockhamists distinguish ‘hard facts’ about the past, which cannot possibly be altered, from ‘soft facts’ about the past, which are alterable, and argue that God's prior beliefs about human actions are soft facts about the past.
Characterizations of philosophy abound. It is ‘the queen of the sciences’, a grand and sweeping metaphysical endeavour; or, less regally, it is a sort of deep anthropology or ‘descriptive metaphysics’, uncovering the general presuppositions or conceptual schemes that lurk beneath our words and thoughts. A different set of images portrays philosophy as a type of therapy, or as a spiritual exercise, a way of life to be followed, or even as a special branch of poetry or politics. Then there is a group of characterizations that includes philosophy as linguistic analysis, as phenomenological description, as conceptual geography, or as genealogy in the sense proposed by Nietzsche and later taken up by Foucault.
What is a natural kind? As we shall see, the concept of a natural kind has a long history. Many of the interesting doctrines can be detected in Aristotle, were revived by Locke and Leibniz, and have again become fashionable in recent years. Equally, there has been agreement about certain paradigm examples: the kinds oak, stickleback and gold are natural kinds, and the kinds table, nation and banknote are not. Sadly, agreement does not extend much further. It is impossible to discover a single consistent doctrine in the literature, and different discussions focus on different doctrines without writers or readers being aware of the fact. In this paper I shall attempt to find a defensible distinction between natural and non-natural kinds.
‘Reactionary modernism’ is a term happily coined by the historian and sociologist Jeffrey Herf to refer to a current of German thought during the interwar years. It indicates the attempt to ‘reconcil[e] the antimodernist, romantic and irrationalist ideas present in German nationalism’ with that ‘most obvious manifestation of means–ends rationality … modern technology’. Herf's paradigm examples of this current of thought are two best-selling writers of the period: Oswald Spengler, author of the massive doomsday scenario The Decline of the West in 1917 and, fifteen years later, of Man and Technics, and Ernst Jünger, the now centenarian chronicler of the war in which he was a much-decorated hero, whose main theoretical work was Der Arbeiter in 1932. The label is also applied by Herf to such intellectual luminaries as the legal theorist and apologist for the Third Reich, Carl Schmitt, and, more contentiously, Martin Heidegger. At a less elevated level, reactionary modernism also permeated the writings of countless, now forgotten engineers, who were inspired at once by the new technology, Nietzschean images of Promethean Übermenschen, and an ethos of völkisch nationalism.
G.E. Moore, more than either Bertrand Russell or Ludwig Wittgenstein, was chiefly responsible for the rise of the analytic method in twentieth-century philosophy. This selection of his writings shows Moore at his very best. The classic essays are crucial to major philosophical debates that still resonate today. Amongst those included are ‘A Defense of Common Sense’, ‘Certainty’, ‘Sense-Data’, ‘External and Internal Relations’, ‘Hume's Theory Explained’, ‘Is Existence a Predicate?’, and ‘Proof of an External World’. In addition, this collection also contains the key early papers in which Moore signals his break with idealism, and three important previously unpublished papers from his later work which illustrate his relationship with Wittgenstein.
In this paper I shall venture into an area with which I am not very familiar and in which I feel far from confident; namely, phenomenology. My main motive is not a wish to get away from standard, boring methodological questions like those of induction and demarcation, but the conviction that a phenomenological account of the empirical basis forms a necessary complement to Popper's falsificationism. According to the latter, a scientific theory is a synthetic and universal, hence unverifiable, proposition. In fact, in order to be technologically useful, a scientific hypothesis must refer to future states-of-affairs; it is therefore bound to remain unverified. But in order to be empirical, a theory must bear some kind of relation to factual statements. According to Popper, such a relation can only be one of potential conflict. Thus a theory T will be termed scientific if and only if T is logically incompatible with a so-called basic statement b, where b is both empirically verifiable and empirically falsifiable. In other words: T is scientific if it entails ¬b, where b, hence also ¬b, is an empirically decidable proposition.
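In schematic form, the demarcation criterion paraphrased above can be put as follows (a minimal rendering of the abstract's own statement, with the existential quantifier over basic statements made explicit as my gloss):

\[ T \text{ is scientific} \;\Longleftrightarrow\; \exists b\,\bigl(b \text{ is empirically decidable} \;\wedge\; T \vdash \neg b\bigr) \]

where \(T \vdash \neg b\) records the required incompatibility, since \(T\) and \(b\) are jointly inconsistent exactly when \(T\) entails \(\neg b\).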
G. E. Moore's ‘A Defence of Common Sense’ has generated the kind of interest and contrariety which often accompany what is new, provocative, and even important in philosophy. Moore himself reportedly agreed with Wittgenstein's estimate that this was his best article, while C. D. Broad has lamented its very great but largely unfortunate influence. Although the essay inspired Wittgenstein to explore the basis of Moore's claim to know many propositions of common sense to be true, A. J. Ayer judges its enduring value to lie in provoking a more sophisticated conception of the very type of metaphysics which disputes any such unqualified claim of certainty.
In 1929, doubtless to the discomfort of his logical positivist host Moritz Schlick, Wittgenstein remarked, ‘To be sure, I can understand what Heidegger means by Being and Angst’. I return to what Heidegger meant and Wittgenstein could understand later. I begin with that remark because it has had an instructive career. When the passage which it prefaced was first published in 1965, the editors left it out—presumably to protect a hero of ‘analytic’ philosophy from being compromised by an expression of sympathy for the arch-fiend of ‘continental’ philosophy. It was as if a diary of Churchill's had been discovered containing admiring references to Hitler. This was the period, after all, when Heidegger was, as Michael Dummett recalls, a ‘joke’ among Oxford philosophers, the paradigm of the sort of metaphysical nonsense Wittgenstein had dedicated himself to exposing.
If one is an egalitarian, what should one want to equalize? Opportunities or outcomes? Resources or welfare? These positions are usually conceived to be very different. I argue in this paper that the distinction is misconceived: the only coherent conception of resource equality implies welfare equality, in an appropriately abstract description of the problem. In this section, I motivate the program which the rest of the paper carries out.
An important contribution to the foundations of probability theory, statistics and statistical physics has been made by E. T. Jaynes. The recent publication of his collected works provides an appropriate opportunity to attempt an assessment of this contribution.
In this philosophy classic, which was first published in 1951, E. R. Dodds takes on the traditional view of Greek culture as a triumph of rationalism. Using the analytical tools of modern anthropology and psychology, Dodds asks, "Why should we attribute to the ancient Greeks an immunity from 'primitive' modes of thought which we do not find in any society open to our direct observation?" Praised by reviewers as "an event in modern Greek scholarship" and "a book which it would be difficult to over-praise," The Greeks and the Irrational was Volume 25 of the Sather Classical Lectures series.
The Anglican Thirty-Nine Articles join catholic Christendom in affirming that: There is but one living and true God…and in unity of this Godhead there be three Persons of one substance, power, and eternity; the Father, the Son, and the Holy Ghost.
How could the self be a substance? There are various ways in which it could be, some familiar from the history of philosophy. I shall be rejecting these more familiar substantivalist approaches, but also the non-substantival theories traditionally opposed to them. I believe that the self is indeed a substance—in fact, that it is a simple or noncomposite substance—and, perhaps more remarkably still, that selves are, in a sense, self-creating substances. Of course, if one thinks of the notion of substance as an outmoded relic of prescientific metaphysics—as the notion of some kind of basic and perhaps ineffable stuff—then the suggestion that the self is a substance may appear derisory. Even what we ordinarily call ‘stuffs’—gold and water and butter and the like—are, it seems, more properly conceived of as aggregates of molecules or atoms, while the latter are not appropriately to be thought of as being ‘made’ of any kind of ‘stuff’ at all. But this only goes to show that we need to think in terms of a more sophisticated notion of substance—one which may ultimately be traced back to Aristotle's conception of a ‘primary substance’ in the Categories, and whose heir in modern times is W. E. Johnson's notion of the ‘continuant’. It is the notion, that is, of a concrete individual capable of persisting identically through qualitative change, a subject of alterable predicates that is not itself predicable of any further subject.
‘Everyone agrees that the moral features of things supervene on their natural features’ (Smith, 22). Everyone is wrong, or so I will argue. In the first section, I explain the version of moral supervenience that Smith and others argue everyone should accept. In the second section, I argue that the mere conceptual possibility of a divine command theory of morality (DCT) is sufficient to refute the version of moral supervenience under consideration. Lastly, I consider and respond to two objections, showing, among other things, that while DCT is sufficient to refute this version of moral supervenience, it is not necessary.
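For orientation, the supervenience thesis Smith has in mind is standardly put roughly as follows (a schematic rendering of my own, not a quotation from Smith or from the paper):

\[ \Box\,\forall x\,\forall y\,\bigl(x \approx_{\mathrm{nat}} y \;\rightarrow\; x \approx_{\mathrm{mor}} y\bigr) \]

where \(x \approx_{\mathrm{nat}} y\) abbreviates ‘x and y are exactly alike in all natural respects’ and \(x \approx_{\mathrm{mor}} y\) ‘x and y are exactly alike in all moral respects’.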
Strong weak truth table (sw) reducibility was suggested by Downey, Hirschfeldt, and LaForte as a measure of relative randomness alternative to Solovay reducibility. It also occurs naturally in proofs in classical computability theory as well as in the recent work of Soare, Nabutovsky, and Weinberger on applications of computability to differential geometry. We study the sw-degrees of c.e. reals and construct a c.e. real which has no random c.e. real (i.e., Ω number) sw-above it.
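For readers outside computability theory, the definition of sw-reducibility introduced by Downey, Hirschfeldt, and LaForte (not restated in this abstract) is roughly the following, with \(\Phi\) ranging over oracle Turing functionals:

\[ A \le_{sw} B \;\Longleftrightarrow\; \exists \Phi\,\exists c \in \mathbb{N}\,\bigl(\Phi^{B} = A \;\wedge\; \forall n\ \mathrm{use}_{\Phi^{B}}(n) \le n + c\bigr) \]

that is, \(A\) is computable from oracle \(B\) with the use on argument \(n\) bounded by \(n\) plus a fixed constant; this strengthens weak truth table reducibility, which requires only a computable bound on the use.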
In an incisive critique of Professor Hick's Evil and the God of Love, Professor Puccetti claims to ‘carry the campaign as well as the battle’—i.e. to show that, with respect to evil, theists ‘are either “explaining it away” or saying it cannot be explained at all. And in both cases they are in effect admitting they have no rational defence to offer. Which means that despite appearances they really are abandoning the battlefield.’
Galen’s Commentaries on the Hippocratic Epidemics constitute one of the most detailed studies of Hippocratic medicine from Antiquity. The Arabic translation of the Commentaries by Ḥunayn ibn Isḥāq is of crucial importance because it preserves large sections now lost in Greek, and because it helped to establish an Arabic clinical literature. The present contribution investigates the translation of this seminal work into Syriac and Arabic. It provides a first survey of the manuscript tradition, and explores how physicians in the medieval Muslim world drew on it both to teach medicine to students and to develop a framework for their own clinical research.