This paper concerns the attempt to formulate an empirical version of the problem of evil, and the attempt to counter this version by what is known as ‘sceptical theism’. My concern is to assess what is actually achieved in these attempts. To this end I consider the debate between them against the backdrop of William Rowe's distinction between expanded standard theism and restricted standard theism (which I label E and R respectively). My claim is that the empirical version significantly fails to challenge E in the way that a workable logical version would; and that sceptical theism significantly fails to defend R in the way that a workable theodicy would. My conclusion is that sceptical theism and the empirical argument play a significantly more limited role in the debate over evil than the arguments they are supposed to replace.
This paper continues a strain of intellectual complaint against the presumptions of certain kinds of formal semantics (the qualification is important) and their bad effects on those areas of artificial intelligence concerned with machine understanding of human language. After some discussion of the use of the term epistemology in artificial intelligence, the paper takes as a case study the various positions held by McDermott on these issues and concludes, reluctantly, that, although he has reversed himself on the issue, there was no time at which he was right.
Arguments strong enough to justify skeptical theism will be strong enough to justify the position that every claim about God is empirically unfalsifiable. This fact is problematic because that position licenses further arguments which are clearly unreasonable, but which the skeptical theist cannot consistently accept as such. Avoiding this result while still achieving the theoretical objectives looked for in skeptical theism appears to demand an impossibly nuanced position.
I discuss two questions: (1) would Duhem have accepted the thesis of the continuity of scientific methodology? and (2) to what extent is the Oxford tradition of classification/subalternation of sciences continuous with early modern science? I argue that Duhem would have been surprised by the claim that scientific methodology is continuous; he expected at best only a continuity of physical theories, which he was trying to isolate from the perpetual fluctuations of methods and metaphysics. I also argue that the evidence does not support the conclusion that early modern doctrines about mathematics and physics are continuous with the subalternation of sciences from Grosseteste, Bacon, and the theologians of fourteenth-century Oxford. The official and dominant context for early modern scientific methodology seems to have been progressive Thomism, and early modern thinkers seem to have pitted themselves against it.
When John von Neumann turned his interest to computers, he was one of the leading mathematicians of his time. In the 1940s, he helped design two of the first stored-program digital electronic computers. He authored reports explaining the functional organization of modern computers for the first time, thereby influencing their construction worldwide (von Neumann, 1945; Burks et al., 1946). In the first of these reports, von Neumann described the computer as analogous to a brain, with an input “organ” (analogous to sensory neurons), a memory, an arithmetical and a logical “organ” (analogous to associative neurons), and an output “organ” (analogous to motor neurons). His experience with computers convinced him that brains and computers, both having to do with the processing of information, should be studied by a new discipline: automata theory. In fact, according to von Neumann, automata theory would cover not only computers and brains, but also any biological or artificial systems that dealt with information and control, including robots and genes. Von Neumann never formulated a full-blown mathematical theory of automata, but he wrote several important exploratory papers (von Neumann, 1951, 1956, 1966). Meanwhile, besides designing hardware, he developed some of the first programs, programming languages, programming techniques, and numerical methods for solving mathematical problems using computers. (Much of his work on computing is reprinted in Aspray and Burks, 1987.) Shortly before his death, in 1956, he wrote an informal synthesis of his views about brains. Though von Neumann left his manuscript sketchy and unfinished, Yale University Press published it as The Computer and the Brain in 1958. The 2000 reprint of this small but informative book is an opportunity to learn, or be reminded of, von Neumann’s thoughts on the computational organization of the mind-brain. Von Neumann began by explaining computers, which for him were essentially number crunchers: to compute was “to operate on ..
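As a gloss on the organ analogy described above, the following is a minimal Python sketch of a toy stored-program machine with the four functional units the report names: an input organ, a shared memory for program and data, an arithmetical/logical organ, and an output organ. The tiny instruction set is invented for illustration and should not be read as von Neumann's actual design.

```python
# Toy stored-program machine illustrating the four "organs" of the
# report's analogy.  The instruction set (IN/ADD/STORE/OUT/HALT) is
# hypothetical, chosen only to keep the example short.

def run(program, inputs):
    memory = list(program)   # program and data share one memory
    acc = 0                  # accumulator: the arithmetical organ's register
    pc = 0                   # program counter
    outputs = []
    inputs = iter(inputs)
    while True:
        op, arg = memory[pc]
        pc += 1
        if op == "IN":          # input organ: read one value
            acc = next(inputs)
        elif op == "ADD":       # arithmetical organ: add a memory cell
            acc += memory[arg][1]
        elif op == "STORE":     # write the accumulator back to memory
            memory[arg] = ("DATA", acc)
        elif op == "OUT":       # output organ: emit the accumulator
            outputs.append(acc)
        elif op == "HALT":
            return outputs

# Read a number, add the constant stored in cell 5, and output the sum.
prog = [("IN", None), ("ADD", 5), ("OUT", None), ("HALT", None),
        ("DATA", 0), ("DATA", 42)]
print(run(prog, [8]))  # -> [50]
```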
Abelard maintains that individual words in a sentence represent distinct semantic units of its overall meaning. He employs two strategies to defend this position in the face of troublesome counterexamples. One strategy—the earlier of the two—sacrifices normal intuitions about what a word is, often labeling what seem to be words as non-signifying syllables. The later strategy invokes a rather fluid conception of what the signification of a word is, allowing this signification considerable latitude to alter under the contextual influence of other words. This evolution of strategy is linked to a new willingness on Abelard’s part to adopt the principle of charity in interpreting sentences; this approach presumes the truth of the statement, and tries to find an interpretation which bears that presumption out. This new willingness to adopt the principle is in turn linked to Abelard’s developing vocation as an interpreter of biblical texts.
The internal friction of cold-worked polycrystalline copper has been measured in the temperature range 20° to 300°K, and a study made of the maximum in the curve of internal friction versus temperature first observed by Bordoni. The measurements indicate how the internal friction varies as a function of the amount of cold work, and also the effect of impurities in the metal and of neutron irradiation. The results are discussed with particular reference to recent theoretical treatments by Seeger.
The paper describes a system for dealing with nestings of belief in terms of the mechanism of computational environment. A method is offered for computing the beliefs of A about B (and so on) in terms of the system's existing knowledge structures about A and B separately. A proposal for belief percolation is put forward: percolation being a side effect of the process of computing nested beliefs, but one which could explain the acquisition of unsupported beliefs. It is argued that the mechanism proposed is compatible with a general least-effort hypothesis concerning human mental functioning.
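Since the abstract describes the environment mechanism only informally, here is a minimal Python sketch of the kind of default-ascription rule it suggests: A's view of B starts from the system's own knowledge of B and is then overridden by whatever A is explicitly known to believe about B. The class and function names, and the dictionary representation of beliefs, are assumptions made for illustration, not the paper's actual system.

```python
from dataclasses import dataclass, field

@dataclass
class Environment:
    """Beliefs about one topic, held from one point of view."""
    beliefs: dict = field(default_factory=dict)  # proposition -> truth value

def ascribe(system_about_b: Environment, a_about_b: Environment) -> Environment:
    """Compute A's belief environment about B by default ascription.

    Start from the system's knowledge of B, then let A's explicit beliefs
    about B override it.  Propositions that arrive in the nested
    environment this way, without explicit support, are the candidates
    for 'percolation' back into the system's model of A.
    """
    nested = dict(system_about_b.beliefs)  # default: A believes what we believe
    nested.update(a_about_b.beliefs)       # ...except where A is known to differ
    return Environment(nested)

# The system believes B is a doctor; A is known to believe B is rich.
system_view = Environment({"is_doctor(B)": True})
a_view = Environment({"is_rich(B)": True})
print(ascribe(system_view, a_view).beliefs)
# {'is_doctor(B)': True, 'is_rich(B)': True}
```

On a least-effort reading, the system copies its own beliefs by default and spends effort only on recorded differences, which is one way such a mechanism could be compatible with the hypothesis mentioned above.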
The cleavage surfaces of 75 type I and 75 type II diamonds have been examined using microscopy and multiple-beam interferometry. The work confirms an earlier qualitative observation that the surface of a type II diamond exhibits a more regular cleavage pattern. There are, in general, more cleavage lines on type I diamonds, and also a greater number of the so-called river systems. The results suggest that impurities within the lattice lead to a more broken substructure in the type I diamonds. It is also shown that birefringence and counting properties are not appreciably reflected in the cleavage patterns, and that very few type II diamonds show evidence of a laminated structure.
The paper contrasts three approaches to the extension of lexical sense: the first we shall call lexical tuning; the second is based on lexical closeness and relaxation; and the third is known as underspecification, or the use of lexical rules. These approaches have quite different origins in artificial intelligence (AI) and linguistics, and involve corpus input, lexicons and knowledge bases in quite different ways. Moreover, the types of sense extension they claim to deal with in their principal examples are different as well. The purpose of these contrasts in the paper is to establish the possibility of evaluating their differing claims by means of the current markup and test paradigm that has been so successful recently in the closely related task of word sense discrimination (WSD). The key question in the paper is how sense extension relates to WSD, and its conclusion is that, at the moment, not all types of sense extension heuristic can be evaluated within the current paradigm requiring markup and test.
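For readers unfamiliar with the markup-and-test paradigm the abstract invokes, a minimal Python sketch follows: hand-marked senses form a gold standard, and a system's sense assignments are scored against it. The data and the function name are hypothetical, for illustration only.

```python
def wsd_accuracy(gold: list, predicted: list) -> float:
    """Fraction of tokens whose predicted sense matches the hand markup."""
    assert len(gold) == len(predicted)
    correct = sum(g == p for g, p in zip(gold, predicted))
    return correct / len(gold)

# Three occurrences of "bank", hand-marked, versus a system's guesses.
gold = ["bank/finance", "bank/river", "bank/finance"]
pred = ["bank/finance", "bank/finance", "bank/finance"]
print(f"accuracy = {wsd_accuracy(gold, pred):.2f}")  # accuracy = 0.67
```

The conclusion above can be restated in these terms: this scoring presupposes that the relevant senses can be marked up in advance, which is plausibly where some sense-extension heuristics fall outside the paradigm.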
This article seeks to present for the first time a more systematic account of Edith Stein’s views on death and dying. First, I will argue that death does not necessarily lead us to an understanding of our earthly existence as aevum, that is, an experience of time between eternity and finite temporality. We always bear the mark of our finitude, including our finite temporality, even when we exist within the eternal mind of God. To claim otherwise is to make our eternity identical with God’s eternity, thereby undermining the traditional Scholastic argument, which Stein holds, that there is no real relation between the being (and, therefore, (a)temporality) of God and the being of human persons. Second, I will argue that Stein excludes the category of potentiality from her discussion of death as a relation between the fullness or actuality of being and nothingness. In fact, death is more a relation between possibility/potentiality and nothingness than a relation between actual fullness and nothingness. What Stein describes as fullness ought to be read as potential.
The goal of this article is to analyze the way in which Edith Stein describes the human subject throughout her research, including her phenomenological phase and the period of her Christian philosophy. In order to do this, I trace essential moments in Husserl’s philosophy, showing both Stein’s reliance upon Husserl and her originality. Both thinkers believe that an analysis of the human being can be carried out by examining consciousness and its lived experiences. Through such an examination Stein arrives at the same conclusion as Husserl, namely, that the human subject is formed of body, psyche, and spirit (Geist). Stein’s originality consists in a further development of the complexity of the human being. She maps this out, providing detailed analyses of the I, the soul, the spirit, and, ultimately, the person. She makes use of medieval philosophical anthropology, including that of Thomas Aquinas and Augustine of Hippo.
Edith Stein is honored today not only because of her sainthood but because of what is now seen as important and groundbreaking work in phenomenology done under especially arduous conditions. Thus it may be said with some accuracy that Stein is, among philosophers, in the comparatively rare category of being acknowledged both for her work and her exemplary life. Writing on Stein has standardly proceeded with an emphasis on the biographical factors that caused her to live and write as she did. One often reads that Stein was reared in a strongly Judaic tradition—her family was more observant, for example, than the family of Simone Weil—but that experiences she had as a young woman caused her to turn in the ..
I examine Edith Stein’s argument for the existence of God found in Finite and Eternal Being. Although largely Thomistic in its structure, the proof is unique in its details, starting with the life of the ego (Ichleben) and ascending to the being of God. The ego is shown to be contingent in its being as well as in the meaning-content through which it lives. Stein argues that this dependent being cannot be accounted for without a being that does not need to receive its being, namely, God. She then turns to the felt security of being as a counter to Heideggerian Angst as a revelatory mood, arguing that security puts us into contact with divine being. She concludes by admitting that proofs rarely convince because of the infinite distance between creature and creator, but concedes to them a role, nonetheless, in shrinking the distance between belief and unbelief.
St. Thomas Aquinas has been considered a kairos in intellectual history for seeing God’s essence as being. Martin Heidegger criticized philosophers for representing being as a be-ing and identifying it with God, and Jean-Luc Marion speaks of “God without being.” In her Potency and Act Edith Stein introduced the category of being without essence, but such being is not God but “the opposite.” For St. Augustine sin was an approach to nonbeing, and Stein saw it leading to a “displacement into nonbeing,” to an “annihilation” where only a “null being” is retained. This eschatological reflection is an intriguing aspect of her “fusion” of scholasticism and phenomenology.