I will argue that 'Continental Philosophy' is an Anglo-American invention. It is 'Pseudo-Continentalism,' no more than a highly selective rendering of Western European Philosophy. Born of opposition to the dominance of analytical philosophy in our universities, Pseudo-Continentalism in fact converges with analysis in remarkable ways. Both are advertised as revolutions in thought and both stand over against the tradition of speculative philosophy: both repeat each other's historical shibboleths about traditional speculative philosophy in respect of the completeness of reason and of reality, the priority of identity and totality, and the predetermined fixity of teleology. What this amounts to is a common rejection of a chimera, which in Pseudo-Continental Philosophy is usually called onto-theology or the metaphysics of presence and in the analytic tradition is sometimes called speculative philosophy. Here, indeed, the analytic tradition is more radical: as I will show, it characteristically rejects any notion of a special kind of activity of actualisation as a feature of the real, whether this is understood as Being, mind, will, the élan vital, Difference, or the impotential. These are the vestiges of the tradition of speculative philosophy that are retained under the rubric of Continental Philosophy.
In Process and Reality (1929) and subsequent writings, A.N. Whitehead builds on the success of the Frege-Russell generalization of the mathematical function and develops his philosophy on that basis. He holds that the proper generalization of the meaning of the function shows that it is primarily to be defined in terms of many-to-one mapping activity, which he terms 'creativity'. This allows him to generalize the range of the function, so that it constitutes a universal ontology of construction or 'process'. He analyzes the concept of God in terms of functional mapping to structure, and he defines finite entities as iterative 'occasions' of mapping activity. He thus challenges the widespread logical-analytical view that the connectives and variables of a function in its different instantiations are merely numerically different, and he develops a fallibilist theory of activity as essentially serial in nature.
Bradley contends that the semiology of Charles Sanders Peirce, the founder of pragmatism, is a standing challenge as much to Gadamerian hermeneutics as to Saussure’s structuralism and its deconstructionist progeny. For Peirce, physical matter itself is one specific mode of the activity of semiosis or sign interpretation. The paper outlines the central point and purpose of Peirce’s general metaphysics and describes the basic features of his theory of signs.
In his book Radical Evolution: The Promise and Peril of Enhancing Our Minds, Our Bodies—and What It Means to Be Human (2005), author-journalist Joel Garreau identifies four technologies whose synergistic activity may transform humankind into a state transcending present human nature: genetic, robotic, information, and nano (GRIN) technologies. If the GRIN technologies follow Moore's Law, as information technology has done for the past four decades, Homo sapiens and human society may be unimaginably different before the middle of this century. But among scientists, futurists, and other pundits there is no agreement on the nature and ramifications of this transformation. Based on dozens of interviews, Garreau sees three possible scenarios for our species. The Heaven Scenario foresees enhanced bodies and minds in a disease-free world, perhaps even immortality; the Hell Scenario warns of losing our identity as a biological entity and perhaps the demise of liberal democracy; the Prevail Scenario predicts that we will muddle through the GRIN technology revolution basically intact, as we have prevailed during past technological upheavals. In this review, these scenarios are examined in the context of Kuhn's “normal” versus “extraordinary” science and in the context of current understanding about gene function.
This is an interesting, sophisticated collection of philosophical essays on the hiddenness of God, in the specific sense that God has not made his existence sufficiently clear. The question addressed is whether or not such hiddenness is compatible with the existence of a creator God. The incompatibility thesis is argued by J. L. Schellenberg in his well-known work, Divine Hiddenness and Human Reason. He claims there that if there were a perfectly loving creator God, such a God would ensure that there are no inculpable nonbelievers. As there are, no such God exists. In the present collection, Schellenberg contributes “What the Hiddenness of God Reveals: A Collaborative Discussion,” which is a complex, subtle reconsideration of the issues in dialogue form that introduces a notion of the “religious” that is wider and looser than that connected with theism. By contrast, all the other contributors defend some form of the compatibility thesis against Schellenberg. The basic outlines of the debate are laid out in the excellent Introduction by the editors, and the ensuing essays consider a number of fundamental issues.
An exploration of the moral and ethical implications of new biotechnologies. Many of the ethical issues raised by new technologies have not been widely examined, discussed, or indeed settled. For example, robotics technology challenges the notion of personhood. Should a robot, capable of making what humans would call ethical decisions, be held responsible for those decisions and the resultant actions? Should society reward and punish robots in the same way that it does humans? Likewise, issues of safety, environmental concerns, and distributive justice arise with the increasing acceptance of genetically modified organisms in food production, nanotechnology in engineering and medicine, and human gene therapy and enhancement. The problem of dual-use—when a technology can be used both to benefit and to harm—exists with virtually all new technologies but is central in the context of emerging 21st century technologies ranging from artificial intelligence and robotics to human gene-editing and brain-computer interfacing. In Re-Creating Nature: Science, Technology, and Human Values in the Twenty-First Century, James T. Bradley addresses emerging biotechnologies with prodigious potential to benefit humankind but that are also fraught with ethical consequences. Some actually possess the power to directly alter the evolution of life on earth, including human life. Specifically, these topics include stem cells, synthetic biology, GMOs in agriculture, nanotechnology, bioterrorism, CRISPR gene-editing technology, three-parent babies, robotics and roboethics, artificial intelligence, and human brain research and neurotechnologies. Offering clear explanations of these various technologies, a pragmatic presentation of the conundrums involved, and questions that illuminate hypothetical situations, Bradley guides discussions of these and other thorny issues resulting from the development of new biotechnologies.
He also highlights the responsibilities of scientists to conduct research in an ethical manner and the responsibilities of nonscientists to become “science literate” in the twenty-first century.
This work has been selected by scholars as being culturally important, and is part of the knowledge base of civilization as we know it. This work was reproduced from the original artifact, and remains as true to the original work as possible. Therefore, you will see the original copyright references, library stamps (as most of these works have been housed in our most important libraries around the world), and other notations in the work. This work is in the public domain in the United States of America, and possibly other nations. Within the United States, you may freely copy and distribute this work, as no entity (individual or corporate) has a copyright on the body of the work. As a reproduction of a historical artifact, this work may contain missing or blurred pages, poor pictures, errant marks, etc. Scholars believe, and we concur, that this work is important enough to be preserved, reproduced, and made generally available to the public. We appreciate your support of the preservation process, and thank you for being an important part of keeping this knowledge alive and relevant.
The discipline of mathematics has not been spared the sweeping critique of postmodernism. Is mathematical theory true for all time, or are mathematical constructs in fact fallible? This fascinating book examines the tensions that have arisen between modern and postmodern views of mathematics, explores alternative theories of mathematical truth, explains why the issues are important, and shows how a Christian perspective makes a difference. Contributors: W. James Bradley, William Dembski, Russell W. Howell, Calvin Jongsma, David Klanderman, Christopher Menzel, Glen VanBrummelen, Scott VanderStoep, Michael Veatch, and Paul Zwier.
According to Charles Taylor, practical reasoning helps us overcome cultural conflicts of value when we are able to show that the passage from one value to another represents an epistemic gain. This paper argues that practical reasoning can be effective in pathological cases of cultural convergence but only if it is understood as a species of the creative social imagination.
In mainstream Anglo-American philosophy, the relation between cognition and community has been defined primarily in terms of the generalization of the mathematical function, especially as a model for the nature of rules, which thus come to be understood as algorithms. This leads to the elimination of both the reflexive, synthesizing subject, and the intrinsic communal-historical nature of argumentation and belief-formation. Against this approach, I follow R.G. Collingwood’s hitherto unrecognized strategy in his Essay on Metaphysics and argue that the relation of cognition and community is better understood by way of the ancient and forgotten model of creedal rules of faith or trust. These will be shown to have the logical form of first person performative rules of faith or trust that generate third person declaratives or propositions, and so constitute the possibility conditions for an argumentational logic of question and answer. They restore the synthetic subject, for they are not algorithms but reflexive and interpretive formulae; they are communally constituted and so historically saturated; and they reinstate an ontological theory of truth as disclosure, with coherence and comprehensiveness as its criteria. In these respects, as Collingwood saw, the creedal model provides a fresh interpretation of the historicality of argumentation and redefines the relation of cognition and community in terms of the interdependence of faith and reason.