Theories of moral and, more generally, practical reasoning sometimes draw on the notion of coherence. Admirably, Paul Thagard has attempted to give a computationally detailed account of the kind of coherence involved in practical reasoning, claiming that it will help overcome problems in foundationalist approaches to ethics. The arguments herein rebut the alleged role of coherence in practical reasoning endorsed by Thagard. While there are some general lessons to be learned from these arguments, no attempt is made to argue against all forms of coherence in all contexts. Nor is the usefulness of computational modelling called into question. The point will be that coherence cannot be as useful in understanding moral reasoning as coherentists may think. This result has clear implications for the future of Machine Ethics, a newly emerging subfield of AI.
‘Particularism’ and ‘generalism’ refer to families of positions in the philosophy of moral reasoning, with the former playing down the importance of principles, rules or standards, and the latter stressing their importance. Part of the debate has taken an empirical turn, and this turn has implications for AI research and the philosophy of cognitive modeling. In this paper, Jonathan Dancy’s approach to particularism (arguably one of the best known and most radical approaches) is questioned both on logical and empirical grounds. Doubts are raised over whether Dancy’s brand of particularism can adequately explain the graded nature of similarity assessments in analogical arguments. Also, simple recurrent neural network models of moral case classification are presented and discussed. This is done to raise concerns about Dancy’s suggestion that neural networks can help us to understand how we could classify situations in a way that is compatible with his particularism. Throughout, the idea of a surveyable standard—one with restricted length and complexity—plays a key role. Analogical arguments are taken to involve multidimensional similarity assessments, and surveyable contributory standards are taken to be attempts to articulate the dimensions of similarity that may exist between cases. This work will be of relevance both to those who have interests in computationally modeling human moral cognition and to those who are interested in how such models may or may not improve our philosophical understanding of such cognition.
In "Representations without Rules, Connectionism and the Syntactic Argument", Kenneth Aizawa argues against the view that connectionist nets can be understood as processing representations without the use of representation-level rules, and he provides a positive characterization of how to interpret connectionist nets as following representation-level rules. He takes Terry Horgan and John Tienson to be the targets of his critique. The present paper marshals functional and methodological considerations, gleaned from the practice of cognitive modelling, to argue against Aizawa's characterization of how connectionist nets may be understood as making use of representation-level rules.
David Bohm's interpretation of quantum mechanics yields a quantum potential, Q. In his early work, the effects of Q are understood in causal terms as acting through a real (quantum) field which pushes particles around. In his later work (with Basil Hiley), the causal understanding of Q appears to have been abandoned. The purpose of this paper is to understand how the use of certain metaphors leads Bohm away from a causal treatment of Q, and to evaluate the use of those metaphors.
Terence Horgan and John Tienson claim that folk psychological laws are different in kind from basic physical laws in at least two ways: first, physical laws do not possess the kind of ceteris paribus qualifications possessed by folk psychological laws, which means the two types of laws have different logical forms; and second, applied physical laws are best thought of as being about an idealized world and folk psychological laws about the actual world. I argue that Horgan and Tienson have not made a persuasive case for either of the preceding views.
This paper identifies a type of multi-source (case-based) reasoning and differentiates it from other types of analogical reasoning. Work in cognitive science on mental space mapping or conceptual blending is used to better understand this type of reasoning. The type of argument featured herein will be shown to be a kind of source-blended argument. While it possesses some similarities to traditionally conceived analogical arguments, there are important differences as well. The triple contract (a key development in the usury debates of the fifteenth and sixteenth centuries) will be shown to make use of source-blended arguments.
This paper responds to criticisms levelled by Fodor, Pylyshyn, and McLaughlin against connectionism. Specifically, I will rebut the charge that connectionists cannot account for representational systematicity without implementing a classical architecture. This will be accomplished by drawing on Paul Smolensky's Tensor Product model of representation and on his insights about split-level architectures.
My paper proceeds in three stages: 1) the traditional relationship between philosophy and theology; 2) how the “foundationalist” issue affects this debate; 3) some final reflections. This essay, along with the previous one by Jack Bonsor, was originally presented to the “Theology in the Seminary Context” seminar at the Catholic Theological Society of America convention in June 1995.
The article discusses current philosophical issues in foundationalism and anti-foundationalism as well as their ramifications for theological epistemology. The strengths and weaknesses of the anti-foundationalist theological current are also assessed.
This paper modifies Nicola Guarino and Christopher Welty's conception of the subsumption relation. Guarino and Welty require that whether one property may subsume another should depend on the modal metaproperties of those properties. I argue that the part of their account that concerns the metaproperty of carrying a criterion of identity is essentially flawed. Subsequently, I propose to constrain the subsumption relation not, as Guarino and Welty require, by means of incompatible criteria of absolute identity, but by means of incompatible criteria of relative identity. After discussing the benefits of applying relative identity in ontological investigations, I provide a formal framework in which to prove a counterpart of the identity-criteria constraint.
This essay argues for postmodern, non-metaphysical, non-foundational perspectives within Roman Catholic theological discourse. It was originally presented, along with the following article by Thomas Guarino, to the “Theology in the Seminary Context” seminar at the Catholic Theological Society of America convention in June, 1995.