In "Representations without Rules, Connectionism and the Syntactic Argument", Kenneth Aizawa argues against the view that connectionist nets can be understood as processing representations without the use of representation-level rules, and he provides a positive characterization of how to interpret connectionist nets as following representation-level rules. He takes Terry Horgan and John Tienson to be the targets of his critique. The present paper marshals functional and methodological considerations, gleaned from the practice of cognitive modelling, to argue against Aizawa's characterization of how connectionist nets may be understood as making use of representation-level rules.
Theories of moral and, more generally, practical reasoning sometimes draw on the notion of coherence. Admirably, Paul Thagard has attempted to give a computationally detailed account of the kind of coherence involved in practical reasoning, claiming that it will help overcome problems in foundationalist approaches to ethics. The arguments herein rebut the alleged role of coherence in practical reasoning endorsed by Thagard. While there are some general lessons to be learned from the preceding, no attempt is made to argue against all forms of coherence in all contexts. Nor is the usefulness of computational modelling called into question. The point will be that coherence cannot be as useful in understanding moral reasoning as coherentists may think. This result has clear implications for the future of Machine Ethics, a newly emerging subfield of AI.
‘Particularism’ and ‘generalism’ refer to families of positions in the philosophy of moral reasoning, with the former playing down the importance of principles, rules or standards, and the latter stressing their importance. Part of the debate has taken an empirical turn, and this turn has implications for AI research and the philosophy of cognitive modeling. In this paper, Jonathan Dancy’s approach to particularism (arguably one of the best known and most radical approaches) is questioned both on logical and empirical grounds. Doubts are raised over whether Dancy’s brand of particularism can adequately explain the graded nature of similarity assessments in analogical arguments. Also, simple recurrent neural network models of moral case classification are presented and discussed. This is done to raise concerns about Dancy’s suggestion that neural networks can help us to understand how we could classify situations in a way that is compatible with his particularism. Throughout, the idea of a surveyable standard—one with restricted length and complexity—plays a key role. Analogical arguments are taken to involve multidimensional similarity assessments, and surveyable contributory standards are taken to be attempts to articulate the dimensions of similarity that may exist between cases. This work will be of relevance both to those who have interests in computationally modeling human moral cognition and to those who are interested in how such models may or may not improve our philosophical understanding of such cognition.
David Bohm's interpretation of quantum mechanics yields a quantum potential, Q. In his early work, the effects of Q are understood in causal terms as acting through a real (quantum) field which pushes particles around. In his later work (with Basil Hiley), the causal understanding of Q appears to have been abandoned. The purpose of this paper is to understand how the use of certain metaphors leads Bohm away from a causal treatment of Q, and to evaluate the use of those metaphors.
Terence Horgan and John Tienson claim that folk psychological laws are different in kind from basic physical laws in at least two ways: first, physical laws do not possess the kind of ceteris paribus qualifications possessed by folk psychological laws, which means the two types of laws have different logical forms; and second, applied physical laws are best thought of as being about an idealized world and folk psychological laws about the actual world. I argue that Horgan and Tienson have not made a persuasive case for either of the preceding views.
This paper responds to criticisms levelled by Fodor, Pylyshyn, and McLaughlin against connectionism. Specifically, I will rebut the charge that connectionists cannot account for representational systematicity without implementing a classical architecture. This will be accomplished by drawing on Paul Smolensky's Tensor Product model of representation and on his insights about split-level architectures.
This paper identifies a type of multi-source (case-based) reasoning and differentiates it from other types of analogical reasoning. Work in cognitive science on mental space mapping or conceptual blending is used to better understand this type of reasoning. The type of argument featured herein will be shown to be a kind of source-blended argument. While it possesses some similarities to traditionally conceived analogical arguments, there are important differences as well. The triple contract (a key development in the usury debates of the fifteenth and sixteenth centuries) will be shown to make use of source-blended arguments.
My paper proceeds in three stages: 1) the traditional relationship between philosophy and theology; 2) how the “foundationalist” issue affects this debate; 3) some final reflections. This essay, along with the previous one by Jack Bonsor, was originally presented to the “Theology in the Seminary Context” seminar at the Catholic Theological Society of America convention in June, 1995.
The article discusses current philosophical issues in foundationalism and anti-foundationalism as well as their ramifications for theological epistemology. The strengths and weaknesses of the anti-foundationalist theological current are also assessed.
This paper presents the results of training an artificial neural network (ANN) to classify moral situations. The ANN produces a similarity space in the process of solving its classification problem. The state space is subjected to analysis that suggests that holistic approaches to interpreting its functioning are problematic. The idea of a contributory or pro tanto standard, as discussed in debates between moral particularists and generalists, is used to understand the structure of the similarity space generated by the ANN. A spectrum of possibilities for reasons, from atomistic to holistic, is discussed. Reasons are understood as increasing in nonlocality as they move away from atomism. It is argued that contributory standards could be used to understand forms of nonlocality that need not go all the way to holism. It is also argued that contributory standards may help us to understand the kind of similarity at work in analogical reasoning and argument in ethics. Some objections to using state space approaches to similarity are dealt with, as are objections to using empirical and computational work in philosophy.
Bruce Waller has defended a deductive reconstruction of the kinds of analogical arguments found in ethics, law, and metaphysics. This paper demonstrates the limits of such a reconstruction and argues for an alternative, non-deductive reconstruction. It will be shown that some analogical arguments do not fit Waller's deductive schema, and that such a schema does not allow for an adequate account of the strengths and weaknesses of an analogical argument. The similarities and differences between the account defended herein and Trudy Govier's account are discussed as well.
Work on analogy has been done from a number of disciplinary perspectives throughout the history of Western thought. This work is a multidisciplinary guide to theorizing about analogy. It contains 1,406 references, primarily to journal articles and monographs, and primarily to English language material. Classical through to contemporary sources are included. The work is classified into eight different sections (with a number of subsections). A brief introduction to each section is provided. Keywords and key expressions of importance to research on analogy are discussed in the introductory material. Electronic resources for conducting research on analogy are listed as well.
The goal of the Ontology Summit 2010 was to address the current shortage of persons with ontology expertise by developing a strategy for the education of ontologists. To achieve this goal we studied how ontologists are currently trained, the requirements identified by organizations that hire ontologists, and developments that might impact the training of ontologists in the future. We developed recommendations for the body of knowledge that should be taught and the skills that should be developed by future ontologists; these recommendations are intended as guidelines for institutions and organizations that may consider establishing a program for training ontologists. Further, we recommend a number of specific actions for the community to pursue.
We describe a novel affect-inspired mechanism to improve the performance of computational systems operating in dynamic environments. In particular, we designed a mechanism that is based on aspects of the fear response in humans to dynamically reallocate operating system-level central processing unit (CPU) resources to processes as they are needed to deal with time-critical events. We evaluated this system in the MINIX® and Linux® operating systems and in three different testing environments (two simulated, one live). We found the affect-based system was not only able to react more rapidly to time-critical events as intended, but since the dynamic processes for handling these events did not need to use significant CPU when they were not in time-critical situations, our simulated unmanned aerial vehicle (UAV) was able to perform even non-emergency tasks at a higher level of efficiency and reactivity than was possible in the standard implementation.
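The fear-driven reallocation described in this abstract can be sketched in miniature. The class name, parameter values, and exponential-decay rule below are illustrative assumptions, not the MINIX/Linux implementation the paper evaluates; the sketch shows only the core idea of a "fear" signal that boosts a process's priority during a time-critical event and then subsides:

```python
# Hypothetical sketch (not the authors' implementation): a "fear"-style
# signal saturates when a threat is detected, decays otherwise, and
# determines how much extra CPU priority a process receives.

class AffectScheduler:
    def __init__(self, baseline=10, boost=90, decay=0.5):
        self.baseline = baseline  # normal priority weight
        self.boost = boost        # priority weight at maximum fear
        self.decay = decay        # fraction of fear retained per tick
        self.fear = 0.0           # current fear level in [0, 1]

    def observe(self, threat_detected):
        # A detected threat saturates fear immediately; otherwise
        # fear decays exponentially toward zero.
        self.fear = 1.0 if threat_detected else self.fear * self.decay

    def priority(self):
        # Interpolate between baseline and boosted priority by fear level.
        return self.baseline + (self.boost - self.baseline) * self.fear


sched = AffectScheduler()
sched.observe(threat_detected=True)
during = sched.priority()   # boosted while the event is active
sched.observe(threat_detected=False)
after = sched.priority()    # decaying back toward baseline
assert during > after > sched.baseline
```

Because priority falls back automatically as fear decays, the emergency handler consumes significant CPU only while an event is active, which mirrors the abstract's point that non-emergency tasks retain more resources under the affect-based scheme.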
A presentation of the discovery, in the course of preparatory work for the critical edition of Laurent de Premierfait's translation of Cicero's "De amicitia", of a manuscript previously unknown to scholarship, which contains two translations that are likewise unknown and unpublished. It is the manuscript, most likely the dedication copy, of the translation of the "De amicitia" and of the translation of Guarino Guarini's "De adulatore", a Latin rendering of Plutarch's text. The manuscript, produced for Philippe de Crèvecoeur, lord of Esquerdes, in the 1490s, was acquired in 1985 by the SMAF and on that account deposited at the BnF. A presentation of the text (author, date, localization, translation methods) is followed by editions of the translator's two prologues.
This paper is a modification of Nicola Guarino and Christopher Welty's conception of the subsumption relation. Guarino and Welty require that whether one property may subsume the other should depend on the modal metaproperties of those properties. I argue that the part of their account that concerns the metaproperty carrying a criterion of identity is essentially flawed. Subsequently, I propose to constrain the subsumption relation not, as Guarino and Welty require, by means of incompatible criteria of absolute identity but by means of incompatible criteria of relative identity. After discussing the benefits of applying relative identity in ontological investigations I provide a formal framework in which to prove a counterpart of the identity criteria constraint.
The paper provides a qualified defence of Bruce Waller’s deductivist schema for a priori analogical arguments in ethics and law. One crucial qualification is that the schema represents analogical arguments as complexes composed of one deductive inference (hence “deductivism”) but also of one non-deductive subargument. Another important qualification is that the schema is informed by normative assumptions regarding the conditions that an analogical argument must satisfy in order for it to count as an optimal instance of its kind. Waller’s schema (in qualified form) is defended from criticisms formulated by Trudy Govier, Marcello Guarini and Lilian Bermejo-Luque.
This essay argues for postmodern, non-metaphysical, non-foundational perspectives within Roman Catholic theological discourse. It was originally presented, along with the following article by Thomas Guarino, to the “Theology in the Seminary Context” seminar at the Catholic Theological Society of America convention in June, 1995.