Computer-based argument mapping greatly enhances student critical thinking, more than tripling absolute gains made by other methods. I describe the method and my experience as an outsider. Argument mapping often showed precisely how students were erring (for example: mistaking helping premises for separate reasons), making it much easier for them to fix their errors.
This paper presents an attempt to integrate theories of causal processes—of the kind developed by Wesley Salmon and Phil Dowe—into a theory of causal models using Bayesian networks. We suggest that arcs in causal models must correspond to possible causal processes. Moreover, we suggest that when processes are rendered physically impossible by what occurs on distinct paths, the original model must be restricted by removing the relevant arc. These two techniques suffice to explain cases of late preëmption and other cases that have proved problematic for causal models.
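The proposed restriction can be pictured as a simple operation on a causal graph. A minimal sketch, with a model represented as a parents dict; the variable names, echoing the usual Suzy/Billy late-preemption example, are illustrative and not the paper's notation:

```python
def remove_arc(parents, cause, effect):
    """Restrict a causal model by deleting the arc cause -> effect,
    as when the corresponding causal process is rendered physically
    impossible by what occurs on a distinct path."""
    return {var: [p for p in ps if not (var == effect and p == cause)]
            for var, ps in parents.items()}

# Late preemption: Suzy's rock shatters the bottle first, so the
# process Billy -> Shatter cannot run to completion; restrict the model.
model = {"Shatter": ["Suzy", "Billy"], "Suzy": [], "Billy": []}
restricted = remove_arc(model, "Billy", "Shatter")
```

The original model is left intact; the restricted copy is the one used to evaluate the counterfactual queries.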
We present a minimum message length (MML) framework for trajectory partitioning by point selection, and use it to automatically select the tolerance parameter ε for Douglas-Peucker partitioning, adapting to local trajectory complexity. By examining a range of ε for synthetic and real trajectories, it is easy to see that the best ε does vary by trajectory, and that the MML encoding makes sensible choices and is robust against Gaussian noise. We use it to explore the identification of micro-activities within a longer trajectory. This MML metric is comparable to the TRACLUS metric – and shares the constraint of abstracting only by omission of points – but is a true lossless encoding. Such encoding has several theoretical advantages – particularly with very small segments (high frame rates) – but actual performance interacts strongly with the search algorithm. Both differ from unconstrained piecewise linear approximations, including other MML formulations.
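For context, the Douglas-Peucker step whose tolerance ε the MML criterion is used to select can be sketched as follows (a standard textbook implementation, not the paper's code):

```python
import math

def _perp_dist(p, a, b):
    """Perpendicular distance from point p to the segment a-b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    norm = math.hypot(dx, dy)
    if norm == 0.0:
        return math.hypot(px - ax, py - ay)
    return abs(dy * (px - ax) - dx * (py - ay)) / norm

def douglas_peucker(points, eps):
    """Keep only points that deviate more than eps from the
    chord of their segment, recursing on the worst offender."""
    if len(points) < 3:
        return list(points)
    idx, dmax = 0, 0.0
    for i in range(1, len(points) - 1):
        d = _perp_dist(points[i], points[0], points[-1])
        if d > dmax:
            idx, dmax = i, d
    if dmax <= eps:
        return [points[0], points[-1]]
    left = douglas_peucker(points[: idx + 1], eps)
    right = douglas_peucker(points[idx:], eps)
    return left[:-1] + right
```

A small ε keeps more points; the MML framework's contribution is to score these candidate partitionings by message length rather than requiring ε to be fixed globally by hand.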
We present a probabilistic extension to active path analyses of token causation (Halpern & Pearl 2001, forthcoming; Hitchcock 2001). The extension uses the generalized notion of intervention presented in (Korb et al. 2004): we allow an intervention to set any probability distribution over the intervention variables, not just a single value. The resulting account can handle a wide range of examples. We do not claim the account is complete --- only that it fills an obvious gap in previous active-path approaches. It still succumbs to recent counterexamples by Hiddleston (2005), because it does not explicitly consider causal processes. We claim three benefits: a detailed comparison of three active-path approaches, a probabilistic extension for each, and an algorithmic formulation.
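The generalized intervention can be illustrated on a toy two-node network C → E (a hypothetical sketch; the account itself is framed over full causal models, and the parameter values here are made up):

```python
import random

def sample_net(p_c, p_e_given_c, n=10000, intervene_c=None, seed=0):
    """Estimate P(E=1) in a two-node net C -> E by sampling.
    intervene_c, if given, is a distribution {0: p0, 1: p1} that
    replaces C's own mechanism -- a stochastic intervention
    generalizing the usual point-valued do(C = c)."""
    rng = random.Random(seed)
    count_e = 0
    for _ in range(n):
        p = p_c if intervene_c is None else intervene_c[1]
        c = rng.random() < p
        e = rng.random() < (p_e_given_c[1] if c else p_e_given_c[0])
        count_e += e
    return count_e / n
```

Setting `intervene_c={0: 0.0, 1: 1.0}` recovers the classical point intervention do(C = 1), while `intervene_c={0: 0.5, 1: 0.5}` imposes an arbitrary distribution on the intervened variable.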
James McAllister’s 2003 article, “Algorithmic randomness in empirical data”, claims that empirical data sets are algorithmically random, and hence incompressible. We show that this claim is mistaken. We present theoretical arguments and empirical evidence for compressibility, and discuss the matter in the framework of Minimum Message Length (MML) inference, which shows that the theory which best compresses the data is the one with the highest posterior probability, and the best explanation of the data.
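A toy illustration of the compressibility point (not the paper's experiment; the data are made up): a byte sequence generated by a simple law compresses to a fraction of its raw size under a general-purpose coder, while a patternless sequence of the same length does not.

```python
import random
import zlib

random.seed(0)

# Data generated by a simple "law" (a slowly rising staircase)
# versus patternless data of the same length.
lawlike = bytes((i // 10) % 256 for i in range(1000))
patternless = bytes(random.randrange(256) for _ in range(1000))

def ratio(data):
    """Compressed size over raw size; < 1 means compressible."""
    return len(zlib.compress(data, 9)) / len(data)
```

Here `ratio(lawlike)` comes out well below 1, while `ratio(patternless)` does not, which is the intuition the MML framework makes precise: compressibility tracks the existence of a law behind the data.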
The investigation of probabilistic causality has been plagued by a variety of misconceptions and misunderstandings. One has been the thought that the aim of the probabilistic account of causality is the reduction of causal claims to probabilistic claims. Nancy Cartwright (1979) has clearly rebutted that idea. Another ill-conceived idea continues to haunt the debate, namely the idea that contextual unanimity can do the work of objective homogeneity. It cannot. We argue that only objective homogeneity in combination with a causal interpretation of Bayesian networks can provide the desired criterion of probabilistic causality.
Paper presented to the Twenty-seventh Hume Society Conference, 26 July 2000, Williamsburg, Virginia. At the time I thought there was a stronger link between Maclaurin and Hume, but in discussions at and after the meeting, decided Hume was not taking his mechanics out of Maclaurin’s Account. Although I still have found Maclaurin useful in interpreting Hume -- see Sapadin 1997 for a discussion of popular Newtonianism in Hume's day -- I suspect my draft suffers somewhat from ambivalence. There are still similarities, and possible avenues of influence, arguing that Hume was not ignorant of the new mechanics, but it also becomes clear that he did not understand it: although he adopts the Newtonian measure of force, he misapplies it.
Artificial Intelligence (AI) and Philosophy of Science share a fundamental problem—that of understanding causality. Bayesian network techniques have recently been used by Judea Pearl in a new approach to understanding causality and causal processes (Pearl, 2000). Pearl’s approach has great promise, but needs to be supplemented with an explicit account of causal interaction. Thus far, despite considerable interest, philosophy has provided no useful account of causal interaction. Here we provide one, employing the concepts of Bayesian networks. With it we demonstrate the failure of one of philosophy’s more sophisticated attempts to deal with the concept of causal interaction, that of Ellery Eells’ Probabilistic Causality (1991).
Part of our fascination with the Maya can be attributed to the fact that they were literate . . . that is, the Classic Maya possessed a visible language that consisted of letters and a grammar, and one of the products of their literacy was the book. (Aveni 1992b, p.3).
Artificial Intelligence (AI) and Philosophy of Science share a fundamental problem—understanding causality. Bayesian networks have recently been used by Judea Pearl in a new approach to understanding causality (Pearl, 2000). Part of understanding causality is understanding causal interaction. Bayes nets can represent any degree of causal interaction, and researchers normally try to limit interactions, usually by replacing the full CPT with a noisy-OR function. But we show that noisy-OR and another common model are merely special cases of the general linear systems definition of noninteraction. However, they apply in different situations, and we can measure the degree of causal interaction relative to any such model.
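The noisy-OR replacement of a full conditional probability table (CPT) can be sketched as follows (an illustrative construction; the parameter values are made up):

```python
from itertools import product

def noisy_or_cpt(probs, leak=0.0):
    """Build P(effect = 1 | parent states) under noisy-OR: each
    active cause i independently produces the effect with
    probability probs[i]; `leak` covers unmodelled causes.
    Needs only n parameters instead of 2**n CPT rows."""
    cpt = {}
    for states in product((0, 1), repeat=len(probs)):
        p_fail = 1.0 - leak
        for s, p in zip(states, probs):
            if s:
                p_fail *= 1.0 - p
        cpt[states] = 1.0 - p_fail
    return cpt

# Two causes that succeed with probability 0.8 and 0.5 respectively.
cpt = noisy_or_cpt([0.8, 0.5])
```

Because each cause acts through an independent "failure" channel, the full table is determined by one number per parent, which is exactly the sense in which noisy-OR limits causal interaction.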
Using Bayesian network causal models, we provide a simple general account of probabilistic causal interaction. We also detail problems in the leading accounts by Ellery Eells, and any others which require valence reversals, contextual unanimity, or average effects.
[David Charles] Aristotle, it appears, sometimes identifies well-being (eudaimonia) with one activity (intellectual contemplation), sometimes with several, including ethical virtue. I argue that this appearance is misleading. In the Nicomachean Ethics, intellectual contemplation is the central case of human well-being, but is not identical with it. Ethically virtuous activity is included in human well-being because it is an analogue of intellectual contemplation. This structure allows Aristotle to hold that while ethically virtuous activity is valuable in its own right, the best life available for humans is centred around, but not wholly constituted by, intellectual contemplation. /// [Dominic Scott] In Nicomachean Ethics X 7-8, Aristotle distinguishes two kinds of eudaimonia, primary and secondary. The first corresponds to contemplation, the second to activity in accordance with moral virtue and practical reason. My task in this paper is to elucidate this distinction. Like Charles, I interpret it as one between paradigm and derivative cases; unlike him, I explain it in terms of similarity, not analogy. Furthermore, once the underlying nature of the distinction is understood, we can reconcile the claim that paradigm eudaimonia consists just in contemplation with a passage in the first book requiring eudaimonia to involve all intrinsic goods.
Charles Taylor’s idea of “deep diversity” has played a major role in the debates around multiculturalism in Canada and around the world. Originally, the idea was meant to account for how the different national communities within Canada – those of the English-speaking Canadians, the French-speaking Quebeckers, and the Aboriginals – conceive of their belonging to the country in different ways. But Taylor conceives of these differences strictly in terms of irreducibility; that is, he fails to see that they also exist in such a way that the country cannot be said to form a unified whole. After giving an account of the philosophical as well as religious reasons behind his position, the chapter goes on to describe some of its political implications.
Charles Sanders Peirce, the most important and influential of the classical American philosophers, is credited as the inventor of the philosophical school of pragmatism. The scope and significance of his work have had a lasting effect not only in several fields of philosophy but also in mathematics, the history and philosophy of science, and the theory of signs, as well as in literary and cultural studies. Largely obscure until after his death, Peirce's life has long been a subject of interest and dispute. Unfortunately, previous biographies often confuse as much as they clarify crucial matters in Peirce's story. Ketner's new biographical project is remarkable not only for its entertaining aspects but also for its illuminating insights into Peirce's life, his thought, and the intellectual milieu in which he worked.
The article focuses on the Philosophy of Freedom of the Swiss philosopher Charles Secrétan (1815-1895) and on the attempt to reconcile freedom as the fundamental experience for the human being with the alleged necessitarianism that would result from the positive sciences. The notion of “fall” as it is found in the Christian tradition allows Secrétan to rediscover an original dimension from which we can conceive the laws of nature as contingent. It is space and time that impose their constraints and lead to the mismatch between the different faculties (sensitivity, imagination, intelligence, will) that is constitutive for the human experience and that prevents us from “being at any moment the whole of ourselves”. A peculiarity of Secrétan’s conception of space is that he does not see it as a condition for the numerical plurality of human beings.
In this systematic introduction to the philosophy of Charles S. Peirce, the author focuses on four of Peirce's fundamental conceptions: pragmatism and Peirce's development of it into what he called 'pragmaticism'; his theory of signs; his phenomenology; and his theory that continuity is of prime importance for philosophy. He argues that at the centre of Peirce's philosophical project is a unique form of metaphysical realism, whereby continuity and evolutionary change are both necessary for our understanding of experience. In his final chapter Professor Hausman applies this version of realism to contemporary controversies between anti-realists and anti-idealists. Peirce's views are compared to those of such contemporary figures as Davidson, Putnam, and Rorty. The book will be of particular interest to philosophers concerned with American philosophy and current debates on realism as well as linguists working in semiotics.
In this chapter I discuss Charles Taylor's and Paul Ricoeur's theories of narrative identity and narratives as a central form of self-interpretation. Both Taylor and Ricoeur think that self-identity is a matter of culturally and socially mediated self-definitions, which are practically relevant for one's orientation in life. First, I will go through various characterisations that Ricoeur gives of his theory, and try to show to what extent they also apply to Taylor's theory. Then, I will analyse more closely Charles Taylor's, and in section three, Paul Ricoeur's views on narrative identity.
Charles Taylor is one of the most influential and prolific philosophers in the English-speaking world today. The breadth of his writings is unique, ranging from reflections on artificial intelligence to analyses of contemporary multicultural societies. This thought-provoking introduction to Taylor's work outlines his ideas in a coherent and accessible way without reducing their richness and depth. His contribution to many of the enduring debates within Western philosophy is examined and the arguments of his critics assessed. Taylor's reflections on the topics of moral theory, selfhood, political theory and epistemology form the core chapters within the book. Ruth Abbey engages with the secondary literature on Taylor's work, argues that some criticisms by contemporaries have been based on misinterpretations, and suggests ways in which a better understanding of Taylor's work leads to different criticisms of it. The book serves as an ideal companion to Taylor's ideas for students of philosophy and political theory, and will be welcomed by the non-specialist looking for an authoritative guide to Taylor's large and challenging body of work.
Charles Sanders Peirce was born in September 1839 and died five months before the guns of August 1914. He is perhaps the most important mind the United States has ever produced. He made significant contributions throughout his life as a mathematician, astronomer, chemist, geodesist, surveyor, cartographer, metrologist, engineer, and inventor. He was a psychologist, a philologist, a lexicographer, a historian of science, a lifelong student of medicine, and, above all, a philosopher, whose special fields were logic and semiotics. He is widely credited with being the founder of pragmatism. In terms of his importance as a philosopher and a scientist, he has been compared to Plato and Aristotle. He himself intended "to make a philosophy like that of Aristotle." Peirce was also a tormented and in many ways tragic figure. He suffered throughout his life from various ailments, including a painful facial neuralgia, and had wide swings of mood which frequently left him depressed to the state of inertia, and other times found him explosively violent. Despite his consistent belief that ideas could find meaning only if they "worked" in the world, he himself found it almost impossible to make satisfactory economic and social arrangements for himself. This brilliant scientist, this great philosopher, this astounding polymath was never able, throughout his long life, to find an academic post that would allow him to pursue his major interest, the study of logic, and thus also fulfill his destiny as America's greatest philosopher. Much of his work remained unpublished in his own time, and is only now finding publication in a coherent, chronologically organized edition. Even more astounding is that, despite many monographic studies, there has been no biography until now, almost eighty years after his death. Brent has studied the Peirce papers in detail and enriches his account with numerous quotations from letters by Peirce and by his friends.
This is a fascinating account of a p.
In this paper I argue that moral realism does not, pace Charles Taylor, need “moral sources” or “constitutive goods”, and adding these concepts distorts the basic insights of what can be called “cultural” moral realism. Yet the ideas of “moral topography” or “moral space” as well as the idea of “ontological background pictures” are valid, if separated from those notions. What does Taylor mean by these notions?
In the introduction to his Philosophical Papers 1&2 Charles Taylor assures us that his work, while encompassing a range of issues, follows a single, tightly knit agenda. He claims that the central questions concern "philosophical anthropology". Taylor's work on these questions has been presented piecemeal, in the form of articles and papers, and the student has had to imagine what a systematic monograph by Taylor on philosophical anthropology would look like. Neither Hegel, Sources of the Self, Ethics of Authenticity, Catholic Modernity nor Varieties of Religion Today, nor Taylor's forthcoming books on secularization and modern social imaginaries are such treatises on the ontology of the human being. Nicholas H. Smith's monograph Charles Taylor: Meaning, Morals and Modernity (Polity, 2002) puts forward a clear and well-argued assessment of Taylor's entire project, with details on his intellectual biography and political engagement. For the purposes of thinking through Taylor's work so far, this book is probably the best one around. It is divided into eight chapters: "Linguistic Philosophy and Phenomenology", "Science, Action and the Mind", "The Romantic Legacy", "The Self and the Good", "Interpretation and the Social Sciences", "Individual and Community", "Politics and Social Criticism", and "Modernity, Art and Religion". The chapters are thematically ordered, but the order of presentation follows roughly the temporal order of Taylor's career. In this review article, I will begin with what Smith identifies as Taylor's organizing idea, and then focus on Smith's presentation of Taylor's transcendental argumentation concerning 'human constants'. As exemplars, I will discuss two of the..
The Philosophy Now series promises to combine rigorous analysis with authoritative expositions. Ruth Abbey’s book lives up to this demand by being a clear, reliable and more than up-to-date introduction to Charles Taylor’s philosophy. Although it is an introductory book, the amount of footnotes and references ought to please those who want to study the original texts more closely. Abbey’s book is structured thematically: morality, selfhood, politics and epistemology get 50 pages each. The focus is on the internal coherence of Taylor’s work, not on its critique of or defence against other positions. The chapters are self-contained, but together they give a good total picture of Taylor’s position. The concluding chapter is a highly interesting preview of Taylor’s unpublished work-in-progress on secularity, which according to Abbey is comparable in magnitude to Sources of the Self.
This article investigates the history of the relation between idealism and pragmatism by examining the importance of the French idealist Charles Renouvier for the development of William James's ‘Will to Believe’. By focusing on French idealism, we obtain a broader understanding of the kinds of idealism on offer in the nineteenth century. First, I show that Renouvier's unique methodological idealism led to distinctively pragmatist doctrines and that his theory of certitude and its connection to freedom is worthy of reconsideration. Second, I argue that the technical vocabulary and main structure of the argument from the ‘Will to Believe’ depend upon Renouvier's idealist theory of knowledge and psychology of belief, and that taking account of this line of influence is of crucial importance for establishing the correct interpretation of James's work.
In this essay I explore the potential contribution of Peirce's theory of scientific inquiry to moral philosophy. After a brief introduction, I outline Peirce's theory of inquiry. Next, I address why Peirce believed that this theory of inquiry is inapplicable to what he called "matters of vital importance," the latter including genuine moral problems. This leaves us in the end with two options: We can try to develop an alternative way of addressing moral problems or we can seek to reconcile moral problems with scientific inquiry as described by Peirce. Though Peirce seems to argue for the former, I argue for the latter.
This work runs counter to the traditional interpretations of Peirce's philosophy by eliciting an inherent strand of pragmatic pluralism that is embedded in the very core of his thought and that weaves his various doctrines into a systematic ...
This paper compares the idea of embodied reasoning by Confucian Tu Wei-Ming and Canadian philosopher Charles Taylor. They have similar concerns about the problems of secular modernity, that is, the domination of instrumental reason and disembodied rationality. Both of them suggest that we have to explore a kind of embodied moral reasoning. I show that their theories of embodiment have many similarities: the body is an instrument for our moral knowledge and self-understanding; such knowledge is inevitably a kind of bodily knowledge. I will also demonstrate how the differences between their theories can be mutually enriching. While Taylor has provided a philosophical account of the foundation of moral epistemology, Tu’s emphasis on ritual practice and the integration of knowing, doing and being seems to offer a more fully embodied understanding of the moral self.
In this paper the relations of the almost unknown Spanish mathematician Ventura Reyes Prósper (1863-1922) with Charles S. Peirce and Christine Ladd-Franklin are described. Two brief papers by Reyes Prósper, published in El Progreso Matemático 12 (20 December 1891), pp. 297-300, and 18 (15 June 1892), pp. 170-173, on Ladd-Franklin, and on Peirce and Mitchell, respectively, are translated for the first time into English and included at the end of the paper.
Charles S. Peirce’s Philosophy of Signs: Essays in Comparative Semiotics, by Gérard Deledalle. Peirce’s semiotics and metaphysics compared to the thought of other leading philosophers. "This is essential reading for anyone who wants to find common ground between the best of American semiotics and better-known European theories. Deledalle has done more than anyone else to introduce Peirce to European audiences, and now he sends Peirce home with some new flare."—Nathan Houser, Director, Peirce Edition Project. Charles S. Peirce’s Philosophy of Signs examines Peirce’s philosophy and semiotic thought from a European perspective, comparing the American’s unique views with a wide variety of work by thinkers from the ancients to moderns. Parts I and II deal with the philosophical paradigms which are at the root of Peirce’s new theory of signs, pragmatic and social. The main concepts analyzed are those of "sign" and "semiosis" and their respective trichotomies; formally in the case of "sign," in time in the case of semiosis. Part III is devoted to comparing Peirce’s theory of semiotics as a form of logic to the work of other philosophers, including Bertrand Russell, Wittgenstein, Frege, Philodemus, Lady Welby, Saussure, Morris, Jakobson, and Marshall McLuhan. Part IV compares Peirce’s "scientific metaphysics" with European metaphysics. Gérard Deledalle holds the Doctorate in Philosophy from the Sorbonne. A research scholar at Columbia University and Attaché at the Centre National de la Recherche Scientifique, Paris, he has also been Professor of Philosophy and Head of the Philosophy Department of the universities of Tunis, Perpignan, and Libreville. In 1990 he received the Herbert W. Schneider Award "for distinguished contributions to the understanding and development of American philosophy." In 2001, he was appointed vice-president of the Charles S. Peirce Society.
Contents Introduction—Peirce Compared: Directions for Use Part I—Semeiotic as Philosophy Peirce’s New Philosophical Paradigms Peirce’s Philosophy of Semeiotic Peirce’s First Pragmatic Papers The Postscriptum of 1893 Part II—Semeiotic as Semiotics Sign: Semiosis and Representamen—Semiosis and Time Sign: The Concept and Its Use—Reading as Translation Part III—Comparative Semiotics Semiotics and Logic: A Reply to Jerzy Pelc Semeiotic and Greek Logic: Peirce and Philodemus Semeiotic and Significs: Peirce and Lady Welby Semeiotic and Semiology: Peirce and Saussure Semeiotic and Semiotics: Peirce and Morris Semeiotic and Linguistics: Peirce and Jakobson Semeiotic and Communication: Peirce and McLuhan Semeiotic and Epistemology: Peirce, Frege, and Wittgenstein Part IV—Comparative Metaphysics Gnoseology—Perceiving and Knowing: Peirce, Wittgenstein, and Gestalttheorie Ontology—Transcendentals "of" or "without" Being: Peirce versus Aristotle and Thomas Aquinas Cosmology—Chaos and Chance within Order and Continuity: Peirce between Plato and Darwin Theology—The Reality of God: Peirce’s Triune God and the Church’s Trinity Conclusion—Peirce: A Lateral View.
Charles Taylor is one of the leading living philosophers. In this book Arto Laitinen studies and develops further Taylor's philosophical views on human agency, personhood, selfhood and identity. He defends Taylor's view that our ethical understandings of values play a central role. The book also develops and defends Taylor's form of value realism as a view on the nature of ethical values, or values in general. The book criticizes Taylor's view that God, Nature or Human Reason are possible constitutive sources of value – Laitinen argues that we should drop the whole notion of a constitutive source.
Hermeneutics, also referred to as interpretive phenomenology, has led to important contributions to nursing research. The philosophy of Charles Taylor has been a major source in the development of contemporary hermeneutics, through his ontological and epistemological articulations of the human sciences. The aim of this paper is to demonstrate that Taylor's ideas can further enrich hermeneutic inquiry in nursing research, particularly for investigations of ethical concerns. The paper begins with an outline of Taylor's hermeneutical framework, followed by a review of his key ideas relevant for ethics research. The paper ends with a discussion of my empirical research with critically ill children in Canada and France in relation to Taylor's ideas, chiefly Social Imaginaries. I argue that Taylor's hermeneutics provides a substantive moral framework as well as a methodology for examining ethical concerns.
In this critical response to Charles Ess’ ‘Ethical Pluralism and Global Information Ethics’ presented in this Special Issue of Ethics and Information Technology, it is firstly argued that his account of pros hen pluralism can be more accurately reformulated as a three-layered doctrine by separating one acceptance of diversity at a cultural level and another at an ethical-theoretic level. Following this clarificatory section, the next section considers Ess’ political and sociological reasons for the necessity and desirability of pros hen pluralism, criticising the former reasons as social-scientifically problematic, while elaborating on the latter as more persuasive. In the last section, I discuss how pros hen pluralism may be realised, making three arguments in particular. First, Ess’ requirement for sensitivity to cultural diversity is to be interpreted as differentiated and extended sensitivity. Second, his discussion of shared responses to central ethical problems is ambiguous and needs further elaboration and clarification. Third, his focus on dialogue and Socratic education is persuasive, although excessive optimism is not reasonable.
IN 1903, commenting on an article he had written more than thirty years before, Charles Peirce said that he had changed his mind on many issues at least a half-dozen times but had "never been able to think differently on that question of nominalism and realism" (1.20). For anyone acquainted with Peirce's writings, this remark alone could justify a study of "that question."
We explain how the work of Charles Sanders Peirce (1839–1914) – the founder of semiotics and of the pragmatist tradition in philosophy – contributes an epistemological, metaphysical, and ethical foundation to some key transhumanist ideas, including the following claims: technological cognitive enhancement is not only possible but a present reality; pursuing more sweeping cognitive enhancements is epistemically rational; and current humans should try to evolve themselves into posthumans. On Peirce’s view, the fundamental aim of inquiry is truth, understood in terms of a stage of ideal cognition (what he calls the “final opinion”). As current human cognitive abilities are insufficient to achieve this stage, Peirce’s views on cognition support a variety of ways in which they might be enhanced. Finally, we argue that what Peirce describes as our ethical summum bonum seems remarkably similar to what Bostrom (2005) argues to be the core transhumanist value: “the exploration of the posthuman realm”.
In this essay I discuss the historical adequacy of Charles Taylor's philosophical history of secularization, as presented in his A Secular Age. I do so by situating it in relation to the contextual historiography of secularization in early modern Europe, with a particular focus on developments in the German Empire. Considering how profoundly conceptions of secularization have been bound to competing religious and political programmes, we must begin our discussion by entertaining the possibility that modern philosophical and historiographic conceptions of secularization might themselves be outcrops of this unfinished competition. Peter Gordon has rightly observed that Taylor's philosophical history of secularization is a Catholic one, and that this is bound up with a specific view of secularization as a theological and ecclesiological “disembedding” of rational subjectivity from its prior embodiment in a sacral body, community, and cosmos. Taylor delivers this history in his “reform master narrative”: that certain fundamental religious and cultural reforms or changes in early modern Europe wrought the secularization responsible for a modern epoch of “unbelief”.