Issue Title: Moral Luck, Social Networking Sites, and Trust on the Web. I argue that the problem of 'moral luck' is an unjustly neglected topic within Computer Ethics. This is unfortunate given that the very nature of computer technology, its 'logical malleability', leads to ever greater levels of complexity, unreliability and uncertainty. The ever widening contexts of application in turn lead to greater scope for the operation of chance and the phenomenon of moral luck. Moral luck bears down most heavily on notions of professional responsibility: the identification and attribution of responsibility. It is immunity from luck that conventionally marks out moral value from other kinds of value, such as instrumental, technical, and use value. The paper describes the nature of moral luck and its erosion of the scope of responsibility and agency. Moral luck poses a challenge to the kinds of theoretical approaches often deployed in Computer Ethics when analyzing moral questions arising from the design and implementation of information and communication technologies. The paper considers the impact on consequentialism, virtue ethics, and duty ethics. In addressing cases of moral luck within Computer Ethics, I argue that it is important to recognise the ways in which different types of moral systems are vulnerable, or resistant, to moral luck. Different resolutions are possible depending on the moral framework adopted. Equally, resolution of cases will depend on fundamental moral assumptions. The problem of moral luck in Computer Ethics should prompt us to new ways of looking at risk, accountability and responsibility.
Locally-developed measures are valuable tools for institutions to use in assessing student outcomes. Such measures can be easy to administer, cost-effective, and capable of providing meaningful data for improving student learning. However, many institutions struggle with questions surrounding the quality of their locally-developed assessments. Are their instruments reliable? Are their instruments valid? Can the data generated from these instruments be trusted to drive change and improvement? The good news for faculty, staff, and assessment professionals is that there are steps they can take to address these concerns and help to ensure the validity and reliability of their processes. This article describes the development and testing of a novel research instrument measuring students' attitudes and abilities relating to critical thinking, metacognition, and intellectual humility. Using a $1,000 assessment grant from Sam Houston State University (SHSU), Dr. Glenn Sanford and Dr. David Wright devised the early drafts of the instrument, collaborated with colleagues, and joined with Mr. Jeff Roberts, Director of Assessment at SHSU, to develop and test this new instrument. What follows is a description of the development of the resulting research instrument, results from the factor analysis and reliability testing of that instrument, and an overview of how those results have been used to make further instrument improvements.
What does it mean to be human? The traditional answers from the past remain only theoretical possibilities unless they come to mean something to today's generation. Moreover, in light of new knowledge and circumstances, a new generation may call these old answers into question, and seek to reinterpret, or, indeed, provide alternatives to them. In the 1960s, the Catholic Church's Second Vatican Council attempted such a reinterpretation, an aggiornamento, for the post-war generation of the mid-twentieth century by proposing, in Gaudium et Spes, a theological anthropology founded upon the ideas of human dignity and the common good. Fifty years later is an appropriate time to revisit those answers, and to seek again to reinterpret or provide alternatives to them, in light of new knowledge for a new generation. Taking the themes of Gaudium et Spes as its starting point, this book looks at developments in theology and philosophy in the latter half of the twentieth century that call some of these 'old' answers into question. It identifies some of the 'new knowledge and circumstances' that need to be taken into account for this generation's answer to the question of what it means to be human. In five parts, leading philosophers and theologians offer interpretive lenses for reading the theological anthropology of the twentieth century; address the challenges of anthropocentricism, alterity, incarnation, and postmodernity for the notion of the human subject; tackle the important moral concepts of conscience, responsibility, evil and guilt; investigate the claims of atheism, fundamentalism, scientific naturalism, nihilism, and pluralism; and consider questions of the relationship between the individual and the community in the modern secular state. In so doing, this book prepares the ground for the development of a theological anthropology for the twenty-first century.
In "Causes and 'If P, Even If X, Still Q'," Philosophy 57 (July 1982), Ted Honderich cites my "The Direction of Causation and the Direction of Conditionship," Journal of Philosophy 73 (April 22, 1976), as an example of an account of causal priority that lacks the proper character. After emending Honderich's description of the proper character, I argue that my attempt to account for one-way causation in terms of one-way causal conditionship does not totally lack it. Rather than emphasize the singularity of an effect, as Honderich does, I emphasize the multiplicity of independent factors in a causal circumstance.
Hume, in "An Enquiry Concerning Human Understanding", holds (1) that all causal reasoning is based on experience and (2) that causal reasoning is based on nothing but experience. (1) does not imply (2), and Hume's good reasons for (1) are not good reasons for (2). This essay accepts (1) and argues against (2). A priori reasoning plays a role in causal inference. Familiar examples from Hume and from classroom examples of sudden disappearances and radical changes do not show otherwise. A priori causal reasoning is closely related to understanding causal mechanisms. One uncovers the intelligibility of a causal process by understanding its mechanism.
Wittgenstein remarks in the "Tractatus" that the eye is not in the visual field. I question the claim of Michael Dummett and P. T. Geach that reflection on this remark helps one conceive of an observer perceiving objects in space without having any location in that space. The literal meaning of "point of view" is illustrated by the visual field. Reflection on the fact that the point of view is not itself normally an object of sight is no help in conceiving perception from no point of view.
McTaggart argues that the A series, which orders events with reference to past, present, and future, involves an inescapable contradiction. The significant difference between the earlier version of his argument (Mind, 1908) and the version in The Nature of Existence, Volume II, Chapter 33 (1927), has often gone unnoticed. His arguments are all invalid; the conclusion can be rejected without rejecting any premiss. It is therefore unnecessary to adopt any philosophical thesis about time (e.g., that some token-reflexive analysis of tensed statements is adequate) to avoid McTaggart's final conclusion that time is unreal.
I revise J. L. Mackie's first account of causal direction by replacing his notion of "fixity" with a newly defined notion of "sufficing" that is designed to accommodate indeterminism. Keeping Mackie's distinction between causal order and causal direction, I then consider another revision that replaces "fixity" with "one-way conditionship". In response to the charge that all such accounts of causal priority beg the question by making an unjustified appeal to temporal priority, I maintain that one-way conditionship explains rather than assumes objective temporal dependence as well as objective causal dependence.
The revival of virtue ethics is largely inspired by Aristotle, but few---especially Christians---follow him in seeing virtue supremely exemplified in the “magnanimous” man. However, Aristotle raises a matter of importance: the character traits and type of psychological stance exemplified in those who aspire to acts of extraordinary excellence. I explore the accounts of magnanimity found in both Aristotle and Aquinas, defending the intelligibility and acceptability of some central elements of a broadly Aristotelian conception of magnanimity. Aquinas, I argue, provides insight into how Christian ethics may appropriate central elements of a broadly Aristotelian conception of extraordinary virtue.
Although the truth value (falsity) of "Henry knows that (dogs live in trees and beavers chew wood)" remains unchanged no matter what sentence is substituted in it for "beavers chew wood", we do not want to regard the second as a truth-functional component (tfc) of the first. Many definitions of "tfc" (e.g., Quine's) fail to ensure satisfaction of the following principle: if p is a component of r which is in turn a component of q, then p is a tfc of q if and only if (1) p is also a tfc of r, and (2) r is also a tfc of q.
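The component principle at stake can be set out schematically (my notation; the abstract states it only in prose):

```latex
% Constraint on any adequate definition of "truth-functional component" (tfc):
% for sentences p, r, q, where p is a component of r and r is a component of q,
p \ \text{is a tfc of}\ q
  \iff \big( p \ \text{is a tfc of}\ r \big)
       \wedge \big( r \ \text{is a tfc of}\ q \big).
```

On this principle, "beavers chew wood" fails to be a tfc of the knowledge attribution: although it is a tfc of the embedded conjunction, the conjunction itself is not a tfc of the attitude report, since the truth value of "Henry knows that P" is not fixed by the truth value of P.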
This piece continues the story line of “Where Am I?” by Dan Dennett. I am inclined to locate myself at the location of my point of view. In my fantasy stories, points of view can be far away from a brain inside a flesh-and-blood body. Points of view can also move discontinuously from one location to another.
One of the central points of contention in the epistemology of testimony concerns the uniqueness (or not) of the justification of beliefs formed through testimony--whether such justification can be accounted for in terms of, or 'reduced to,' other familiar sorts of justification, e.g. without relying on any epistemic principles unique to testimony. One influential argument for the reductionist position, found in the work of Elizabeth Fricker, argues by appeal to the need for the hearer to monitor the testimony for credibility. Fricker (1994) argues, first, that some monitoring for trustworthiness is required if the hearer is to avoid being gullible, and second, that reductionism but not anti-reductionism is compatible with ascribing an important role to the process of monitoring in the course of justifiably accepting observed testimony. In this paper we argue that this argument fails.
In several works on modality, G. H. von Wright presents tree structures to explain possible worlds. Worlds that might have developed from an earlier world are possible relative to it. Actually possible worlds are possible relative to the world as it actually was at some point. Many logically consistent worlds are not actually possible. Transitions from node to node in a tree structure are probabilistic. Probabilities are often more useful than similarities between worlds in treating counterfactual conditionals.
Naive mereology studies ordinary conceptions of part and whole. Parts, unlike portions, have objective boundaries, and many things, such as dances and sermons, have temporal parts. In order to deal with Mark Heller's claim that temporal parts "are ontologically no more or less basic than the wholes that they compose," we retell the story of Laplace's Genius, here named "Swifty." Although Swifty processes lots of information very quickly, his conceptual repertoire need not extend beyond fundamental physics. So we attempt to follow Swifty's progress in the acquisition of ordinary concepts such as 'table'. (Puzzles of precision and intrusion appear along the way.) Swifty has to understand what tables are before understanding what temporal portions of tables are. This is one reason for regarding tables as ontologically prior to table portions.
The central text of this article is Thomas Reid’s response to Berkeley’s argument for distinguishing tangible from visual shape. Reid is right to hold that shape words do not have different visual and tangible meanings. We might, moreover, perceive shape with senses other than touch and sight. As Reid also suggests, the visual perception of shape does not require perception of hue or brightness. Contrary to treatments of the Molyneux problem by H. P. Grice and Judith Jarvis Thomson, I argue that breakdowns of a certain kind between tangible and visible shape are conceivable.
The cardinality of incompatible possibilities whose actuality requires at least N seconds exceeds the cardinality of disjoint intervals at least N seconds long. Therefore, not all logical possibilities can be actual in the long run, even if the long run is infinite.
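The cardinality comparison can be sketched in standard set-theoretic notation (a reconstruction; the abstract itself gives no symbols). The key observation is that any family of pairwise disjoint temporal intervals, each lasting at least N seconds, is at most countable:

```latex
% Let \mathcal{I} be a family of pairwise disjoint intervals of length \geq N > 0.
% Choosing a rational time point from each interval yields an injection into
% \mathbb{Q}, so
|\mathcal{I}| \le |\mathbb{Q}| = \aleph_0.
% If the incompatible possibilities each requiring \geq N seconds outnumber
% \aleph_0, there are too few disjoint intervals, even over infinite time,
% for every such possibility to be realized.
```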
Roderick Chisholm provides, in different places, two formulations of Brentano's thesis about the relation between the psychological and the intentional: (1) all and only psychological sentences are intentional; (2) no psychological intentional sentence is equivalent to a nonintentional sentence. Chisholm also presents several definitions of intentionality. Some of these allow that a sentence is intentional while its negation is nonintentional, which ruins the prospects of defending the more plausible and interesting thesis (2). A generalization of the notion of logical independence to any number of mutually independent sentences permits a revision of Chisholm's criteria of intentionality that ensures that a sentence is intentional on a criterion exactly when its negation is as well.
To accommodate vague statements and predicates, I propose an infinite-valued, non-truth-functional interpretation of logic on which the tautologies are exactly the tautologies of classical two-valued logic. I introduce a determinacy operator, analogous to the necessity operator in alethic modal logic, to allow the definition of first-order and higher-order borderline cases. On the interpretation proposed for determinacy, every statement corresponding to a theorem of modal system T is a logical truth, and I conjecture that every logical truth on the interpretation corresponds to a theorem of T. The interpretation is extended to predicate logic. A borderline case of a predicate 'F' is neither determinately F nor determinately not-F. Traditional sorites arguments are seen to fall apart early in their gradual stepwise passage from truth to falsity.
Twenty-one paragraphs in this entry begin with a statement of a view about causation. To help organize the entry, the next sentence then classifies the view as 'prevailing', 'majority', 'controversial', or 'minority'. The following brief discussions attempt to be clear and fair. Respect for fairness, however, does not prevent the author from referring to his own views. For example, the author classifies "There is no element of genuine a priori reasoning in causal inference" as a majority view. After expounding the central reasons for this majority view, he goes on to criticize these reasons.
Suppose that Susan did not go to the movies. The reconciling project attempts to show that this plus Determinism does not imply that Susan could not have gone to the movies. The estranging project attempts to show the opposite. A counter-entailment argument has the form: A is consistent with C, and C entails not-B; therefore A does not entail B. One instance of the counter-entailment argument undermines a central argument for the reconciling project. Another instance undermines a central argument for the estranging project. This is one symmetry. In each case, the natural response to the counter-entailment argument begs the question. This is another symmetry.
Everything red is colored, and all squares are polygons. A square is distinguished from other polygons by being four-sided, equilateral, and equiangular. What distinguishes red things from other colored things? This has been understood as a conceptual rather than a scientific question. Theories of wavelengths and reflectance and sensory processing are not considered. Given just our ordinary understanding of color, it seems that what differentiates red from other colors is only redness itself. The Cambridge logician W. E. Johnson introduced the terms determinate and determinable to apply to examples such as red and colored. Chapter XI of Johnson's Logic, Part I (1921), “The Determinate and the Determinable,” is the main text for discussion of this distinction.