The British Parliament legalized therapeutic cloning in December 2000 despite opposition from the European Union. The watershed event in Parliament's move was the active and unprecedented government support for the generation and destruction of human embryonic life merely as a means of medical advancement. This article contends that a utilitarian analysis of this procedure is necessary to identify the real-world risks of therapeutic cloning but insufficient to identify the breach of defensible ethical limits that this procedure represents. A value-oriented approach to Kantian ethics demonstrates that the utilitarian endorsement of therapeutic cloning entails a contradiction of the necessity of human vulnerability and a faulty valuation of the human embryo. The concern is that a narrow utilitarian focus ultimately commodifies human embryonic life and privileges outcomes as the sole determinant of moral value.
A Foucauldian assessment of the common presumption that genetic information is potent and thus oppressive demonstrates that the concern may be misplaced. Foucault's concept of “technologies of self” reveals that genetic power originates not only from the potency of genetic information but from the penchant of individuals to victimize themselves in the name of optimal health, enhanced intelligence, perfect babies, or would-be immortality. Rather than seeking liberation from the power of the new genetics, Foucault's reinterpretation of the ancient understanding of concern for the self offers the possibility to avoid control by the scientific discourse. His ethical response calls for resistance rather than opposition and places responsibility for resistance in the hands of the subject. Characteristically, he avoids a generalizable form of morality but clarifies that resistance includes acknowledging the human appetite for perfection and subordinating science to ethical and aesthetic matters.
Marilyn Fischer’s careful historiographical treatment of the ideas and life of Jane Addams deepens our understanding of Addams’s important work as a thinker and practitioner. The paper paints a picture of the ideological and sociological landscape of Addams’s world, paying close attention to the relationships Addams had with other prominent thinkers of the day, such as the African Americans W. E. B. Du Bois and Ida B. Wells-Barnett, as well as the pragmatist Josiah Royce. Fischer seems to have dual aims in the paper. On the one hand, she makes a set of claims that helps us understand Addams better as a historical thinker. On the other hand, the paper advances another set of claims about Addams’s relation to ..
This introduction to the Common Knowledge symposium titled “Comparative Relativism” outlines a variety of intellectual contexts where placing the unlikely companion terms comparison and relativism in conjunction offers analytical purchase. If comparison, in the most general sense, involves the investigation of discrete contexts in order to elucidate their similarities and differences, then relativism, as a tendency, stance, or working method, usually involves the assumption that contexts exhibit, or may exhibit, radically different, incomparable, or incommensurable traits. Comparative studies are required to treat their objects as alike, at least in some crucial respects; relativism indicates the limits of this practice. Jensen argues that this seeming paradox is productive, as he moves across contexts, from Lévi-Strauss's analysis of comparison as an anthropological method to Peter Galison's history of physics, and on to the anthropological, philosophical, and historical examples offered in symposium contributions by Barbara Herrnstein Smith, Eduardo Viveiros de Castro, Marilyn Strathern, and Isabelle Stengers. Comparative relativism is understood by some to imply that relativism comes in various kinds and that these have multiple uses, functions, and effects, varying widely in different personal, historical, and institutional contexts that can be compared and contrasted. Comparative relativism is taken by others to encourage a “comparison of comparisons,” in order to relativize what different peoples—say, Western academics and Amerindian shamans—compare things “for.” Jensen concludes that what is compared and relativized in this symposium are the methods of comparison and relativization themselves. He ventures that the contributors all hope that treating these terms in juxtaposition may allow for new configurations of inquiry.
When Einstein formulated his special relativity, he developed his dynamics for point particles. Of course, many valiant efforts have been made to extend his relativity to rigid bodies, but this subject has since been forgotten. This is largely because of the emergence of quantum mechanics with wave-particle duality. Instead of Lorentz-boosting rigid bodies, we now boost waves and have to deal with Lorentz transformations of waves. We now have some understanding of plane waves or running waves in the covariant picture, but we do not yet have a clear picture of standing waves. In this report, we show that there is one set of standing waves which can be Lorentz-transformed while being consistent with all physical principles of quantum mechanics and relativity. It is possible to construct a representation of the Poincaré group using harmonic oscillator wave functions satisfying space-time boundary conditions. This set of wave functions is capable of explaining the quantum bound state for both slow and fast hadrons. In particular, it can explain the quark model for hadrons at rest, and Feynman’s parton model for hadrons moving with a speed close to that of light.
It is shown that the jet mechanism derivable from the Lorentz deformation picture leads to a nearly constant average jet transverse momentum. It is pointed out that this is consistent with the high-energy experimental data. It is pointed out further that this result strengthens the physical basis for the minimal time-energy uncertainty relation combined covariantly with Heisenberg's space-momentum uncertainty relation.
A physical basis for the minimal time-energy uncertainty relation is formulated from basic high-energy hadronic properties such as the resonance mass spectrum, the form factor behavior, and the peculiarities of Feynman's parton picture. It is shown that the covariant oscillator formalism combines covariantly this time-energy uncertainty relation with Heisenberg's space-momentum uncertainty relation. A pictorial method is developed to describe the spacetime distribution of the localized probability density.
It is shown that both covariant harmonic oscillator formalism and quantum field theory are based on common physical principles which include Poincaré covariance, Heisenberg's space-momentum uncertainty relation, and Dirac's “C-number” time-energy uncertainty relation. It is shown in particular that the oscillator wave functions are derivable from the physical principles which are used in the derivation of the Klein-Nishina formula.
The strong weak truth table (sw) reducibility was suggested by Downey, Hirschfeldt, and LaForte as a measure of relative randomness, alternative to the Solovay reducibility. It also occurs naturally in proofs in classical computability theory as well as in the recent work of Soare, Nabutovsky, and Weinberger on applications of computability to differential geometry. We study the sw-degrees of c.e. reals and construct a c.e. real which has no random c.e. real (i.e., Ω number) sw-above it.
G. E. Moore's ‘A Defence of Common Sense’ has generated the kind of interest and contrariety which often accompany what is new, provocative, and even important in philosophy. Moore himself reportedly agreed with Wittgenstein's estimate that this was his best article, while C. D. Broad has lamented its very great but largely unfortunate influence. Although the essay inspired Wittgenstein to explore the basis of Moore's claim to know many propositions of common sense to be true, A. J. Ayer judges its enduring value to lie in provoking a more sophisticated conception of the very type of metaphysics which disputes any such unqualified claim of certainty.
This article shows the meaning of the unity of ethics and aesthetics in the Tractatus Logico-Philosophicus. First, it presents the main aspects of Tractarian ethics: that it does not rank facts hierarchically, that it is eudaimonistic, and that it proposes no end external to the actions of the ethical subject. Second, it shows that the work of art is the expression of life from an ethical point of view, that is, it is the expression of the meaning of life from the point of view of eternity. In conclusion, it shows that this conception proposes an absolute boundary separating what is art from what is not art.
This paper provides a new analysis of e-trust, trust occurring in digital contexts, among the artificial agents of a distributed artificial system. The analysis endorses a non-psychological approach and rests on a Kantian regulative ideal of a rational agent, able to choose the best option for itself, given a specific scenario and a goal to achieve. The paper first introduces e-trust, describing its relevance for contemporary society, and then presents a new theoretical analysis of this phenomenon. The analysis first focuses on an agent’s trustworthiness, which is presented as the necessary requirement for e-trust to occur. Then, a new definition of e-trust as a second-order property of first-order relations is presented. It is shown that the second-order property of e-trust has the effect of minimising an agent’s effort and commitment in the achievement of a given goal. On this basis, a method is provided for the objective assessment of the levels of e-trust occurring among the artificial agents of a distributed artificial system.
The E-Z Reader model (Reichle et al. 1998; 1999) provides a theoretical framework for understanding how word identification, visual processing, attention, and oculomotor control jointly determine when and where the eyes move during reading. In this article, we first review what is known about eye movements during reading. Then we provide an updated version of the model (E-Z Reader 7) and describe how it accounts for basic findings about eye movement control in reading. We then review several alternative models of eye movement control in reading, discussing both their core assumptions and their theoretical scope. On the basis of this discussion, we conclude that E-Z Reader provides the most comprehensive account of eye movement control during reading. Finally, we provide a brief overview of what is known about the neural systems that support the various components of reading, and suggest how the cognitive constructs of our model might map onto this neural architecture. Key Words: attention; eye-movement control; E-Z Reader; fixations; lexical access; models; reading; regressions; saccades.
E. F. Carritt (1876-1964) was educated at and taught at Oxford University. He made substantial contributions both to aesthetics and to moral philosophy. The focus of this entry is his work in moral philosophy. His most notable works in this field are The Theory of Morals (1928) and Ethical and Political Thinking (1947). Carritt developed views in metaethics and in normative ethics. In metaethics he defends a cognitivist, non-naturalist moral realism and was among the first to respond to A. J. Ayer’s emotivist challenge to this view. In normative ethics he advocates a deontological view in which there is a plurality of obligations and of non-instrumental goods. In the context of defending this view he raised some penetrating and novel criticisms of ideal utilitarianism. He held that it is not acceptable to revise our reflective common-sense moral attitudes in the face of philosophical moral theories, and that moral philosophy is only indirectly practical.
Has the diversity of corporate boards of directors improved? Should it? What role does diversity play in reducing corporate wrongdoing? Will diversity result in a more focused board of directors or more board autonomy? Examining the state of Tennessee as a case study, the authors collected data on the board composition of publicly traded corporations and compared those data to an original study conducted in 1995. Data indicate only a modest improvement in board diversity. This article discusses reasons for the scarcity of women on boards and concludes that, to enhance strategic decisions, board membership should reflect the corporation's consumer population. Thus, women are a critical but overlooked resource. Areas for future research are also considered.
It is tempting to interpret Marilyn Strathern as saying that the concept of nature is a social construction, because in her essay “No Nature, No Culture: the Hagen Case” she tells us that the Hagen people do not describe the world using this concept. However, I point out an obstacle to interpreting her in this way, an obstacle which leads me to reject this interpretation. Interpreting her in this way makes her inconsistent. The inconsistency is owing to a commitment that she shares with previous British anthropologists, a commitment which points to an incompatibility between two intellectual traditions.
Contemporary health care relies on electronic devices. These technologies are not ethically neutral but change the practice of care. In light of Sennett’s work and that of other thinkers (Dewey, Dreyfus, Borgmann) one worry is that “e-care”—care by means of new information and communication technologies—does not promote skilful and careful engagement with patients and hence is neither conducive to the quality of care nor to the virtues of the care worker. Attending to the kinds of knowledge involved in care work and their moral significance, this paper explores what “craftsmanship” means in the context of medicine and health care and discusses whether today the care giver’s craftsmanship is eroded. It is argued that this is a real danger, especially under modern conditions and in the case of telecare, but that whether it happens, and to what extent it happens, depends on whether in a specific practice and given a specific technology e-carers can develop the know-how and skill to engage more intensely with those under their care and to cooperate with their co-workers.
I have two aims in this paper. In §§2-4 I contend that Moore has two arguments (not one) for the view that ‘good’ denotes a non-natural property not to be identified with the naturalistic properties of science and common sense (or, for that matter, the more exotic properties posited by metaphysicians and theologians). The first argument, the Barren Tautology Argument (or the BTA), is derived, via Sidgwick, from a long tradition of anti-naturalist polemic. But the second argument, the Open Question Argument proper (or the OQA), seems to have been Moore’s own invention and was probably devised to deal with naturalistic theories, such as Russell’s, which are immune to the Barren Tautology Argument. The OQA is valid and not (as Frankena (1939) has alleged) question-begging. Moreover, if its premises were true, it would have disposed of the desire-to-desire theory. But as I explain in §5, from 1970 onwards, two key premises of the OQA were successively called into question, the one because philosophers came to believe in synthetic identities between properties and the other because it led to the Paradox of Analysis. By 1989 a philosopher like Lewis could put forward, with a clean intellectual conscience, precisely the kind of theory that Moore professed to have refuted. However, in §§6-8 I shall argue that all is not lost for the OQA. I first press an objection to the desire-to-desire theory derived from Kripke’s famous epistemic argument. On reflection this argument looks uncannily like the OQA. But the premise on which it relies is weaker than the one that betrayed Moore by leading to the Paradox of Analysis. This suggests three conclusions: 1) that the desire-to-desire theory is false; 2) that the OQA can be revived, albeit in a modified form; and 3) that the revived OQA poses a serious threat to what might be called semantic naturalism.
This essay serves as both a response to and an embellishment of Marilyn Frye's now classic essay "Oppression." It is meant to pick up where that essay left off and to make connections between oppression, as Frye defines it, and the privileges that result from institutional structures. This essay tries to clarify one meaning of privilege that is lost in philosophical discussions of injustice. I develop a distinction between unearned privileges and earned advantages. Clarifying the meaning of privilege as unearned structural advantage makes visible the role white privilege plays in maintaining complex systems of domination such as racism, sexism, heterosexism and classism. Using a critical reading of both Frye's and Young's accounts of oppression as a springboard, I develop a definition of privilege as a particular class of unearned advantages. I distinguish my account of privilege from standard legal and philosophical definitions of privilege. The general distinction I make between privileges and advantages rests on three interrelated claims: that benefits granted by privilege are always unearned and conferred systemically to members of dominant social groups; that privileges granted to members of dominant groups solely on the basis of their membership in these groups are never justifiable; and that privileges have an unconditional value that can be explained not only in terms of immunities, but also in terms of additional benefits.
The effectiveness of information retrieval technology in electronic discovery (E-discovery) has become the subject of judicial rulings and practitioner controversy. The scale and nature of E-discovery tasks, however, have pushed traditional information retrieval evaluation approaches to their limits. This paper reviews the legal and operational context of E-discovery and the approaches to evaluating search technology that have evolved in the research community. It then describes a multi-year effort carried out as part of the Text Retrieval Conference to develop evaluation methods for responsive review tasks in E-discovery. This work has led to new approaches to measuring effectiveness in both batch and interactive frameworks, large data sets, and some surprising results for the recall and precision of Boolean and statistical information retrieval methods. The paper concludes by offering some thoughts about future research in both the legal and technical communities toward the goal of reliable, effective use of information retrieval in E-discovery.
This article is aimed at contributing to the study of the relationship that new religious movements entertain with technology and science. It focuses on an object that is central in Scientology's teachings and practice: the Electropsychometer or E-meter. In interaction with the general public, such as in a 2014 TV Super Bowl advertisement, Scientology seems to claim a unique relationship with science and technology in the form of a “combination” and a “connection” evoked while displaying this very E-meter. Hence, exploring the teachings related to it is relevant in order to understand how such a combination or connection is conceptualized.
Given the very large numbers of documents involved in e-discovery investigations, lawyers face a considerable challenge of collaborative sensemaking. We report findings from three workplace studies which looked at different aspects of how this challenge was met. From a sociotechnical perspective, the studies aimed to understand how investigators collectively and individually worked with information to support sensemaking and decision making. Here, we focus on discovery-led refinement; specifically, how engaging with the materials of the investigations led to discoveries that supported refinement of the problems and new strategies for addressing them. These refinements were essential for tractability. We begin with observations which show how new lines of enquiry were recursively embedded. We then analyse the conceptual structure of a line of enquiry and consider how reflecting this in e-discovery support systems might support scalability and group collaboration. We then focus on the individual activity of manual document review where refinement corresponded with the inductive identification of classes of irrelevant and relevant documents within a collection. Our observations point to the effects of priming on dealing with these efficiently and to issues of cognitive ergonomics at the human–computer interface. We use these observations to introduce visualisations that might enable reviewers to deal with such refinements more efficiently.
The information overload in E-Discovery proceedings makes reviewing expensive and increases the risk of failure to produce results on time and consistently. New interactive techniques have been introduced to increase reviewer productivity. In contrast, the techniques presented in this article propose an alternative method that tries to reduce information during culling so that less information needs to be reviewed. The proposed method first focuses on mapping the email collection universe using straightforward statistical methods based on keyword filtering combined with date-time information and custodian identities. Subsequently, a social network is constructed from the email collection, which is analyzed by filtering on date-time and keywords. By using the network context we expect to provide a better understanding of the keyword hits and the ability to discard certain parts of the collection.
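The culling pipeline this abstract describes (keyword filtering constrained by date-time and custodian identity, followed by construction of a sender-recipient network) can be sketched as follows. This is a minimal illustration, not the authors' implementation; the record layout and field names are invented for the example.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical email records; real collections would come from mailbox exports.
emails = [
    {"from": "alice", "to": ["bob"], "date": "2009-03-01", "body": "merger draft attached"},
    {"from": "bob", "to": ["alice", "carol"], "date": "2009-03-02", "body": "re: merger draft"},
    {"from": "carol", "to": ["dave"], "date": "2010-01-15", "body": "lunch on friday?"},
]

def cull(messages, keywords, start, end):
    """Keep only messages inside the date window whose body hits a keyword."""
    lo, hi = datetime.fromisoformat(start), datetime.fromisoformat(end)
    return [m for m in messages
            if lo <= datetime.fromisoformat(m["date"]) <= hi
            and any(k in m["body"].lower() for k in keywords)]

def social_network(messages):
    """Count sender->recipient edges over the culled collection."""
    edges = defaultdict(int)
    for m in messages:
        for rcpt in m["to"]:
            edges[(m["from"], rcpt)] += 1
    return dict(edges)

hits = cull(emails, ["merger"], "2009-01-01", "2009-12-31")
net = social_network(hits)
# net maps (sender, recipient) pairs to message counts; custodians who fall
# outside the keyword-relevant network can then be set aside before review.
```

In this toy run, carol's 2010 message is discarded by the date filter, so the resulting network contains only edges among the custodians active in the keyword-relevant period.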
Published in 1982, Carol Gilligan's In a Different Voice proposed a new model of moral reasoning based on care, arguing that it better described the moral life of women. An Ethic of Care is the first volume to bring together key contributions to the extensive debate engaging Gilligan's work. It provides the highlights of the often impassioned discussion of the ethic of care, drawing on the literature of the wide range of disciplines that have entered into the debate. Contributors: Annette Baier, Diana Baumrind, Lawrence A. Blum, Mary Brabeck, John Broughton, Owen Flanagan, Marilyn Friedman, Carol Gilligan, Catherine G. Greeno, Catherine Jackson, Linda K. Kerber, Mary Jeanne Larrabee, Zella Luria, Eleanor E. Maccoby, Linda Nicholson, Bill Puka, Carol B. Stack, Joan C. Tronto, Lawrence Walker, Gertrud Nunner-Winkler.
E-inclusion is getting a lot of attention in Europe these days. The European Commission and EU Member States have initiated e-inclusion strategies aimed at reaching out to the e-excluded and bringing them into the mainstream of society and the economy. The benefits of mainstreaming the excluded are numerous. Good practices play an important role in the strategies, and examples can be found in e-health, e-learning, e-government, e-inclusion and other e-domains. So laudable does the rationale for e-inclusion seem that few have questioned its benefits. In fact, e-inclusion does raise ethical issues, and this paper discusses a few of the key ones. The paper draws several conclusions, principally regarding the need for some empirical research on what happens to the e-excluded once they have access to information and communications technologies, notably the Internet.
In this work, we provide a broad overview of the distinct stages of E-Discovery. We portray them as an interconnected, often complex workflow process, while relating them to the general Electronic Discovery Reference Model (EDRM). We start with the definition of E-Discovery. We then describe the very positive role that NIST’s Text REtrieval Conference (TREC) has added to the science of E-Discovery, in terms of the tasks involved and the evaluation of the legal discovery work performed. Given the critical role that data analysis plays at various stages of the process, we present a pyramid model, which complements the EDRM model: for gathering and hosting; indexing; searching and navigating; and finally consolidating and summarizing E-Discovery findings. Next we discuss where the current areas of need and areas of growth appear to be, using one of the field’s most authoritative surveys of providers and consumers of E-Discovery products and services. We subsequently address some areas of Artificial Intelligence, both Information Retrieval-related and not, which promise to make future contributions to the E-Discovery discipline. Some of these areas include data mining applied to e-mail and social networks, classification and machine learning, and the technologies that will enable next generation E-Discovery. The lesson we convey is that the more IR researchers and others understand the broader context of E-Discovery, including the stages that occur before and after primary search, the greater will be the prospects for broader solutions, creative optimizations and synergies yet to be tapped.
Retrieval of relevant unstructured information from the ever-increasing textual communications of individuals and businesses has become a major barrier to effective litigation/defense, mergers/acquisitions, and regulatory compliance. Such e-discovery requires simultaneously high precision with high recall (high-P/R) and is therefore a prototype for many legal reasoning tasks. The requisite exhaustive information retrieval (IR) system must employ very different techniques than those applicable in the hyper-precise, consumer search task where insignificant recall is the accepted norm. We apply Russell et al.’s cognitive task analysis of sensemaking by intelligence analysts to develop a semi-autonomous system that achieves high IR accuracy of F1 ≥ 0.8 compared to F1 < 0.4 typical of computer-assisted human-assessment (CAHA) or alternative approaches such as Roitblat et al.’s. By understanding the ‘Learning Loop Complexes’ of lawyers engaged in successful small-scale document review, we have used socio-technical design principles to create roles, processes, and technologies for scalable human-assisted computer-assessment (HACA). Results from the NIST-TREC Legal Track’s interactive task from both 2008 and 2009 validate the efficacy of this sensemaking approach to the high-P/R IR task.
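The F1 figures cited in this abstract combine precision and recall into a single effectiveness score. As a reminder of the standard metric (the document counts below are invented purely for illustration):

```python
def precision_recall_f1(retrieved, relevant):
    """Standard IR effectiveness measures over sets of document IDs."""
    tp = len(retrieved & relevant)  # relevant documents actually retrieved
    precision = tp / len(retrieved) if retrieved else 0.0
    recall = tp / len(relevant) if relevant else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)  # harmonic mean of P and R
    return precision, recall, f1

# Invented example: 10 documents retrieved, 10 truly relevant, 8 in common.
p, r, f = precision_recall_f1(set(range(10)), set(range(2, 12)))
```

Because F1 is a harmonic mean, it is dragged down by whichever of precision or recall is lower, which is why the high-P/R requirement of responsive review is so much harder to meet than the precision-only norm of consumer search.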
Autonomous e-coaching systems offer their users suggestions for action, thereby affecting the user's decision-making process. More specifically, the suggestions that these systems make influence the options for action that people actually consider. Surprisingly though, options and the corresponding process of option generation --- a decision-making stage preceding intention formation and action selection --- have received very little attention in the various disciplines studying decision making. We argue that this neglect is unjustified and that it is important, particularly for designers of autonomous e-coaching systems, to understand how human option generation works. The aims of this paper are threefold. The first aim is to generate awareness among designers of autonomous e-coaching systems that these systems do in fact influence their users' options. The second is to show that understanding the interplay between a person's options and the e-coaching system's suggestions is important for improving the effectiveness of the system. The third is that the very same interplay is also crucial for designing e-coaching systems that respect people's autonomy.
The paper deals with involutive FL_e-monoids, that is, commutative residuated, partially-ordered monoids with an involutive negation. Involutive FL_e-monoids over lattices are exactly involutive FL_e-algebras, the algebraic counterparts of the substructural logic IUL. A cone representation is given for conic involutive FL_e-monoids, along with a new construction method, called twin-rotation. Some classes of finite involutive FL_e-chains are classified by using the notion of rank of involutive FL_e-chains, and a kind of duality is developed between positive and non-positive rank algebras. As a side effect, it is shown that the substructural logic IUL plus t ↔ f does not have the finite model property.