This paper addresses a significant gap in the conceptualization of business ethics under different cultural influences. Though theoretical models of business ethics have recognized the importance of culture in ethical decision-making, few have examined how this influences ethical decision-making. Therefore, this paper develops propositions concerning the influence of various cultural dimensions on ethical decision-making using Hofstede's typology.
The present study examines the relationships between consumers' ethical beliefs and personality traits. Based on a survey of 295 undergraduate business students, the authors found that individuals with high needs for autonomy, innovation, and aggression, as well as individuals with a high propensity for taking risks, tend to have less ethical beliefs concerning possible consumer actions. Individuals with a high need for social desirability and individuals with a strong problem solving coping style tend to have more ethical beliefs concerning possible consumer actions. The needs for achievement, affiliation, and complexity, and an emotion solving coping style, were not significantly correlated with consumer ethical beliefs.
Perhaps no technological innovation has so dominated the second half of the twentieth century as has the introduction of the programmable computer. It is quite difficult if not impossible to imagine how contemporary affairs—in business and science, communications and transportation, governmental and military activities, for example—could be conducted without the use of computing machines, whose principal contribution has been to relieve us of the necessity for certain kinds of mental exertion. The computer revolution has reduced our mental labors by means of these machines, just as the Industrial Revolution reduced our physical labor by means of other machines.
Historian James H. Jones published the first edition of Bad Blood, the definitive history of the Tuskegee Syphilis Experiment, in 1981. Its clear-eyed examination of that research and its implications remains a bioethics classic, and the 30th anniversary of its publication served as the impetus for the reexamination of research ethics that this symposium presents. Recent revelations about the United States Public Health Service study that infected mental patients and prisoners in Guatemala with syphilis in the late 1940s in order to determine the efficacy of treatment represent only one of the many attestations to the persistence of ongoing, critical, and underaddressed issues in research ethics that Bad Blood first explored. Those issues include, but are not limited to: the complex and contested matters of the value of a given research question, the validity of the clinical trial designed to address it, and the priorities of science.
1. WHAT IS ARTIFICIAL INTELLIGENCE? One of the fascinating aspects of the field of artificial intelligence (AI) is that the precise nature of its subject …
If the decades of the forties through the sixties were dominated by discussion of Hempel's “covering law” explication of explanation, the decade of the seventies was preoccupied with Salmon's “statistical relevance” conception, which emerged as the principal alternative to Hempel's enormously influential account. Readers of Wesley C. Salmon's Scientific Explanation and the Causal Structure of the World, therefore, ought to find it refreshing to discover that its author has not remained content with a facile defense of his previous investigations; on the contrary, Salmon offers an original account of different kinds of explications, advances additional criticisms of various alternative theories, and elaborates a novel “two-tiered” analysis of explanation that tacitly depends upon a “two-tiered” account of homogeneity. Indeed, if the considerations that follow are correct, Salmon has not merely refined his statistical relevance account but has actually abandoned it in favor of a “causal/mechanistic” construction. This striking development suggests that the theory of explanation is likely to remain as lively an arena of debate in the eighties as it has been in the past.
Offers a vital, unique and agenda-setting perspective for the field of social epistemology – the philosophical basis for prescribing the social means and ends for pursuing knowledge.
The purpose of this paper is to explore three alternative frameworks for understanding the nature of language and mentality, which accent syntactical, semantical, and pragmatical aspects of the phenomena with which they are concerned, respectively. Although the computational conception currently exerts considerable appeal, its defensibility appears to hinge upon an extremely implausible theory of the relation of form to content. Similarly, while the representational approach has much to recommend it, its range is essentially restricted to those units of language that can be understood in terms of undefined units. Thus, the only alternative among these three that can account for the meaning of primitive units of language is one emphasizing the basic role of skills, habits, and tendencies in relating signs and dispositions.
Globalization has increased the need for managers (and future managers) to predict the potential for country corruption. This study examines the relationship between Hofstede's cultural dimensions and how country corruption is perceived. Power distance, individualism, and masculinity were found to explain a significant portion of the variance in perceived corruption. A significant portion of a country's risk, trade flow with the U.S.A., foreign investment, and per capita income was explained by perceived corruption.
Technological revolutions are dissected into three stages: the introduction stage, the permeation stage, and the power stage. The information revolution is a primary example of this tripartite model. A hypothesis about ethics is proposed, namely, that ethical problems increase as technological revolutions progress toward and into the power stage. Genetic technology, nanotechnology, and neurotechnology are good candidates for impending technological revolutions. Two reasons favoring their candidacy as revolutionary are their high degree of malleability and their convergence. Assuming the emerging technologies develop into mutually enabling revolutionary technologies, we will need better ethical responses to cope with them. Some suggestions are offered about how our approach to ethics might be improved.
This sequel to the widely read Zen and the Brain continues James Austin's explorations into the key interrelationships between Zen Buddhism and brain research. In Zen-Brain Reflections, Austin, a clinical neurologist, researcher, and Zen practitioner, examines the evolving psychological processes and brain changes associated with the path of long-range meditative training. Austin draws not only on the latest neuroscience research and new neuroimaging studies but also on Zen literature and his personal experience with alternate states of consciousness. Zen-Brain Reflections takes up where the earlier book left off. It addresses such questions as: how do placebos and acupuncture change the brain? Can neuroimaging studies localize the sites where our notions of self arise? How can the latest brain imaging methods monitor meditators more effectively? How do long years of meditative training plus brief enlightened states produce pivotal transformations in the physiology of the brain? In many chapters testable hypotheses suggest ways to correlate normal brain functions and meditative training with the phenomena of extraordinary states of consciousness. After briefly introducing the topic of Zen and describing recent research into meditation, Austin reviews the latest studies on the amygdala, frontotemporal interactions, and paralimbic extensions of the limbic system. He then explores different states of consciousness, both the early superficial absorptions and the later, major "peak experiences." This discussion begins with the states called kensho and satori and includes a fresh analysis of their several different expressions of "oneness." He points beyond the still more advanced states toward that rare ongoing stage of enlightenment that is manifest as "sage wisdom." Finally, with reference to a delayed "moonlight" phase of kensho, Austin envisions novel links between migraines and metaphors, moonlight and mysticism. The Zen perspective on the self and consciousness is an ancient one. Readers will discover how relevant Zen is to the neurosciences, and how each field can illuminate the other.
The notion of program verification appears to trade upon an equivocation. Algorithms, as logical structures, are appropriate subjects for deductive verification. Programs, as causal models of those structures, are not. The success of program verification as a generally applicable and completely reliable method for guaranteeing program performance is not even a theoretical possibility.
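The contrast drawn here between algorithms and programs can be made concrete with a standard example that is not from the paper itself: the midpoint-overflow defect in binary search. As a minimal sketch in C, assuming signed 32-bit int, the abstract algorithm below admits a deductive correctness proof, yet its causal realization on a physical machine can still misbehave:

    #include <stdio.h>

    /* Binary search over a sorted array. As a logical structure the
       algorithm is provably correct: each iteration preserves the
       invariant that the key, if present, lies within a[low..high]. */
    int binary_search(const int *a, int n, int key) {
        int low = 0, high = n - 1;
        while (low <= high) {
            /* The verified algorithm computes the exact midpoint; the
               executing program computes it in fixed-width arithmetic.
               For arrays longer than about 2^30 elements, low + high
               can exceed INT_MAX, which is undefined behavior in C,
               so the running program can diverge from the structure
               that was verified. */
            int mid = (low + high) / 2;        /* overflow hazard */
            /* int mid = low + (high - low) / 2;   safe variant */
            if (a[mid] < key)      low = mid + 1;
            else if (a[mid] > key) high = mid - 1;
            else                   return mid;
        }
        return -1;                             /* key not present */
    }

    int main(void) {
        int primes[] = {2, 3, 5, 7, 11, 13};
        printf("%d\n", binary_search(primes, 6, 7));   /* prints 3 */
        return 0;
    }

The repair noted in the comment changes the causal model, not the verified logical structure, which is one way of illustrating the abstract's claim that deductive verification of an algorithm cannot by itself guarantee the performance of the program that realizes it.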
Luciano Floridi offers a theory of information as a “strongly semantic” notion, according to which information encapsulates truth, thereby making truth a necessary condition for a sentence to qualify as “information”. While Floridi provides an impressive development of this position, the aspects of his approach of greatest philosophical significance are its foundations rather than its formalization. He rejects the conception of information as meaningful data, yet the three theses that conception entails – that information can be false; that tautologies are information; and that “It is true that...” is non-redundant – appear to be defensible. This inquiry offers various logical, epistemic, and ordinary-language grounds to demonstrate that an account of his kind is too narrow to be true and that its adoption would hopelessly obscure crucial differences between information, misinformation, and disinformation.
The idea that human thought requires the execution of mental algorithms provides a foundation for research programs in cognitive science, which are largely based upon the computational conception of language and mentality. Consideration is given to recent work by Penrose, Searle, and Cleland, who supply various grounds for disputing computationalism. These grounds in turn qualify as reasons for preferring a non-computational, semiotic approach, which can account for them as predictable manifestations of a more adequate conception. Thinking does not ordinarily require the execution of mental algorithms, which appears to be at best no more than one rather special kind of thinking.
It has been remarked that there is a tendency in all Governments to an augmentation of power at the expense of liberty. But the remark as usually understood does not appear to me well founded.... It is a melancholy reflection that liberty should be equally exposed to danger whether the Government have too much or too little power, and that the line which divides the extremes should be so inaccurately drawn by experience.

Madison, letter to Jefferson, October 17, 1788.
Being able to ask others to do things, and thereby giving them reasons to do those things, is a prominent feature of our interpersonal lives. In this paper, I discuss the distinctive normative status of requests – what makes them different from commands and demands. I argue for a theory of this normative phenomenon which explains the sense in which the reasons presented in requests are a matter of discretion. This discretionary quality, I argue, is something that other theories cannot accommodate, though it is a significant aspect of the relations that people stand in to one another, and the kinds of practical reasons that flow from those relations.
Computer and information ethics, as well as other fields of applied ethics, need ethical theories which coherently unify deontological and consequentialist aspects of ethical analysis. The proposed theory of just consequentialism emphasizes consequences of policies within the constraints of justice. This makes just consequentialism a practical and theoretically sound approach to ethical problems of computer and information ethics.
The standard interpretation of the imitation game is defended over the rival gender interpretation, though it is noted that Turing himself proposed several variations of his imitation game. The Turing test is then justified as an inductive test, not as an operational definition as commonly suggested. Turing's famous prediction about his test being passed at the 70% level is disconfirmed by the results of the Loebner 2000 contest and the absence of any serious Turing test competitors from AI on the horizon. But reports of the death of the Turing test and AI are premature. AI continues to flourish, and the test continues to play an important philosophical role in AI. Intelligence attribution, methodological, and visionary arguments are given in defense of a continuing role for the Turing test. With regard to Turing's predictions, one is disconfirmed, one is confirmed, but another is still outstanding.
John Locke's subtle and influential defense of religious toleration as argued in his seminal _Letter Concerning Toleration_ appears in this edition as introduced by one of our most distinguished political theorists and historians of political thought.