In this paper we explore the intersection of three topics which have historically been singled out for ethical consideration in advertising and marketing: the use of fear appeals, marketing to the elderly, and the marketing of health care services and products. Issues relevant to using fear appeals in promoting health care to the elderly are explored from a consumer psychologist's theoretical view of fear appeals. Next, the assumption of the elderly market's vulnerability and the indicants of social or psychological function which would differentiate elderly recipients of marketing communications are examined, both in terms of function and ethical concerns. Overall, our review of the theoretical underpinnings of fear-based communication and of the relevant psychological characteristics does not indicate that the elderly of today are particularly vulnerable. While the elderly are probably somewhat more dogmatic than younger consumers and perhaps view outcomes from the perspective of their age, there are no indications that their psychological responses to fear-based appeals differ significantly from those of younger consumers.
The managerial ethics literature is used as a base for the inclusion of Ethical Attribution as an element in the consumer's decision process. A situational model of ethical consideration in consumer behavior is proposed and examined for Personal vs. Vicarious effects. Using a path-analytic approach, unique structures are reported for Personal and Vicarious situations in the evaluation of a seller's unethical behavior. An attributional paradigm is suggested to explain the results.
A twofold taxonomy for emergence is presented into which a variety of contemporary accounts of emergence fit. The first taxonomy distinguishes inferential, conceptual, and ontological emergence; the second, diachronic and synchronic emergence. The adequacy of weak emergence, a computational form of inferential emergence, is then examined, and its relationship to conceptual and ontological emergence is detailed. †To contact the author, please write to: Corcoran Department of Philosophy, 120 Cocke Hall, University of Virginia, Charlottesville, VA 22904‐4780; e‐mail: email@example.com.
Evolutionary psychologists, among others, have used a method called "reverse engineering" to uncover (a) whether a trait was selected for, and (b) if so, why that trait was selected for. In this paper I argue that reverse engineering cannot deliver on either (a) or (b), and tends to pervert, rather than enhance, our knowledge of natural history. In particular, I expose as false a fundamental assumption of reverse engineering—namely, that all traits selected for a particular function will share some nontrivial properties. *Received March 2006; revised June 2008. †To contact the author, please write to: Department of Philosophy, 229 Major Williams Hall (0126), Virginia Tech, Blacksburg, VA 24061; e‐mail: firstname.lastname@example.org.
The growing acceptance and success of experimental economics has increased the interest of researchers in tackling philosophical and methodological challenges to which their work increasingly gives rise. I sketch some general issues that call for the combined expertise of experimental economists and philosophers of science, of experiment, and of inductive‐statistical inference and modeling. †To contact the author, please write to: 235 Major Williams, Virginia Tech, Blacksburg, VA 24061‐0126; e‐mail: email@example.com.
The main aim of this paper is to revisit the curve fitting problem using the reliability of inductive inference as a primary criterion for the 'fittest' curve. Viewed from this perspective, it is argued that a crucial concern with the current framework for addressing the curve fitting problem is, on the one hand, the undue influence of the mathematical approximation perspective, and on the other, the insufficient attention paid to the statistical modeling aspects of the problem. Using goodness-of-fit as the primary criterion for 'best', the mathematical approximation perspective undermines the reliability-of-inference objective by giving rise to selection rules which pay insufficient attention to 'accounting for the regularities in the data'. A more appropriate framework is offered by the error-statistical approach, where (i) statistical adequacy provides the criterion for assessing when a curve captures the regularities in the data adequately, and (ii) the relevant error probabilities can be used to assess the reliability of inductive inference. Broadly speaking, the fittest (statistically adequate) curve is not determined by the smallness of its residuals, tempered by simplicity or other pragmatic criteria, but by the nonsystematic (e.g. white noise) nature of its residuals. The advocated error-statistical arguments are illustrated by comparing the Kepler and Ptolemaic models on empirical grounds. ‡I am grateful to Deborah Mayo and Clark Glymour for many valuable suggestions and comments on an earlier draft of the paper; estimating the Ptolemaic model was the result of Glymour's prompting and encouragement. †To contact the author, please write to: Department of Economics, Virginia Tech, 3019 Pamplin Hall (0316), Blacksburg, VA 24061; e-mail: firstname.lastname@example.org.
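The contrast between small residuals and nonsystematic residuals can be sketched numerically. The following is a minimal illustration, not the author's own example or code; the lag-1 autocorrelation diagnostic and the synthetic data are assumptions made for the sketch:

```python
import numpy as np

def lag1_autocorrelation(residuals):
    """Simple whiteness diagnostic: lag-1 autocorrelation of residuals.
    Values near 0 suggest nonsystematic (white-noise-like) residuals;
    values near 1 suggest regularities left unaccounted for."""
    r = residuals - residuals.mean()
    return float(np.sum(r[:-1] * r[1:]) / np.sum(r * r))

rng = np.random.default_rng(0)
x = np.linspace(0, 4 * np.pi, 200)
y = np.sin(x) + 0.1 * rng.standard_normal(x.size)  # true curve: sin(x)

# A misspecified fit: a straight line leaves the sinusoidal regularity
# in its residuals, however its goodness-of-fit is tuned.
line = np.polyval(np.polyfit(x, y, 1), x)
rho_line = lag1_autocorrelation(y - line)

# A curve that captures the regularity leaves white-noise-like residuals
# (here the true curve is known by construction).
rho_curve = lag1_autocorrelation(y - np.sin(x))

print(rho_line, rho_curve)  # rho_line near 1, rho_curve near 0
```

On this toy data the line's residuals are strongly autocorrelated while the adequate curve's are not, which is the sense in which statistical adequacy, not residual size, discriminates between the fits.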
This article calls into question the charge that frequentist testing is susceptible to the base-rate fallacy. It is argued that the apparent similarity between examples like the Harvard Medical School test and frequentist testing is highly misleading. A closer scrutiny reveals that such examples have none of the basic features of a proper frequentist test, such as legitimate data, hypotheses, test statistics, and sampling distributions. Indeed, the relevant error probabilities are replaced with the false positive/negative rates that constitute deductive calculations based on known probabilities among events. As a result, the ampliative dimension of frequentist induction—learning from data about the underlying data-generating mechanism—is missing. *Received August 2009; revised January 2010. †To contact the author, please write to: Department of Economics, Virginia Tech, Blacksburg, VA 24061; e-mail: email@example.com.
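The deductive calculation at issue can be made explicit with the numbers usually attached to the Harvard Medical School test (prevalence 1/1000, false-positive rate 5%, and, implicitly, a perfectly sensitive test). This is a worked illustration of the standard example, not code from the article:

```python
# Known event probabilities, as the example stipulates them:
prevalence = 0.001          # P(disease)
false_positive_rate = 0.05  # P(positive | no disease)
sensitivity = 1.0           # P(positive | disease), implicitly assumed

# Total probability of a positive result.
p_positive = (sensitivity * prevalence
              + false_positive_rate * (1 - prevalence))

# Bayes' theorem: probability of disease given a positive result.
p_disease_given_positive = sensitivity * prevalence / p_positive

print(round(p_disease_given_positive, 4))  # ≈ 0.0196
```

Everything here is computed deductively from stipulated probabilities among events; no data, test statistic, or sampling distribution is involved, which is precisely the disanalogy with a frequentist test that the article presses.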
In cognitive psychology there appears to be a creative tension between models that use connections of a network, and models that use rules for symbol manipulation. The idea of a connectionist network goes back to McCulloch & Pitts and Hebb, and finds recent revival in the `parallel distributed processing' (PDP) models that have been extensively examined in the last few years (see e.g. Rumelhart et al.). In the intervening years, however, the predominant explanations of psychology have been in terms of rules for the manipulation of symbolic structures (e.g. since Newell & Simon). Because of the tension between these different approaches (see e.g. Fodor & Pylyshyn), there is great interest in the formulation of `hybrid systems' which combine connectionist components with traditional symbolic processing. I attempt below to outline the general architecture of such a hybrid system.
Rayleigh and Ramsay discovered the inert gas argon in the atmospheric air in 1895 using a carefully designed sequence of experiments guided by an informal statistical analysis of the resulting data. The primary objective of this article is to revisit this remarkable historical episode in order to make a case that the error‐statistical perspective can be used to bring out and systematize (not to reconstruct) these scientists' resourceful ways and strategies for detecting and eliminating error, as well as dealing with Duhemian ambiguities and underdetermination problems as they arose in the context of their local research settings. *Received December 2009; revised January 2010. †To contact the author, please write to: Department of Economics, 3016 Pamplin Hall, Virginia Tech, Blacksburg, VA 24061; e‐mail: firstname.lastname@example.org.
Mental retardation (MR) is an invented bureaucratic category, currently undergoing radical rethinking and likely renaming, that includes many who have biologically based brain disorders, but is itself determined on functional criteria (e.g., IQ below a certain level) that are purely arbitrary. People with MR are socially vulnerable and thus are more likely to be "naïve confessors", "naïve defendants", and "naïve offenders." That is most likely the (largely unarticulated) rationale and justification for the Supreme Court's decision, in Atkins v. Virginia (2002), to exempt the class from execution. Although the decision is to be applauded as a step in a more humane direction, it is problematic to use an indirect, artificial, and insufficiently inclusive category to determine who should or should not be executed. Limited social intelligence (with consequent social vulnerability) is a characteristic of a wide range of brain-based syndromes and disorders, and is found in many individuals who fall above the (artificial) upper IQ limit and, thus, are ineligible for the MR label and the legal protections associated with it. A more equitable, and logical, policy would be to extend execution exemption to all who demonstrate the same kinds of vulnerabilities, especially if they can be linked to some brain-based medical condition, regardless of whether one qualifies for the (soon to be discarded) label of MR.
Following are excerpts from the keynote speech delivered to the second of the Colloquium 2000 series on applied media ethics by Kevin Klose, president and chief executive officer of National Public Radio. Mr. Klose spoke in the Robert E. Lee chapel at Washington and Lee University in Lexington, Virginia, November 2, 2001. This colloquium sought to unearth global values in media ethics.
From Bauhaus to Dada, from Virginia Woolf to John Dos Passos, the Modernist movement revolutionized the way we perceive, portray, and participate in the world. This landmark anthology is a comprehensive documentary resource for the study of Modernism, bringing together more than 150 key essays, articles, manifestos, and other writings of the political and aesthetic avant-garde between 1840 and 1950. By favoring short extracts over lengthier originals, the editors cover a remarkable range and variety of modernist thinking. Included are not just the familiar high modernist landmarks such as Gustave Flaubert, Ezra Pound, and James Joyce, but also a diverse representation from the sciences, politics, philosophy, and the arts, including Charles Darwin, Thorstein Veblen, W. E. B. Du Bois, Isadora Duncan, John Reed, Adolf Hitler, and Sergei Eisenstein. Another welcome feature is a substantial selection of hard-to-find manifestos from the many modernist movements, among them futurism, cubism, Dada, surrealism, and anarchism.
Machine generated contents note: -- List of Contributors -- Acknowledgments -- Introduction: Towards a New Literary Humanism; A. Mousley -- PART I: LITERATURE AS ERSATZ THEOLOGY: DEEP SELVES -- Introduction; A. Mousley -- Faith, Feeling, Reality: Anne Brontë as an Existentialist Poet; R. Styler -- Virginia Woolf, Sympathy and Feeling for the Human; K. Martin -- Being Human and being Animal in Twentieth-Century Horse-Whispering Writings: 'Word-Bound Creatures' and 'the Breath of Horses'; E. Graham -- Judith Butler and the Catachretic Human; I. Arteel -- PART II: SCEPTICISM, OR HUMANISM AT THE LIMIT -- Introduction; A. Mousley -- Shakespeare's Refusers: Humanism at the Limit; R. Chamberlain -- Why Eliot Killed Lydgate: 'Joyful Cruelty' in Middlemarch; S. Earnshaw -- Atomised: Mary Midgley and Michel Houellebecq; J. Wallace -- Humanity without Itself: Robert Musil, Giorgio Agamben and Posthumanism; I. Callus & S. Herbrechter -- PART III: LITERATURE, DEMOCRACY, HUMANISMS FROM BELOW -- Introduction; A. Mousley -- Mobilising Unbribable Life: The Politics of Contemporary Poetry in Bosnia and Herzegovina; D. Arsenijevic -- HUM (-an, -ane, -anity, -anities, -anism, -anise); M. Robson -- Humanising Marx: Theory and Fiction in the Fin de Siècle British Socialist Periodical; D. Mutch -- Civic Humanism: Said, Brecht and Coriolanus; N. Wood -- References -- Index.
Fish, S. Georgics of the mind: Bacon's philosophy and the experience of his Essays.--Brett, R. L. Thomas Hobbes.--Watt, I. Realism and the novel.--Tuveson, E. Locke and Sterne.--Kampf, L. Gibbon and Hume.--Frye, N. Blake's case against Locke.--Abrams, M. H. Mechanical and organic psychologies of literary invention.--Ryle, G. Jane Austen and the moralists.--Schneewind, J. B. Moral problems and moral philosophy in the Victorian period.--Donagan, A. Victorian philosophical prose: J. S. Mill and F. H. Bradley.--Pitcher, G. Wittgenstein, nonsense, and Lewis Carroll.--Bolgan, A. C. The philosophy of F. H. Bradley and the mind and art of T. S. Eliot: an introduction.--Davie, D. Yeats, Berkeley, and Romanticism.--Ross, M. L. The mythology of friendship: D. H. Lawrence, Bertrand Russell, and "The Blind man".--Rosenbaum, S. P. The philosophical realism of Virginia Woolf.--Bibliography (p. 357-360).
Carol Gilligan has identified two orientations to moral understanding: the dominant justice orientation and the undervalued care orientation. Based on her discernment of a voice of care, Gilligan challenges the adequacy of a deontological liberal framework for moral development and moral theory. This paper examines how the orientations of justice and care are played out in medical ethical theory. Specifically, I question whether the medical moral domain is adequately described by the norms of impartiality, universality, and equality that characterize the liberal ideal. My analysis of justice-oriented medical ethics focuses on the libertarian theory of H.T. Engelhardt and the contractarian theory of R.M. Veatch. I suggest that in the work of E.D. Pellegrino and D.C. Thomasma we find not only a more authentic representation of medical morality but also a project that is compatible with the care orientation's emphasis on human need and responsiveness to particular others.
I explore some new directions, suggested by feminism, for medical ethics and for philosophical ethics generally. Moral philosophers need to confront two issues. The first is deciding which moral issues merit attention. Questions which incorporate the perspectives of women need to be posed, e.g., about the unequal treatment of women in health care, about the roles of physician and nurse, and about relationship issues other than power struggles. "Crisis issues" currently dominate medical ethics, to the neglect of what I call "housekeeping issues." The second issue is how philosophical moral debates are conducted, especially how ulterior motives influence our beliefs and arguments. Both what we select, and neglect, to study as well as the "games" we play may be sending a message as loud as the words we do speak on ethics.
A more complete methodology for normative ethics is needed, and Kierkegaard's philosophy, which emphasizes the individual's role in moral decision-making, can help to meet this need. This essay discusses two ways in which Kierkegaard sought to expand a commonly accepted conception of morality. First, he stressed that the agent changes as part of the process of moral decision-making, with personal experience and insight integral parts of that process. Second, Kierkegaard included within the realm of morality decisions (e.g., about occupation) which are normally viewed as simply matters of personal preference.
This series of responses was commissioned to accompany the article by Singer et al, which can be found at http://www.biomedcentral.com/1472-6939/2/1. If you would like to comment on the article by Singer et al or any of the responses, please email us on email@example.com.
Various fault modes of determinism in classical physics are outlined. It is shown how quantum mechanics can cure some forms of classical indeterminism. †To contact the author, please write to: Department of HPS, University of Pittsburgh, 1017 Cathedral of Learning, Pittsburgh, PA 15260; e‐mail: firstname.lastname@example.org.
In this article I consider the challenges for exporting causal knowledge raised by complex biological systems. In particular, James Woodward's interventionist approach to causality identifies three types of stability in causal explanation: invariance, modularity, and insensitivity. I consider an example of robust degeneracy in genetic regulatory networks and knockout experimental practice to pose methodological and conceptual questions for our understanding of causal explanation in biology. †To contact the author, please write to: Department of History and Philosophy of Science, University of Pittsburgh, 1017 Cathedral of Learning, Pittsburgh, PA 15260; e‐mail: email@example.com.
Newton’s equations of motion tell us that a mass at rest at the apex of a dome with the shape specified here can spontaneously move. It has been suggested that this indeterminism should be discounted since it draws on an incomplete rendering of Newtonian physics, or it is “unphysical,” or it employs illicit idealizations. I analyze and reject each of these reasons. †To contact the author, please write to: Department of History and Philosophy of Science, University of Pittsburgh, Pittsburgh, PA 15260; e‐mail: firstname.lastname@example.org.
This article considers claims that biology should seek general theories similar to those found in physics but argues for an alternative framework for biological theories as collections of prototypical interlevel models that can be extrapolated by analogy to different organisms. This position is exemplified in the development of the Hodgkin‐Huxley giant squid model for action potentials, which uses equations in specialized ways. This model is viewed as an “emergent unifier.” Such unifiers, which require various simplifications, involve the types of heuristics discussed in Wimsatt’s writings on reduction, but with a twist. Here, the heuristics are used to generate emergent rather than reductive explanations. †To contact the author, please write to: Department of History and Philosophy of Science, University of Pittsburgh, 1017 Cathedral of Learning, Pittsburgh, PA 15260; e‐mail: email@example.com.
The epistemic state of complete ignorance is not a probability distribution. In it, we assign the same, unique, ignorance degree of belief to any contingent outcome and each of its contingent, disjunctive parts. That this is the appropriate way to represent complete ignorance is established by two instruments, each individually strong enough to identify this state. They are the principle of indifference (PI) and the notion that ignorance is invariant under certain redescriptions of the outcome space, here developed into the 'principle of invariance of ignorance' (PII). Both instruments are so innocuous as almost to be platitudes. Yet the literature in probabilistic epistemology has misdiagnosed them as paradoxical or defective since they generate inconsistencies when conjoined with the assumption that an epistemic state must be a probability distribution. To underscore the need to drop this assumption, I express PII in its most defensible form as relating symmetric descriptions and show that paradoxes still arise if we assume the ignorance state to be a probability distribution. *Received February 2007; revised July 2007. †To contact the author, please write to: Department of History and Philosophy of Science, University of Pittsburgh, Pittsburgh, PA 15260; e-mail: firstname.lastname@example.org.
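A standard textbook-style illustration of the inconsistency (not drawn from the paper itself) can be set out in two lines:

```latex
% Complete ignorance over which of three outcomes A, B, C holds.
% Applying the principle of indifference to the three-way description:
\[
  \text{PI over } \{A, B, C\}: \quad P(A) = P(B) = P(C) = \tfrac{1}{3}.
\]
% Redescribing the same outcome space as A versus its complement and
% applying PI again:
\[
  \text{PI over } \{A, \neg A\}: \quad P(A) = P(\neg A) = \tfrac{1}{2}.
\]
% Each assignment is licensed by PI, yet they contradict one another.
% The contradiction depends on assuming the ignorance state is a single
% probability distribution; dropping that assumption dissolves it.
```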
The rotating discs argument (RDA) against perdurantism has been mostly discussed by metaphysicians, though the argument of course appeals to ideas from classical mechanics, especially about rotation. In contrast, I assess the RDA from the perspective of the philosophy of physics. I argue for three main conclusions. The first conclusion is that the RDA can be formulated more strongly than is usually recognized: it is not necessary to imagine away the dynamical effects of rotation. The second is that in general relativity, the RDA fails because of frame-dragging. The third conclusion is that even setting aside general relativity, the strong formulation of the RDA can after all be defeated. Namely, by the perdurantist taking objects in classical mechanics (whether point-particles or continuous bodies) to have only temporally extended, i.e. non-instantaneous, temporal parts: which immediately blocks the RDA. Admittedly, this version of perdurantism defines persistence in a weaker sense of `definition' than pointilliste versions that aim to define persistence assuming only instantaneous temporal parts. But I argue that temporally extended temporal parts: (i) can do the jobs within the endurantism-perdurantism debate that the perdurantist wants temporal parts to do; and (ii) are supported by both classical and quantum mechanics. This is an extract from a much longer paper, which is available at: http://philsci-archive.pitt.edu/archive/00001760. The main differences are that the longer paper: (i) gives much more detail about the form and scope of the RDA, the interpretative subtleties of classical mechanics, and the physics of rotation; and (ii) reports and assesses several other replies to the RDA, especially those by Callender, Lewis, Robinson and Sider.
Robert MacArthur's mathematical ecology is often regarded as ahistorical and has been criticized by historically oriented ecologists and philosophers for ignoring the importance of history. I clarify and defend his approach, especially his use of simple mathematical models to explain patterns in data and to generate predictions that stimulate empirical research. First I argue that it is misleading to call his approach ahistorical because it is not against historical explanation. Next I distinguish three kinds of criticism of his approach and argue that his approach is compatible with the first two of them. Finally, I argue that the third kind of criticism, advanced by Kim Sterelny and Paul Griffiths, is largely irrelevant to MacArthur's approach. ‡I am especially grateful to Thomas Nickles for encouragement and helpful comments on earlier versions of this paper. Thanks also to Guy Hoelzer, Stephen Jenkins, and Jay Odenbaugh for comments on an earlier draft, Kim Sterelny for clarifications of the Tasmania example, Gregory Mikkelson for references, and the audience at PSA 2006 for discussions. †To contact the author, please write to: Department of History and Philosophy of Science, University of Pittsburgh, 1017 Cathedral of Learning, Pittsburgh, PA 15260; e-mail: email@example.com.
Sewall Wright and Gustave Malécot developed important theories of isolation by distance. Wright's theory was statistical and Malécot's probabilistic. Because of this mathematical difference, they were not clear about the relationship between their theories. In this paper, I make two points to clarify this relationship. First, I argue that Wright's theory concerns what I call ecological isolation by distance, whereas Malécot's concerns what I call genetic isolation by distance. Second, I suggest that if Wright's theory is interpreted appropriately, a previously unnoticed connection between the two theories emerges. †To contact the author, please write to: Yoichi Ishida, Department of History and Philosophy of Science, University of Pittsburgh, 1017 Cathedral of Learning, Pittsburgh, PA 15260; e‐mail: firstname.lastname@example.org.
Call a thought whose expression involves the utterance of an indexical an indexical thought. Thus, my thoughts that I'm annoyed, that now is not the right time, that this is not acceptable, are all indexical thoughts. Such thoughts present a prima facie problem for the thesis that thought contents are phenomenally individuated (i.e., that each distinct thought type has a proprietarily cognitive phenomenology such that its having that phenomenology makes it the thought that it is), given the assumption that phenomenology is intrinsically determined.
The arguments of Fodor, Garrett, Walker and Parkes [(1980) Against definitions, Cognition, 8, 263-367] are the source of widespread skepticism in cognitive science about lexical semantic structure. Whereas the thesis that lexical items, and the concepts they express, have decompositional structure (i.e. have significant constituents) was at one time "one of those ideas that hardly anybody [in the cognitive sciences] ever considers giving up" (p. 264), most researchers now believe that "[a]ll the evidence suggests that the classical [(decompositional)] view is wrong as a general theory of concepts" [Smith, Medin & Rips (1984) A psychological approach to concepts: comments on Rey, Cognition, 17, 272], and cite Fodor et al. (1980) as "sounding the death knell for decompositional theories" [MacNamara & Miller (1989) Attributes of theories of meaning, Psychological Bulletin, 106, 360]. I argue that the prevailing skepticism is unmotivated by the arguments in Fodor et al. Fodor et al. misrepresent the form, function and scope of the decompositional hypothesis, and the procedures they employ to test for the psychological reality of definitions are flawed. I argue, further, that decompositional explanations of the phenomena they consider are preferable to their primitivist alternatives, and, hence, that there is prima facie reason to accept them as evidence for the existence of decompositional structure. Cognitive scientists would, therefore, do well to revert to their former commitment to the decompositional hypothesis.