Hommel et al. emphasize that the Theory of Event Coding (TEC)'s utility is not its ability to be a new theory of cognition, but its ability to engender new thinking about new and old problems. In this commentary we use the TEC to re-examine a long-standing discrepancy in the attention literature.
The Meno, one of the most widely read of the Platonic dialogues, is seen afresh in this original interpretation that explores the dialogue as a theatrical presentation. Just as Socrates's listeners would have questioned and examined their own thinking in response to the presentation, so, Klein shows, should modern readers become involved in the drama of the dialogue. Klein offers a line-by-line commentary on the text of the Meno itself that animates the characters and conversation and carefully probes each significant turn of the argument. "A major addition to the literature on the Meno and necessary reading for every student of the dialogue."--Alexander Seasonske, Philosophical Review "There exists no other commentary on Meno which is so thorough, sound, and enlightening."--Choice Jacob Klein (1899-1978) was a student of Martin Heidegger and a tutor at St. John's College from 1937 until his death. His other works include Plato's Trilogy: Theaetetus, the Sophist, and the Statesman, also published by the University of Chicago Press.
In this paper, I first consider a famous objection that the standard interpretation of the Lockean account of diachronicity (i.e., one’s sense of personal identity over time) via psychological connectedness falls prey to breaks in one’s personal narrative. I argue that recent case studies show that while this critique may hold with regard to some long-term autobiographical self-knowledge (e.g., episodic memory), it carries less warrant with respect to accounts based on trait-relevant, semantic self-knowledge. The second issue I address concerns the question of diachronicity from the vantage point that there are (at least) two aspects of self—the self of psychophysical instantiation (what I term the epistemological self) and the self of first person subjectivity (what I term the ontological self; for discussion, see Klein SB, The self and its brain, Social Cognition, 30, 474–518, 2012). Each is held to be a necessary component of selfhood, and, in interaction, they appear jointly sufficient for a synchronic sense of self (Klein SB, The self and its brain, Social Cognition, 30, 474–518, 2012). As pertains to diachronicity, by contrast, I contend that while the epistemological self, by itself, is precariously situated to do the work required by a coherent theory of personal identity across time, the ontological self may be better positioned to take up the challenge.
This is the third draft of a paper that aims to clarify the apparent contradictions in the views presented in certain standards and other specifications of health informatics systems, contradictions which come to light when the latter are evaluated from the perspective of realist philosophy. One of the origins of this document was Klein’s discussion paper of 2005-07-02 entitled “Conceptology vs Reality” and the responses from Smith, as well as the several hours of discussions during the 2005 MIE meeting in Geneva.
The third Earl of Shaftesbury was a pivotal figure in eighteenth-century thought and culture. Professor Klein's study is the first to examine the extensive Shaftesbury manuscripts and offer an interpretation of his diverse writings as an attempt to comprehend contemporary society and politics and, in particular, to offer a legitimation for the new Whig political order established after 1688. As the focus of Shaftesbury's thinking was the idea of politeness, this study involves the first serious examination of the importance of the idea of politeness in the eighteenth century for thinking about society and culture and organising cultural practices. Through politeness, Shaftesbury conceptualised a new kind of public and critical culture for Britain and Europe, and greatly influenced the philosophical and cultural models associated with the European Enlightenment.
This review discussion outlines Justin Barrett’s Preparedness Model. This evolutionary model for belief in God is shown to posit a maladaptive mind for infants. Questions about its implications and the supporting data are considered. (Dwayne Raymond, Department of Philosophy, Texas A&M University, College Station, TX, USA. Sophia, pp. 1-3, DOI 10.1007/s11841-012-0300-x; Online ISSN 1873-930X, Print ISSN 0038-1527.)
This book casts new light on the traditional disagreement between those who hold that we cannot be morally responsible for our actions if they are causally determined, and those who deny this. Klein suggests that reflection on the relation between justice and deprivation offers a way out of this perplexity.
What is the meaning of the word `grace'? Can Wittgenstein's maxim that the meaning of a word is its usage help explicate the claims that Christians have made about grace? When Christians use the word, they reference within language the point of contact between humanity and the divine. Terrance W. Klein suggests that grace is not an occult object but rather an insight, a moment when we perceive God to be active on our behalf. Klein examines the biblical evidence that grace begins as a recognition of God's favour, before considering Augustine as the theologian who champions history rather than nature as the place of encounter with grace. Aquinas' work on grace is also explored, retrieving the saint's thought on three seminal concepts: nature, form, and the striving intellect. Overall, Klein suggests that grace is the perception of a form, an awareness that the human person is being addressed by the world itself.
Klein, Renate The practice of surrogacy in Australia has been controversial since its beginning in the late 1980s. In 1988, the famous 'Kirkman case' in the state of Victoria put surrogacy on the national map. This was a two-sisters surrogacy - Linda and Maggie Kirkman and the resulting baby Alice - in which power differences between the two women were extraordinarily stark: Maggie was the glamorous and well-spoken woman of the world; Linda, who carried the baby, was the demure school teacher in child-like frocks and pigtails. Their IVF doctor applauded altruistic surrogacy. He called it 'gestational surrogacy' and proclaimed that if the so-called surrogate mother didn't use her own eggs, and thus wasn't the baby's 'genetic' mother, no attachment would ensue! This statement is haunting us to this day. It is patently absurd: as a baby grows in a woman's body over the nine months of the pregnancy, it is hard to see why the 24/7 presence of the baby inside her body, its growth, its interaction with her (movements, the baby's kicking) would be any different depending on whether s/he has the mother's genes!
Academic freedom has become the enemy of individual professors working in colleges and universities across the United States. Despite its historical (and maybe even essential) roots in the First Amendment, contemporary case law has consistently shown that professors, unlike most members of society, have no rights to free speech on their respective campuses. (Ironically, this is especially true on our State campuses.) Outlined is the dramatic change in the history of the courts from recognizing “academic freedom” as a construct needed to protect professors from the status quo, to the abuse of “academic freedom” appropriated to protect the institution from “undesirable” professorial actions such as politically incorrect speech or research. Klein warns all those in the academy to become familiar with this pernicious 180-degree turn in the use of the “academic freedom” construct.
In this study, we examine differences in cheating behaviors in higher education between two countries, namely the United States and the Czech Republic, which differ in many social, cultural and political aspects. We compare a recent (2011) Czech Republic survey of 291 students to that of 268 students in the US (Klein et al., 2007). For all items surveyed, CR students showed a higher propensity to engage in cheating. Additionally, we found more forms of serious cheating present in the Czech sample. In all cases, the differences between the US and Czech samples were statistically significant.
The purpose of this paper is to explain how infinitism—the view that reasons are endless and non-repeating—solves the epistemic regress problem and to defend that solution against some objections. The first step is to explain what the epistemic regress problem is and, equally important, what it is not. Second, I will discuss the foundationalist and coherentist responses to the regress problem and offer some reasons for thinking that neither response can solve the problem, no matter how they are tweaked. Then, I want to present the infinitist solution to the problem and defend it against some of the well-known objections to it.
Functional neuroimaging (NI) technologies like Positron Emission Tomography and functional Magnetic Resonance Imaging (fMRI) have revolutionized neuroscience, and provide crucial tools to link cognitive psychology and traditional neuroscientific models. A growing discipline of 'neurophilosophy' brings fMRI evidence to bear on traditional philosophical issues such as weakness of will, moral psychology, rational choice, social interaction, free will, and consciousness. NI has also attracted critical attention from psychologists and from philosophers of science. I review debates over the evidential status of fMRI, including the differences between brain scans and ordinary images, the legitimacy of forward inference and reverse inference, and deductive versus probabilistic accounts of NI evidence. I conclude with a discussion of fMRI as exploratory rather than confirmatory evidence, linking this debate to the growing literature on cognitive ontology.
In this essay, I examine the arguments against physician-assisted suicide (PAS) Susan Wolf offers in her essay, "Gender, Feminism, and Death: Physician-Assisted Suicide and Euthanasia." I argue that Wolf's analysis of PAS, while timely and instructive in many ways, does not require that feminists reject policy approaches that might permit PAS. The essay concludes with reflections on the relationship between feminism and questions of agency, especially women's agency.
Unlike the overall framework of Ernest Nagel's work on reduction, his theory of intertheoretic connection still has life in it. It handles aptly cases where reduction requires complex representation of a target domain. Abandoning his formulation as too liberal was a mistake. Arguments that it is too liberal at best touch only Nagel's deductivist theory of explanation, not his condition of connectability. Taking this condition seriously gives a powerful view of reduction, but one which requires us to index explanatory power to sciences as they are formulated at particular times. While we may thereby reduce more than philosophers have supposed, we must abandon hope (as Nagel did) of saying anything useful about reductionism.
fMRI promises to uncover the functional structure of the brain. I argue, however, that pictures of ‘brain activity' associated with fMRI experiments are poor evidence for functional claims. These neuroimages present the results of null hypothesis significance tests performed on fMRI data. Significance tests alone cannot provide evidence about the functional structure of causally dense systems, including the brain. Instead, neuroimages should be seen as indicating regions where further data analysis is warranted. This additional analysis rarely involves simple significance testing, and so justified skepticism about neuroimages does not provide reason for skepticism about fMRI more generally.
Multiply realizable properties are those whose realizers are physically diverse. It is often argued that theories which contain them are ipso facto irreducible. These arguments assume that physical explanations are restricted to the most specific descriptions possible of physical entities. This assumption is descriptively false, and philosophically unmotivated. I argue that it is a holdover from the late positivist axiomatic view of theories. A semantic view of theories, by contrast, correctly allows scientific explanations to be couched in the most perspicuous, powerful language available. On a semantic view, traditional notions of multiple realizability are thus very hard to motivate. At best, one must abandon either the idea that multiple realizability is an interesting scientific notion, or else admit that multiply realizable properties do not automatically block scientific reductions.
Amputation of a limb can result in the persistent hallucination that the limb is still present [Ramachandran and Hirstein, 1998]. Distressingly, these so-called ‘phantom limbs’ are often quite painful. Of a friend whose arm had been amputated due to gas gangrene, W.K. Livingston writes: I once asked him why the sense of tenseness in the hand was so frequently emphasized among his complaints. He asked me to clench my fingers over my thumb, flex my wrist, and raise the arm into a hammerlock position and hold it there. He kept me in this position as long as I could stand it. At the end of five minutes I was perspiring freely, my hand and arm felt unbearably cramped, and I quit. ‘But you can take your hand down,’ he said. (quoted in [Melzack, 1973] 53) In addition to the obvious medical issues, phantom limb pain also presents philosophical problems. Here’s a thorny one: are phantom limb pains hallucinations of pain?
Functional magnetic resonance imaging (or fMRI) is widely used to support hypotheses about brain function. Many find the images produced from fMRI data to be especially compelling evidence for scientific hypotheses [McCabe and Castel, 2008]. There are many problems with all of this; I want to start with two of them, and argue that they get us closer to an under-appreciated worry about many imaging experiments.
The dual-track theory of moral reasoning has received considerable attention due to the neuroimaging work of Greene et al. Greene et al. claimed that certain kinds of moral dilemmas activated brain regions specific to emotional responses, while others activated areas specific to cognition. This appears to indicate a dissociation between different types of moral reasoning. I re-evaluate these claims of specificity in light of subsequent empirical work. I argue that none of the cortical areas identified by Greene et al. are functionally specific: each is active in a wide variety of both cognitive and emotional tasks. I further argue that distinct activation across conditions is not strong evidence for dissociation. This undermines support for the dual-track hypothesis. I further argue that moral decision-making appears to activate a common network that underlies self-projection: the ability to imagine oneself from a variety of viewpoints in a variety of situations. I argue that the utilization of self-projection indicates a continuity between moral decision-making and other kinds of complex social deliberation. This may have normative consequences, but teasing them out will require careful attention to both empirical and philosophical concerns.
Robert Rupert is well-known as a vigorous opponent of the hypothesis of extended cognition (HEC). His Cognitive Systems and the Extended Mind is a first-rate development of his “systems-based” approach to demarcating the mind. The results are impressive. Rupert’s account brings much-needed clarity to the often-frustrating debate over HEC: much more than just an attack on HEC, he gives a compelling picture of why the debate matters.
Multiply realizable kinds are scientifically problematic, for it appears that we should not expect discoveries about them to hold of other members of that kind. As such, it looks like MR kinds should have no place in the ontology of the special sciences. Many resist this conclusion, however, because we lack a positive account of the role that certain realization-unrestricted terms play in special science explanations. I argue that many such terms actually pick out idealizing models. Idealizing explanation has many of the features normally associated with explanation by MR kinds. As idealized models are usually mere possibilia, such explanations do not run afoul of the metaphysical problems that plague MR kinds.
A conclusion drawn after a conference devoted (in 1995) to the “arrow of time” was the following: “Indeed, it seems not a very great exaggeration to say that the main problem with ‘the problem of the direction of time’ is to figure out exactly what the problem is supposed to be!” What does that mean? That more than 130 years after the work of Ludwig Boltzmann on the interpretation of irreversibility of physical phenomena, and one century after Einstein’s formulation of Special Relativity, we are still not sure what we mean when we talk of “time” or “arrow of time”. We shall try to show that one source of this difficulty is our tendency to confuse, at least verbally, time and becoming, i.e. the course of time and the arrow of time, two concepts that the formalisms of modern physics are careful to distinguish.
Philosophers have sought to characterize a type of knowledge — what I call real knowledge — which is significantly different from the ordinary concept of knowledge. The concept of knowledge as true, justified belief — what I call knowledge simpliciter — failed to depict the sought-after real knowledge because the necessary and jointly sufficient conditions of knowledge simpliciter can be felicitously but accidentally fulfilled. Real knowledge is knowledge simpliciter plus a set of requirements which guarantee that the truth, belief and justification conditions are not accidentally conjoined. Two of those requirements have received considerable attention in recent literature by the defeasibility theorists and the causal theorists. I argue that a third requirement is needed to block the merely coincidental co-satisfaction of the belief and justification conditions and to capture our intuitions about the epistemic agent who possesses real knowledge. That condition ascribes a disposition to the real knower to believe all and only justified propositions in virtue of his/her belief that the propositions are justified. Two consequences of that requirement are discussed: (1) if S really knows that p, then S knows simpliciter that S knows simpliciter that p and (2) the iterative feature of real knowledge mentioned in (1) provides a basis for the rejection of a particularly pernicious form of scepticism.
Associationist psychologists of the late 19th century premised their research on a fundamentally Humean picture of the mind. So the very idea of mental science was called into question when T. H. Green, a founder of British idealism, wrote an influential attack on Hume’s Treatise. I first analyze Green’s interpretation and criticism of Hume, situating his reading with respect to more recent Hume scholarship. I focus on Green’s argument that Hume cannot consistently admit real ideas of spatial relations. I then argue that William James’s early work on spatial perception attempted to vindicate the new science of mind by showing how to avoid the problems Green had exposed in Hume’s empiricism. James’s solution involved rejecting a basic Humean assumption—that perceptual experience is fundamentally composed of so-called minima sensibilia, or psychological atoms. The claim that there are no psychological atoms is interesting because James supported it with experimental data rather than (as commentators typically suppose) with introspective description or a priori argument. James claimed to be the real descendant of British empiricism on grounds that his anti-atomistic model of perception fortified what Green had perhaps most wanted to demolish—the prospect of using empirical, scientific methods in the study of mind.
The concept of empiricism evokes both a historical tradition and a set of philosophical theses. The theses are usually understood to have been developed by Locke, Berkeley, and Hume. But these figures did not use the term “empiricism,” and they did not see themselves as united by a shared epistemology into one school of thought. My dissertation analyzes the debate that elevated the concept of empiricism (and of an empiricist tradition) to prominence in English-language philosophy. In the 1870s and ’80s a lively debate about psychology emerged. Neo-Kantian idealists criticized the very idea that the mind can be studied scientifically. A group of philosopher-psychologists responded, often in Mind. They were among the first to call themselves “empiricists,” arguing that psychology could provide a scientific basis for philosophical progress. Idealists held that empirical psychology depended on premises developed by Locke, Berkeley, and Hume. These premises were allegedly absurd because they rendered ideas of extension, as well as other ideas crucial to natural science, unreal. Those who wanted to advance psychology towards becoming a legitimate science were forced to engage these philosophical attacks, while at the same time developing empirical theories that could successfully explain some characteristics of experience. I show how James’s theory of space perception accomplished both tasks. In developing this theory, James found he had to reject the Lockean notion that reality is associated with passively-registered sensations. James also abandoned Berkeley and Hume’s claim that ideas are ultimately derived from atomic sensations. Instead, James presented experimental evidence that sensation is a continuous stream. The mind must actively parse this stream if it is to gain a coherent representation of its environment. I argue that James’s stream-of-thought thesis served as a presupposition of his entire psychology. The thesis showed how the labor of investigating the mind could be divided between philosophers and scientists, and in a manner sensitive to the concerns of both. The stream thesis also provided a scientific basis for a new philosophical empiricism that, I argue, has a hidden legacy in the history of analytic philosophy.
Maura Tumulty has raised two objections to my imperative account of pain. First, she argues that there is a disanalogy between pains and other imperative sensations like itch, hunger, and thirst. Suppose (with Hall) one thinks that an itch says “Scratch here!” Scratch the itch, and it dutifully disappears. Not so with pain. The pain of a broken ankle has the content ‘Do not put weight on that ankle!’ Yet the coddled ankle still throbs: obeying the imperative does not extinguish it. Second, Tumulty argues that the imperative account cannot handle certain pains, particularly pains of the deep viscera. On my account, pains proscribe taking action with the painful body part. Yet some pains are associated with body parts over which we have no control. Kidney stones cause intense pain, but I cannot (voluntarily) control my kidney. What action, then, could that pain possibly proscribe? Lacking such a story, it is hard to say (as I do) that pains are exhausted by their imperative content.
Modern logicians have sought to unlock the modal secrets of Aristotle's Syllogistic by assuming a version of essentialism and treating it as a primitive within the semantics. These attempts ultimately distort Aristotle's ontology. None of these approaches make full use of tests found throughout Aristotle's corpus and ancient Greek philosophy. I base a system on Aristotle's tests for things that can never combine (polarity) and things that can never separate (inseparability). The resulting system not only reproduces Aristotle's recorded results for the apodictic syllogistic in the Prior Analytics but it also generates rather than assumes Aristotle's distinctions among 'necessary', 'essential' and 'accidental'. By developing a system around tests that are in Aristotle and basic to ancient Greek philosophy, the system is linked to a history of practices, providing a platform for future work on the origins of logic.
Kohler's experiments with inverting goggles are often thought to support enactivism by showing that visual re-inversion occurs simultaneously with the return of sensorimotor skill. Closer examination reveals that Kohler's work does not show this. Recent work by Linden et al. shows that re-inversion, if it occurs at all, does not occur when the enactivist predicts. As such, the empirical evidence weighs against enactivism.
Hans Reichenbach's so-called geometrical conventionalism is often taken as an example of a positivistic philosophy of science, based on a verificationist theory of meaning. By contrast, we shall argue that this view rests on a misinterpretation of Reichenbach's major work in this area, the Philosophy of Space and Time (1928). The conception of equivalent descriptions, which lies at the heart of Reichenbach's conventionalism, should be seen as an attempt to refute Poincaré's geometrical relativism. Based upon an examination of the reasons Reichenbach gives for the cognitive equivalence of geometrical descriptions, the paper argues that his conventionalism is a specific form of scientific realism. At the same time we shall argue against those interpretations which lead to a trivialization of Reichenbach's conventionalism or deny it entirely.
Corballis seems not to have considered two points: (1) the importance of direct selection pressures for the evolution of handedness; and (2) the evolutionary significance of the polymorphism of handedness. We provide arguments for the need to explain handedness in terms of adaptation and natural selection.
How can we help people develop judgement and decision skills? One approach is to teach formal methods such as decision analysis, but these are difficult to apply in ill-structured settings, and the methods are unworkable under time pressure or in uncertain conditions. If we regard these skills as types of expertise that can be developed, then in a given domain we may attempt to define the cues, patterns, and strategies used by experts, and develop a programme to teach people how to think like experts. However, in many settings this can be impractical. A different approach to decision skills training is to teach people how to learn like experts. By identifying a set of strategies used by experts to develop their proficiency at decision making, we can develop a programme aimed at helping people become reflective practitioners. This article describes such a training programme.
Eco, Chopin, and the limits of intertextuality -- The appeal to structure -- On codes, topics, and leaps of interpretation -- Bloom, Freud, and Riffaterre : influence and intertext as signs of the uncanny -- Narrative and intertext : the logic of suffering in Lutosławski's Symphony no. 4.
Consciousness supervenes on activity; computation supervenes on structure. Because of this, some argue, conscious states cannot supervene on computational ones. If true, this would present serious difficulties for computationalist analyses of consciousness (or, indeed, of any domain with properties that supervene on actual activity). I argue that the computationalist can avoid the Superfluous Structure Problem (SSP) by moving to a dispositional theory of implementation. On a dispositional theory, the activity of computation depends entirely on changes in the intrinsic properties of (...) implementing material. As extraneous structure is not required for computation, a system can implement a program running on some but not all possible inputs. Dispositional computationalism thus permits episodes of computational activity that correspond to potential episodes of conscious awareness. The SSP cannot be motivated against this account, and so computationalism may be preserved. (shrink)
In his 1981 article "What is 'business ethics'?" Peter Drucker maintains that the then current business ethics literature is a form of casuistry, and it provides an illegitimate argument for business apologists, while it also unjustly bashes business. I agree with W. Michael Hoffman's and Jennifer Mills Moore's criticisms of Drucker's article. However, by limiting themselves to this article, rather than considering Drucker's management works, they have missed an opportunity to benefit from his acknowledged practical wisdom. In this paper, I seize the opportunity to show that Drucker takes business ethics seriously, and I develop his position on business morality. His view of business management responsibility and the related notion of a just organization is seen to be essentially Platonic.
It would be nice if our definition of ‘physical’ incorporated the distinctive content of physics. Attempts at such a definition quickly run into what’s known as Hempel’s dilemma. Briefly: when we talk about ‘physics’, we refer either to current physics or to some idealized version of physics. Current physics is likely wrong and so an unsuitable basis for a definition. ‘Ideal physics’ can’t itself be cashed out except as the science which has completed an accurate survey of the physical; appeals to it to define the physical must therefore end up trivial or circular. So defining the physical in terms of physics looks like a non-starter.
The debate over off-line simulation has largely focussed on the capacity to predict behavior, but the basic idea of off-line simulation can be cast in a much broader framework. The central claim of the off-line account of behavior prediction is that the practical reasoning mechanism is taken off-line and used for predicting behavior. However, there's no reason to suppose that the idea of off-line simulation can't be extended to mechanisms other than the practical reasoning system. In principle, any cognitive component can be taken off-line and used to perform some other function. On this view of off-line simulation, such accounts differ radically from traditional information-based accounts of cognitive capacities. And cognitive penetrability provides a wedge for empirically determining whether a capacity requires an information-based account or an off-line simulation account. Stich and Nichols (1992) argued that the simulation theory of behavior prediction was inadequate because behavior prediction seemed to be cognitively penetrable. We present empirical evidence that supports the claim that behavior prediction is cognitively penetrable. As a result, the simulation account of behavior prediction still seems unpromising. However, off-line simulation might provide accounts of other cognitive capacities. Indeed, off-line simulation accounts have recently been offered for a strikingly diverse set of capacities including counterfactual reasoning, empathy and mental imagery. Goldman, for instance, maintains that counterfactual reasoning and empathy clearly demand off-line simulation accounts. We argue that there are alternative information-based explanations of these phenomena. Nonetheless, the off-line accounts of these phenomena are interesting and clearly worthy of further exploration.
I argue and demonstrate in this essay that interconnected systems of science and technology, or technoscience, existed long before the late nineteenth century, and that eighteenth-century chemistry was such an early form of technoscience. Based on recent historical research on the early development of carbon chemistry from the late 1820s until the 1840s—which revealed that early carbon chemistry was an experimental expert culture that was largely detached from the mundane industrial world—I further examine the question of the internal preconditions within the expert culture of carbon chemistry that contributed to its convergence with the synthetic-dye industry in the late 1850s. I argue that the introduction of new types and techniques of organic-chemical reactions and organic substances in this experimental expert culture, along with the application of chemical formulae as paper tools for modeling reactions as well as the chemical constitution and structure of substances, enabled academic chemists to make specific, novel contributions to chemical technology and industry in the second half of the nineteenth century.
When it comes to cheating in higher education, business school students have often been accused of being the worst offenders; if true, this may be a contributing factor in the kinds of fraud that have plagued the business community in recent years. We examined the issue of cheating in the business school by surveying 268 students in business and other professional schools on their attitudes about, and experiences with, cheating. We found that while business school students actually cheated no more or less than students in other professional schools, their attitudes about what constitutes cheating were more lax than those of other professional school students. Additionally, we found that serious cheaters across all professional schools were more likely to be younger and to have a lower grade point average.
Plato's paradigm for statesmanship in the Statesman, the weaving of temperate and courageous qualities, provides the contemporary business ethics theorist with an aid for identifying certain problems and solutions with regard to business leadership. The history of American business values manifests the destructive, and especially unethical, effects of deviating from this paradigm by over-emphasizing one or the other of these types of qualities. However, with the aid of Plato's model for leadership in the Statesman and suggestions from Peters and Waterman's In Search of Excellence, progress can be made toward constructing an adequate model for corporate leadership, especially from an ethical standpoint.
I distinguish between two problems related to business ethics. (1) How can business ethics help morally conscientious business people to resolve moral problems in business? (2) Given the widespread belief that immorality, or at least amorality, is too prevalent in business, how can one discover both the sources of business amorality and immorality and make business as morally respectable an institution as possible? Philosophers who have concerned themselves with business ethics have emphasized (1), i.e., they consider the normative ethical principles applicable to solving moral questions in business. Although some benefit can be derived from this approach, there are a number of problems with this position. I then argue that, in considering (2), we ought to analyze business life styles (ideals) that have determined the character of American business people, and show both their negative and positive moral consequences. This analysis reveals the morality, or lack of it, in modern American business, possible changes in business morality, and possible ways of developing a desirable and viable business ethic. In a sketchy way, I show how this project can be developed.
Showing that a radical feminist analysis cuts across class, race, sexuality, region, and religion, the varied contributors in this collection reveal the global reach of radical feminism and analyze the causes and solutions to patriarchal oppression.
There has been much discussion concerning the consequences of 'going natural', i.e., of replacing a priori epistemology with empirical psychology. Traditionalists claim that a naturalized epistemology is not viable—to eliminate the normative from an account of knowledge is to cease to do epistemology at all. Naturalists claim that a naturalized account is the only viable one—assuming, in step with the urgings of Quine, that there are no standards independent of (and external to) science, science itself must act as the sole epistemic norm. In the wake of the above debate some epistemologists have attempted to argue in favor of, and develop, a middle-ground position—normative naturalism. Such a position is intended to be consistent with the naturalist's intuition that the traditional search for a 'first philosophy' is misguided and consistent with the traditionalist's intuition that a complete elimination of the normative would leave epistemology impotent. In this paper I will examine one argument in favor of a normative naturalism. I will show that such a proposal is inherently problematic and argue that naturalism and normativity are mutually exclusive concepts. Therefore, I suggest that, even in the 'age of cognitive science', epistemologists continue to do traditional epistemology by attempting to develop, a priori, the general criteria for determining the justificatory status of our (scientific) beliefs.
Noting that academic writing typically falls in the category of work, this piece considers the relationship such writing might have with love. Animated by its observation that love's affinity with wholeness distinguishes it from work's tendency to divide a subject from herself, the essay playfully develops this contrast by telling a story of writing and wholeness. This story attempts to embody the contrasts of which it speaks, and in the process, to discover a counterpoint to the work of writing.
In the last few years, off-line simulation has become an increasingly important alternative to standard explanations in cognitive science. The contemporary debate began with Gordon's (1986) and Goldman's (1989) off-line simulation account of our capacity to predict behavior. On their view, in predicting people's behavior we take our own decision making system 'off line' and supply it with the 'pretend' beliefs and desires of the person whose behavior we are trying to predict; we then let the decision maker reach a decision on the basis of these pretend inputs. Figure 1 offers a 'boxological' version of the off-line simulation theory of behavior prediction.
The allegedly alternative theories of Phyletic Gradualism and Punctuated Equilibria are examined as regards the nature of their differences. The explanatory value of both models is determined by establishing their actual connection with reality. It is concluded that they are to be considered complementary rather than mutually exclusive at all levels of infraspecific, specific, and supraspecific evolution. So, in order to be described comprehensively, the pathways of evolution require at least two distinct models, each based on a discrete range of real phenomena. [Phyletic Gradualism; Punctuated Equilibria; evolutionary theories; divergence models; additive speciation; microevolution; macroevolution; anagenesis.]
The problem with idealization is not just that, when idealizing, scientists ask us to suppose false things. Many people do that. No, the puzzling thing about idealizers—unlike astrologers, spodomancers, and homeopaths—is that it is worth listening to them. Supposing that populations of rabbits are infinite is useful for a variety of ecological explanations. Yet we are not up to our necks in rabbits; the puzzle is why it should be useful to suppose that we are.
In Section I, I criticize the view, implied by the concept of rational economic man, that feelings are inherently opposed to rationality. I attempt to show that emotions or feelings are essential to the proper functioning of reason, rational objectivity, and practical rationality or rational decision making. In addition, I argue that emotions can help to resolve certain ethical dilemmas. In Section II, I consider business writers who criticize business for overemphasizing the head at the expense of feelings or the heart. In Section III, I discuss the connection between material self-interest (as manifested in trade) – a concept of rational economic man – and business virtues.
In an attempt to accommodate natural language phenomena involving nominalization and self-application, various researchers in formal semantics have proposed abandoning the hierarchical type system which Montague inherited from Russell, in favour of more flexible type regimes. We briefly review the main extant proposals, and then develop a new approach, based semantically on Aczel's notion of Frege structure, which implements a version of subsumption polymorphism. Nominalization is achieved by virtue of the fact that the types of predicative and propositional complements are contained in the type of individuals. Russell's paradox is avoided by placing a type-constraint on lambda-abstraction, rather than by restricting comprehension.
This paper studies the semiotic, epistemological and historical aspects of Berzelian formulas in early nineteenth-century organic chemistry. I argue that Berzelian formulas were enormously productive 'paper tools' for representing chemical reactions of organic substances, and for creating different pathways of reactions. Moreover, my analysis of Jean Dumas's application of Berzelian formulas to model the creation of chloral from alcohol and chlorine exemplifies the role played by chemical formulas in conceptual development (the concept of substitution). Studying the dialectic of chemists' collectively shared goals and tools, I argue that paper tools, like laboratory instruments, are resources whose possibilities are not exhausted by scientists' attempts to achieve existing goals, but rather whose applications generate new goals. The term 'paper tools' is introduced to emphasize that the pragmatic and syntactic aspects of symbol systems are fully comparable to physical laboratory tools.
The paper studies various functions of Berzelian formulas in European organic chemistry prior to the mid-nineteenth century from a semiotic, historical and epistemological perspective. I argue that chemists applied Berzelian formulas as productive 'paper tools' for creating a chemical order in the 'jungle' of organic chemistry. Beginning in the late 1820s, chemists applied chemical formulas to build models of the binary constitution of organic compounds in analogy to inorganic compounds. Based on these formula models, they constructed new classifications of organic substances. They further applied Berzelian formulas in a twofold way to experimentally investigate organic chemical reactions: as tools which supplemented laboratory tools and as tools for constructing interpretive models of organic reactions. The scrutiny of chemists' performances with chemical formulas on paper also reveals a dialectic which contributed considerably to the formation of the new experimental culture of synthetic carbon chemistry that emerged between the late 1820s and the early 1840s. In an unintended and unforeseen way, the tools reacted back on the goals of their users and contributed to conceptual development and a shift of scientific objects and practices ('substitution') which transcended the originally intended chemical order.
In the early eighteenth century, chemistry became the main academic locus where, in Francis Bacon's words, Experimenta lucifera were performed alongside Experimenta fructifera and where natural philosophy was coupled with natural history and 'experimental history' in the Baconian and Boyleian sense of an inventory and exploration of the extant operations of the arts and crafts. The Dutch social and political system and the institutional setting of the university of Leiden endorsed this empiricist, utilitarian orientation toward the sciences, which was forcefully propagated by one of the university's most famous representatives in the first half of the eighteenth century, the professor of medicine, botany and chemistry Herman Boerhaave. Recent historical investigations on Boerhaave's chemistry have provided important insights into Boerhaave's religious background, his theoretical and philosophical goals, and his pedagogical agenda. But comparatively little attention has been paid to the chemical experiments presented in Boerhaave's famous chemical textbook, the Elementa chemiae, and to the question of how these experiments relate not only to experimental philosophy but also to experimental history and natural history, and to contemporary utilitarianism. I argue in this essay that Boerhaave shared a strong commitment to Baconian utilitarianism and empiricism with many other European chemists around the middle of the eighteenth century, in particular to what Bacon designated 'experimental history', and I will provide evidence for this claim through a careful analysis of Boerhaave's plant-chemical experiments presented in the Elementa chemiae.