We present an account of processing capacity in the ACT-R theory. At the symbolic level, the number of chunks in the current goal provides a measure of relational complexity. At the subsymbolic level, limits on spreading activation, measured by the attentional parameter W, provide a theory of processing capacity, which has been applied to performance, learning, and individual differences data.
In the context of psychiatric diagnosis, operationists claim that mental disorders are nothing more than the satisfying of objective diagnostic criteria, whereas realists claim that mental disorders are latent entities that are detected by applying those criteria. The implications of this distinction are substantial in actual clinical situations, such as in the co-occurrence of disorders that may interfere with one another's detection, or when patients falsify their symptoms. Realist and operationist conceptions of diagnosis may lead to different clinical decisions in these situations, affecting treatment efficacy and ultimate patient outcomes.
How do therapists learn to manage sexual feelings in the therapeutic relationship in an ethical, responsible manner? Data from 293 university-based psychotherapists show that the minority who report that their training prepared them to do so "very well" were more likely to have received "content-specific" training related to the topic or an opportunity to explore themselves as sexual beings, or both. In addition, they had experience with supervisors who modeled the belief that sexual feelings are a normal, expected part of any human relationship and must be anticipated and planned for by therapists.
We report on the development and initial validation of the Moralization of Everyday Life Scale (MELS), designed to measure variations in people's assignment of moral weight to commonplace behaviors. In Study 1, participants reported their judgments for a large number of potential moral infractions in everyday life; principal components analysis revealed 6 main dimensions of these judgments. In Study 2, scores on the 30-item MELS showed high reliability and distinctness from the Big 5 personality traits. In Study 3, scores on the MELS were strongly correlated with scores on an earlier scale of moral judgments, suggesting convergent validity.
Sketching is a powerful means of working out and communicating ideas. Sketch understanding involves a combination of visual, spatial, and conceptual knowledge and reasoning, which makes it both challenging to model and potentially illuminating for cognitive science. This paper describes CogSketch, an ongoing effort of the NSF-funded Spatial Intelligence and Learning Center, which is being developed both as a research instrument for cognitive science and as a platform for sketch-based educational software. We describe the idea of open-domain sketch understanding and the scientific hypotheses underlying CogSketch, and provide an overview of the models it employs, illustrated by simulation studies and ongoing experiments in creating sketch-based educational software.
In all societies, past and present, many persons and groups have been subject to domination. Properly understood, domination is a great evil, the suffering of which ought to be minimized so far as possible. Surprisingly, however, political and social theorists have failed to provide a detailed analysis of the concept of domination in general. This study aims to fill this lacuna. It argues first, that domination should be understood as a condition experienced by persons or groups to the extent that they are dependent on a social relationship in which some other person or group wields arbitrary power over them; this is termed the 'arbitrary power conception' of domination. It argues second, that we should regard it as wrong to perpetrate or permit unnecessary domination and, thus, that as a matter of justice the political and social institutions and practices of any society should be organized so as to minimize avoidable domination; this is termed 'justice as minimizing domination', a conception of social justice that connects with more familiar civic republican accounts of freedom as non-domination. In developing these arguments, this study employs a variety of methodological techniques - including conceptual analysis, formal modelling, social theory, and moral philosophy; existing accounts of dependency, power, social convention, and so on are clarified, expanded, or revised along the way. While of special interest to contemporary civic republicans, this study should appeal to a broad audience with diverse methodological and substantive interests.
Detecting that two images are different is faster for highly dissimilar images than for highly similar images. Paradoxically, we showed that the reverse occurs when people are asked to describe how two images differ—that is, to state a difference between two images. Following structure-mapping theory, we propose that this dissociation arises from the multistage nature of the comparison process. Detecting that two images are different can be done in the initial (local-matching) stage, but only for pairs with low overlap; thus, "different" responses are faster for low-similarity than for high-similarity pairs. In contrast, identifying a specific difference generally requires a full structural alignment of the two images, and this alignment process is faster for high-similarity pairs. We describe four experiments that demonstrate this dissociation and show that the results can be simulated using the Structure-Mapping Engine. These results pose a significant challenge for nonstructural accounts of similarity comparison and suggest that structural alignment processes play a significant role in visual comparison.
In his paper, "The Rights of the Guilty," Corey Brettschneider aims to develop and defend a theory of punishment within the framework of a liberal-contractarian conception of political legitimacy. My response argues that this attempt to extend the liberal-contractarian theory reveals, in a particularly clear and striking manner, deep and ultimately insurmountable conceptual difficulties for that theory.
When should burdened social practices be granted special accommodation? One issue of concern—raised by Okin and others—is that some social practices involve domination, and so the accommodation of those practices might (inadvertently, perhaps) support social injustice. Suppose one wants to take this concern very seriously. Starting from the assumption that freedom from domination is an especially important value, this article examines whether cultural accommodation would ever be advisable. Approaching the problem of multicultural accommodation from this point of view greatly clarifies the debate and yields some interesting results. In particular, the discussion concludes that there are circumstances under which the goal of minimizing domination itself would be furthered by policies of special accommodation.
The review argues that Lovett’s theory of domination suffers from a problem. Lovett is aware of the problem and bites a fairly large bullet in response to it. What he does not seem aware of is that the problem can be avoided by opting for an account of welfare that he unfortunately ignores, despite the fact that it would serve his purposes well.
This article comprises a dialogue between two historians who have attempted, individually, to narrate the life of Lord George Gordon (1751–93), the Scottish prophet, revolutionary, and convert to Judaism. For modern cultural historians, Gordon's peregrinations between identities offer a kaleidoscopic view of Britain in the overlooked but crucial interstice between the upheavals of 1776 and 1789. Yet the partial nature of the evidence, the long omission of Gordon from the historiography of eighteenth-century Britain, and the complex, often furtive nature of Gordon's activism create multiple ambiguities in his story. These ambiguities are compounded here by the authors' differing approaches. Marsha Keith Schuchard argues for a Gordon shaped by Scottish origins; Dominic Green, for a Gordon responding to English opportunities. They disagree over the likely date of Gordon's conversion to Judaism and, crucially, over whether he was a religious atavist or a Romantic pioneer. This dialogue is meant to illustrate the utility of a scholarship that acknowledges fuzziness rather than attempting to overclarify it. The article is also meant to show, however, that on the public stage fuzziness can be less benign: Gordon was a religious politician, who reworked his complexities and confusions into a violent, uncompromising critique of eighteenth-century British social order.
Discharge planning for vulnerable infants and children is a collaborative, inter-disciplinary, decision-making activity that is grounded in the ethical complexities of clinical practice. Although it is a psychosocial intervention that frequently causes moral distress for professionals and has the potential to inflict harm on children and their families, the process has received little attention from ethicists. An ongoing study of the transition of technology-dependent children from hospital to home suggests that the ethical issues embedded in the discharge-planning process may be concealed by dominant cultural values, institutional policies, clinical standards, historical precedents, and legal regulations.
The issues of double-counting, use-constructing, and selection effects have long been the subject of debate in the philosophical as well as statistical literature. I have argued that it is the severity, stringency, or probativeness of the test—or lack of it—that should determine if a double-use of data is admissible. Hitchcock and Sober question whether this 'severity criterion' can perform its intended job. I argue that their criticisms stem from a flawed interpretation of the severity criterion. Taking their criticism as a springboard, I elucidate some of the central examples that have long been controversial, and clarify how the severity criterion is properly applied to them.
1 Severity and Use-Constructing: Four Points (and Some Clarificatory Notes)
1.1 Point 1: Getting beyond all or nothing standpoints
1.2 Point 2: The rationale for prohibiting double-counting is the requirement that tests be severe
1.3 Point 3: Evaluate severity of a test T by its associated construction rule R
1.4 Point 4: The ease of passing vs. ease of erroneous passing: Statistical vs. definitional probability
2 The False Dilemma: Hitchcock and Sober
2.1 Marsha measures her desk reliably
2.2 A false dilemma
3 Canonical Errors of Inference
3.1 How construction rules may alter the error-probing performance of tests
3.2 Rules for accounting for anomalies
3.3 Hunting for statistically significant differences
Concluding Remarks
This essay provides an explanation and interpretation of the undertreatment of pain by discussing some of the scientific, clinical, cultural, and philosophical aspects of this problem. One reason why pain continues to be a problem for medicine is that pain does not conform to the scientific approach to health and disease, a philosophy adopted by most health care professionals. Pain does not fit this philosophical perspective because (1) pain is subjective, not objective; (2) the causal basis of pain is often poorly understood; (3) pain is often regarded as a mere symptom, not as a disease; (4) there often are no magic bullets for pain; (5) pain does not fit the expert knowledge model. In order for health care professionals to do a better job of treating pain, some changes need to occur in medical philosophy, education, and practice.
This study explored the relationship of current incidences of academic dishonesty with future norm/rule-violating behavior. Data were collected from 154 college students enrolled in introductory and upper-level psychology courses at a large Midwestern public university who received credit for participating. The sample included students from many different majors and all years of study. Participants completed a self-report survey that included a measure of Academic Dishonesty (including three subscales: Self-Dishonest, Social Falsifying, and Plagiarism) and an Imagined Futures Scale (five subscales that included Norm/Rule Violating, Physically Threatening, Culturally Diverse, Emotionally Distressing, and Agentic Futures). Correlation analyses indicated a significant positive relationship between all three Academic Dishonesty subscales and an imagined norm/rule-violating future. Further, regression analyses revealed social falsifying as being significantly predictive of a norm/rule-violating future. Suggestions are made alerting educators to the importance of monitoring and discouraging academic dishonesty as it may lead to rule-violating behavior in the future.
In this pioneering book, noted international scholars explore the limits and definitions of knowing, thinking, and communicating meaning as we move into the 21st century. Coming from disciplines as diverse as anthropology, philosophy, literature, aesthetics, and art practice, together they work towards reconceiving the boundaries between entrenched domains of knowledge to great effect.
The data in this special issue are both encouraging and discouraging. On the positive side, researchers are making theoretical breakthroughs into the psychology of the academic cheater, which may result in practical interventions. Yet the studies illustrate the sheer magnitude of the problem and the resources needed to address unethical behavior among the younger members of the American academe. In short, this special issue shows that the "Internet revolution" facilitates new types of academic dishonesty (Sisti, this issue; Stephens, Young, & Calabrese, this issue); that academic cheating is often an intentional, planned act that results from a Machiavellian tendency to neutralize moral sanctions against cheating (Harding, Mayhew, Finelli, & Carpenter, this issue); that motivations to cheat differ across students (Davy, Kincaid, Smith, & Trawick, this issue; Wowra, this issue); and that academic cheating is a symptom of a larger problem (Lovett-Hooper, Komarraju, Weston, & Dollinger, this issue; Wowra, this issue).
Several standard conditions of adequacy for confirmation are considered and a conclusion of B. Skyrms regarding the converse-consequence condition is shown to be mistaken. Widely accepted conditions such as the entailment condition and the special consequence condition are shown to be open to counterexample, and confusion about these conditions is traced to confusion about the difference between two kinds of confirmation concepts--concepts of firmness and concepts of increase in firmness. The importance of concepts of the latter sort is stressed. Finally, some suggestions are offered relating to the notion of selective confirmation.
In 1853, the young Thomas Henry Huxley published a long review of German cell theory in which he roundly criticized the basic tenets of the Schleiden-Schwann model of the cell. Although historians of cytology have dismissed Huxley's criticism as based on an erroneous interpretation of cell physiology, the review is better understood as a contribution to embryology. "The Cell-theory" presents Huxley's "epigenetic" interpretation of histological organization emerging from changes in the protoplasm to replace the "preformationist" cell theory of Schleiden and Schwann (as modified by Albert von Kölliker), which posited the nucleus as the seat of organic vitality. Huxley's views influenced a number of British biologists, who continued to oppose German cell theory well into the twentieth century. Yet Huxley was pivotal in introducing the new German program of "scientific zoology" to Britain in the early 1850s, championing its empiricist methodology as a means to enact broad disciplinary and institutional reforms in British natural history.
The design of Web browsers has resulted in a transfer of power to Web users and developers who often lack an ethical framework in which to act. For example, the technology makes it simple to copy and use other people’s Web page formatting without their permission. The author argues that we need to educate more people about ethical Web practices, and the author asks for “rules of the road” which amateurs and professionals can understand and follow. This article discusses four areas of concern about Web development: the browser wars, information storage and retrieval, access for the handicapped, and cookies. For teachers, there are suggestions on how to use browsers to help students learn about Web ethics.
Muriel Wheldale, a distinguished graduate of Newnham College, Cambridge, was a member of William Bateson's school of genetics at Cambridge University from 1903. Her investigation of flower color inheritance in snapdragons (Antirrhinum), a topic of particular interest to botanists, contributed to establishing Mendelism as a powerful new tool in studying heredity. Her understanding of the genetics of pigment formation led her to do cutting-edge work in biochemistry, culminating in the publication of her landmark work, The Anthocyanin Pigments of Plants (1916). In 1915, she joined Frederick Gowland Hopkins's Department of Biochemistry as assistant and in 1926 became one of the first women to be appointed university lecturer. In 1919 she married the biochemist Huia Onslow, with whom she collaborated until his death in 1922. This paper examines Wheldale's work in genetics and especially focuses on the early linkage of Mendelian methodology with new techniques in biochemistry that eventually led to the founding of biochemical genetics. It highlights significant issues in the early history of women in genetics, including the critical role of mentors, funding opportunities, and career strategies.
With historical hindsight, it can be little questioned that the view of protozoa as unicellular organisms was important for the development of the discipline of protozoology. In the early years of this century, the assumption of unicellularity provided a sound justification for the study of protists: it linked them to the metazoa and supported the claim that the study of these “simple” unicellular organisms could shed light on the organization of the metazoan cell. This prospect was significant, given the state of cytology circa 1910. In the wake of the major gains made in understanding nuclear division in the last two decades of the nineteenth century, cytology was suddenly confronted with many, seemingly less penetrable, problems. Several aspects of nuclear organization still remained unexplained, and recent research had revealed the presence in the cytoplasm of structures whose functions in cell life were unknown. Classical methods of cytology, relying on descriptive, morphological analysis, seemed ill equipped to resolve these questions.
Hertwig's program for protozoology, grounded in the assumption of the fundamental unity of organization in protozoa and metazoa, offered a potential means for investigating these and other problems of cell theory. Linked to mainstream cytology, protozoology was advocated as a means of experimentally investigating key biological processes — reproduction, metabolism, and organelle morphology and physiology — less accessible in higher organisms. Protozoa were hailed as prime experimental organisms, in which cell structure and function could be more easily studied. Unlike the metazoan cell, they could be subjected to controlled experiments in which the external environment was modified and the effects monitored. Protozoa offered, in other words, a promising experimental means by which to investigate the cell — its structures and its processes.
The success of this program within Germany was soon apparent.
In contrast to the rather neglected state of the discipline in 1900, protozoology began to be recognized as more than a somewhat obscure area of study for specialists. In practical terms, this translated into greater numbers of students attracted to the field, increased institutional support for the discipline, and its elevation in status within the biological sciences as new developments, particularly in connection with medical applications, began to draw attention to the field.
The unicellular hypothesis also promoted the internal development of the science. It provided a rationale for introducing the various techniques used in cytology, embryology, physiology, and the new field of biochemistry as suitable research tools for protozoology as well. This vastly extended the research possibilities and facilitated the understanding of, among other things, protozoan organization, modes of reproduction, and evolutionary relationships. The new experimental grounding of the discipline in turn fitted in well with developments in biology at large: in contrast to the descriptive methodology that had predominated in the nineteenth century, this new experimental program placed protozoology at the forefront of the early twentieth-century movement to make biology an experimental science comparable to physics and chemistry.
Yet the criticism of the unicellular hypothesis can also be seen as having served a valuable function within the development of the discipline. It focused attention on the study of protists as organisms in their own right, not simply as models for metazoan cells. More generally, it helped to remind biologists of the particular evolutionary assumptions that supported this conception. Dobell and other British critics pointed out, among other things, the association between the theory of recapitulation and the interpretation of protozoa as unicellular organisms.
At the time when the recapitulation theory and the germ-layer doctrine were in decline, it was important to stress how these evolutionary ideas also entered into the contemporary concepts of subsidiary specialties. This was as true for protozoology as it was for cell theory itself. The chromidial theory, in its various guises, and the binuclearity hypothesis did in fact contain elements of recapitulationist reasoning, and they were open to criticism for the same kind of overly speculative theorizing that characterized this evolution theory. Dobell's critique forced protozoologists and cell theorists alike to review the theoretical postulates guiding their investigations.
In the absence of further historical studies, it is hard to evaluate the consequences of this debate in later years. The issue was not whether protozoa were the precursors of metazoa — both sides accepted this. They disagreed over which particular protozoon had served as the ancestral form, and this, in turn, influenced their stance on the question of unicellularity. The situation is little changed today. Because the former question is still an open one, the latter remains so too. Both of the models for the origin of multicellular organisms — colonies of ciliates versus multinucleate protists — are still presented as possible mechanisms in modern textbooks of evolution. Unable to judge the dispute in terms of the ultimate validity of the competing conceptions, the historian requires other criteria. It perhaps becomes more important to evaluate the issues in the context of the internal and external stimulus they provided the discipline. In these terms, and from the present historical vantage point, Hertwig's research program for protozoology, based upon the unicellular hypothesis, appears to have been a successful one.
Research studies demonstrate wide variation in how physicians diagnose and treat patients with similar medical conditions and suggest that at least some of the variation reflects inefficiencies and unnecessary medical costs. Health care researchers are actively examining ways to reduce variations in practice through standardization of medicine to reduce the cost of treatment and ensure the quality of outcomes. The most widely accepted form of this standardization is Evidence Based Best Practices (EBBP). Furthermore, financial health care providers such as hospitals and managed care organizations are investigating methods to tie resource usage to medical protocols in their efforts to monitor and control health care costs. Such proposals are contentious because they report on physicians’ medical practice behaviors (such as the number of tests ordered, use of specific therapies, etc.) and such reports could potentially be used to influence their clinical behaviors. The intent of this exploratory study was to examine physicians’ perceptions about linking a standard costing system to EBBP guidelines. The authors interviewed nine practicing physicians, asking each physician to respond to the question, ‘As a physician working in a hospital environment, what are your reactions to and concerns with combining standard costing techniques with EBBP?’ The interviews were in-depth and free form in nature. The physicians’ responses were recorded and analyzed using Grounded Theory Methodology. Using this methodology, the field data were categorized into two major themes. The most important theme centered on ethics, and the second theme was concerned with the implementation and use of a standard cost system in regard to EBBP. If physicians’ worries about ethical dilemmas and implementation issues are not resolved, then it is likely that doctors would be unwilling to participate in any efforts to develop or use a standard cost-reporting system in medicine.
While this study was exploratory in nature, it should provide future guidance to accountants, health care researchers and health care providers about physicians’ issues with the use of standard costing methods in medicine.