Identification of those who have the potential to become knowledgeable, skilled, and compassionate physicians, and determining how best to prepare them for medical education, has been an ongoing challenge since the mid-1800s (Ludmerer 1985). When medical education was almost exclusively proprietary, the primary consideration for admission was having adequate financial resources. However, in the late 1800s, two men became the driving forces for structuring medical and premedical education in the United States. Daniel Coit Gilman, of Yale and the University of California, later the founding President of Johns Hopkins University, and Charles W. Eliot, President of Harvard, articulated two radical objectives that ...
Two new philosophical problems surrounding the gradation of certainty began to emerge in the 17th century and are still very much alive today. One is concerned with the evaluation of inductive reasoning, whether in science, jurisprudence, or elsewhere; the other with the interpretation of the mathematical calculus of chance. This book, aimed at non-specialists, investigates both problems and the extent to which they are connected. Cohen demonstrates the diversity of logical structures that are available for judgements of probability, and explores the rationale for their appropriateness in different contexts of application. Thus his study deals with the complexity of the underlying philosophical issues without simply cataloging alternative conceptions or espousing a particular "favorite" theory.
Jonathan Cohen's book provides a lucid and penetrating treatment of the fundamental issues of contemporary analytical philosophy. This field now spans a greater variety of topics and divergence of opinion than it did fifty years ago, and Cohen's book addresses the presuppositions implicit in it and the patterns of reasoning on which it relies.
Historians of medicine generally credit the hospital standardization movement of the early 20th century with establishing the record as a sign of hospital and staff quality. The medical record's role had already been the subject of intense interest at the New York Hospital several decades before, however. In the 1880s malpractice and insurance concerns caused the administration to attempt to supervise record creation, quality, and access, over the objections of physicians. Contemporary concerns about the uses of the medical record were in play well before 1910.
In this incisive new book one of Britain's most eminent philosophers explores the often overlooked tension between voluntariness and involuntariness in human cognition. He seeks to counter the widespread tendency for analytic epistemology to be dominated by the concept of belief. Is scientific knowledge properly conceived as being embodied, at its best, in a passive feeling of belief or in an active policy of acceptance? Should a jury's verdict declare what its members involuntarily believe or what they voluntarily accept? And should statements and assertions be presumed to express what their authors believe or what they accept? Does such a distinction between belief and acceptance help to resolve the paradoxes of self-deception and akrasia? Must people be taken to believe everything entailed by what they believe, or merely to accept everything entailed by what they accept? Through a systematic examination of these problems, the author sheds new light on issues of crucial importance in contemporary epistemology, philosophy of mind, and cognitive science.
It can happen that a single surface S, viewed in normal conditions, looks pure blue ("true blue") to observer John but looks blue tinged with green to a second observer, Jane, even though both are normal in the sense that they pass the standard psychophysical tests for color vision. Tye (2006a) finds this situation prima facie puzzling, and then offers two different "solutions" to the puzzle. The first is that at least one observer misrepresents S's color because, though normal in the sense explained, she is not a Normal color observer: her color detection system is not operating in the current condition in the way that Mother Nature intended it to operate. His second solution involves the idea that Mother Nature designed our color detection systems to be reliable with respect to the detection of coarse-grained colors (e.g., blue, green, yellow, orange), but our capacity to represent the fine-grained colors (e.g., true blue, blue tinged with green) is an undesigned spandrel. On this second solution, it is consistent with the variation between John and Jane that both represent the color of S in a way that complies with Mother Nature's intentions: both represent S as exemplifying the coarse-grained color blue, and since (we may assume) S is in fact blue, both represent it veridically. Of course, they also represent fine-grained colors of S, and, according to Tye, at most one of these representations is veridical (Tye says that only God knows which). But at the level of representation for which Mother Nature designed our color detection systems, both John and Jane (qua Normal observers) are reliable detectors.
Tye (2006) presents us with the following scenario: John and Jane are both standard human visual perceivers (according to the Ishihara test or the Farnsworth test, for example) viewing the same surface of Munsell chip 527 in standard conditions of visual observation. The surface of the chip looks "true blue" to John (i.e., it looks blue not tinged with any other colour to John), and blue tinged with green to Jane. Tye then in effect poses a multiple choice question.
The classical analysis of relevance in probabilistic terms does not fit legal, moral or conversational relevance, and, though analysis in terms of a psychological model may fit conversational relevance, it certainly does not fit legal, moral or evidential relevance. It is important to notice here that some sentences are ambiguous between conversational and non-conversational relevance. But, if and only if R is relevant to a question Q, R is a reason, though not necessarily a complete or conclusive reason, for accepting or rejecting something as an answer to Q. Reasons of this kind are governed by appropriate covering laws or principled probabilities, and a number of questions thus arise about the relationship between relevance and certain formal-logical properties.
This article defends the principle of non-establishment against 21st-century projects of political religion, constitutional theocracy and political theology. It is divided into two parts, which will appear in two consecutive issues of Philosophy & Social Criticism, 39(4–5) and 39(6). Part 1 proceeds by constructing an ideal type of political secularism, and then discussing the innovative American model of constitutional dualism regarding religion that combined constitutional protection for the freedom of religious conscience and exercise with the principle of non-establishment. The article analyses the strengths and limits of the 'separation–accommodation' frame that became hegemonic in 1st amendment jurisprudence from the 1940s to the 1990s. It challenges the standard caricature of the American model as strictly separationist and privatizing. It then critically assesses two contemporary alternatives to that frame: the integrationist approach and the equal liberty approach. The first, disguised as a concern for pluralism and fairness, challenges 'separation' and political secularism in a subtle attack on the non-establishment principle, aimed at drastically narrowing its scope. Successes of this approach in recent Supreme Court jurisprudence and politics have triggered a response by liberal egalitarians. The author addresses this response – the equal liberty model – in part 2, which will appear in Philosophy & Social Criticism 39(6), arguing that although on the right track, it fails to find a middle ground between political secularism and integration.
Geometry and Semantics: An Examination of Putnam's Philosophy of Geometry. There are many ways to shed light on how and why our conception of geometry changed during the last two centuries. One fruitful strategy is to relate those ...
Nevin & Grace's behavioral-momentum model accommodates a large body of data. This commentary highlights some experimental findings that the model does not always predict. The model does not consistently predict resistance to change when response-independent food is delivered simultaneously with response-contingent food, when drugs are used as response disrupters, and when responding is reinforced under single rather than multiple schedules of reinforcement.