In the face of the multiple controversies surrounding the DSM process in general and the development of DSM-5 in particular, we have organized a discussion around what we consider six essential questions in further work on the DSM. The six questions involve: 1) the nature of a mental disorder; 2) the definition of mental disorder; 3) the issue of whether, in the current state of psychiatric science, DSM-5 should assume a cautious, conservative posture or an assertive, transformative posture; 4) the role of pragmatic considerations in the construction of DSM-5; 5) the issue of utility of the DSM – whether DSM-III and IV have been designed more for clinicians or researchers, and how this conflict should be dealt with in the new manual; and 6) the possibility and advisability, given all the problems with DSM-III and IV, of designing a different diagnostic system. Part 1 of this article took up the first two questions. Part 2 took up the second two questions. Part 3 now deals with Questions 5 & 6. Question 5 confronts the issue of utility: whether the manual design of DSM-III and IV favors clinicians or researchers, and what that means for DSM-5. Our final question, Question 6, takes up a concluding issue: whether the acknowledged problems with the earlier DSMs warrant a significant overhaul of DSM-5 and future manuals. As in Parts 1 & 2 of this article, the general introduction, as well as the introductions and conclusions for the specific questions, are written by James Phillips, and the responses to commentaries are written by Allen Frances.
In the face of the multiple controversies surrounding the DSM process in general and the development of DSM-5 in particular, we have organized a discussion around what we consider six essential questions in further work on the DSM. The six questions involve: 1) the nature of a mental disorder; 2) the definition of mental disorder; 3) the issue of whether, in the current state of psychiatric science, DSM-5 should assume a cautious, conservative posture or an assertive, transformative posture; 4) the role of pragmatic considerations in the construction of DSM-5; 5) the issue of utility of the DSM – whether DSM-III and IV have been designed more for clinicians or researchers, and how this conflict should be dealt with in the new manual; and 6) the possibility and advisability, given all the problems with DSM-III and IV, of designing a different diagnostic system. Part I of this article will take up the first two questions. With the first question, invited commentators express a range of opinion regarding the nature of psychiatric disorders, loosely divided into a realist position that the diagnostic categories represent real diseases that we can accurately name and know with our perceptual abilities, a middle, nominalist position that psychiatric disorders do exist in the real world but that our diagnostic categories are constructs that may or may not accurately represent the disorders out there, and finally a purely constructivist position that the diagnostic categories are simply constructs with no evidence of psychiatric disorders in the real world. The second question again offers a range of opinion as to how we should define a mental or psychiatric disorder, including the possibility that we should not try to formulate a definition. The general introduction, as well as the introductions and conclusions for the specific questions, are written by James Phillips, and the responses to commentaries are written by Allen Frances.
In the face of the multiple controversies surrounding the DSM process in general and the development of DSM-5 in particular, we have organized a discussion around what we consider six essential questions in further work on the DSM. The six questions involve: 1) the nature of a mental disorder; 2) the definition of mental disorder; 3) the issue of whether, in the current state of psychiatric science, DSM-5 should assume a cautious, conservative posture or an assertive, transformative posture; 4) the role of pragmatic considerations in the construction of DSM-5; 5) the issue of utility of the DSM – whether DSM-III and IV have been designed more for clinicians or researchers, and how this conflict should be dealt with in the new manual; and 6) the possibility and advisability, given all the problems with DSM-III and IV, of designing a different diagnostic system. Part I of this article took up the first two questions. Part II will take up the second two questions. Question 3 asks whether DSM-5 should assume a conservative or assertive posture in making changes from DSM-IV. That question in turn breaks down into discussion of diagnoses that depend on, and aim toward, empirical, scientific validation, and diagnoses that are more value-laden and less amenable to scientific validation. Question 4 takes up the role of pragmatic considerations in a psychiatric nosology – whether the purely empirical considerations need to be tempered by considerations of practical consequence. As in Part 1 of this article, the general introduction, as well as the introductions and conclusions for the specific questions, are written by James Phillips, and the responses to commentaries are written by Allen Frances.
In the conclusion to this multi-part article I first review the discussions carried out around the six essential questions in psychiatric diagnosis – the position taken by Allen Frances on each question, the commentaries on the respective question along with Frances’ responses to the commentaries, and my own view of the multiple discussions. In this review I emphasize that the core question is the first – what is the nature of psychiatric illness – and that in some manner all further questions follow from the first. Following this review I attempt to move the discussion forward, addressing the first question from the perspectives of natural kind analysis and complexity analysis. This reflection leads toward a view of psychiatric disorders – and future nosologies – as far more complex and uncertain than we have imagined.
We do not dispute the findings of Ceci et al.'s study, though they are based on survey research, which does not always reflect real-life experiences. We report on cases we have defended on the basis of the tenure system, few of which mirror the situations reported in the target article. We end with a strong defense of the tenure system in the modern university. (Published Online February 8, 2007).
The emergence of the German Jewish philosopher Ernst Cassirer as the object of scholarly attention has been both surprising and rapid. In the decades since his early death while in exile in the United States, Cassirer never fell into complete oblivion. His works remained known to specialists in German intellectual history; his participation in a famous 1929 debate with Martin Heidegger in Davos, Switzerland, one of the most iconic moments in modern Continental thought, made his name familiar to most students of modern philosophy. Yet Cassirer lacked the widespread recognition given to contemporaries such as Heidegger or Walter Benjamin, and his work never became the center of historical or philosophical study. This neglect stemmed, in part, from dismissal by his peers; as Edward Skidelsky explains in his new study, Rudolf Carnap found him “rather pastoral,” Isaiah Berlin dismissed him as “serenely innocent,” and Theodor Adorno thought he was “totally gaga”. The last few years, however, have seen the rise of a remarkable new interest in Cassirer in both Germany and the English-speaking world. Among this recent literature, Edward Skidelsky's and Peter Gordon's works lead the small “Cassirer renaissance” and offer the best English-language introduction to his thought. Both Gordon and Skidelsky ambitiously seek to relocate Cassirer at the forefront of modern German and European thought. Gordon goes as far as to call him “one of the greatest philosophers and intellectual historians to emerge from the cultural ferment of modern Germany” and one of the most important thinkers of the twentieth century. In making such bold statements, Gordon and Skidelsky clearly set their sights beyond the person himself; they aspire to highlight a central strand of thought that enjoyed a powerful presence in early twentieth-century Germany but fell into neglect in the postwar era. In doing so, they seek to reevaluate the nature and legacy of Weimar thought, its complex relationship with the period's unstable politics, and its relevance today.
This paper deals with the exceptions-tolerance property of generic sentences with indefinite singular and bare plural subjects (IS and BP generics, respectively) and with the way this property is connected to some well-known observations about felicity differences between the two types of generics (e.g. Lawler's (1973) Madrigals are popular vs. #A madrigal is popular). I show that whereas both IS and BP generics tolerate exceptional and contextually irrelevant individuals and situations in a strikingly similar way, which indicates the existence of a basically equivalent tolerance mechanism, there is also a difference between them, unnoticed so far, which concerns the degree to which the properties of the legitimate exceptions can be characterized in advance. Following claims in Greenberg (2003), I argue that both this newly observed difference and the traditional felicity differences result from an underlying contrast in the type of ‘non-accidentalness’ expressed by the two types of generic sentences, and more formally, in the accessibility relations that their generic quantifier (Gen) is compatible with. To capture the new difference in tolerance of exceptions, I develop an improved version of the exceptions-tolerance mechanism for generic sentences suggested in Kadmon & Landman (1993), namely, a restriction on the set of individuals and situations quantified by Gen, which is partially vague to two different degrees using supervaluationist methods. The different degrees of vagueness in this restriction are shown to be systematically dependent on the two types of accessibility relations that IS and BP generics are compatible with, which are redefined as precise and vague restrictions on the generic quantification over worlds.
The prevailing interpretation of Kant’s First Critique in Anglo-American philosophy views his theory of a priori knowledge as basically a theory about the possibility of empirical knowledge, or the a priori conditions for that possibility. Instead, Robert Greenberg argues that Kant is more fundamentally concerned with the possibility of a priori knowledge – the very possibility of the possibility of empirical knowledge in the first place. Greenberg advances four central theses: the Critique is primarily concerned with the possibility, or relation to objects, of a priori, not empirical, knowledge, and Kant’s theory of that possibility is defensible; Kant’s transcendental ontology must be distinct from the conditions of the possibility of a priori knowledge; the functions of judgment, in Kant’s discussion of the Table of Judgments, should be seen according to his transcendental logic as having content, not as being just logical forms of judgment making; Kant’s distinction between and connection of ordering relations and reference relations have to be kept in mind to avoid misunderstanding the Critique. At every step of the way Greenberg contrasts his view with the major interpretations of Kant by commentators like Henry Allison, Jonathan Bennett, Paul Guyer, and Peter Strawson. Not only does this new approach to Kant present a strong challenge to these dominant interpretations, but by being more true to Kant’s own intent it holds promise for making better sense out of what have been seen as the First Critique’s discordant themes.
At present liberal education is generally understood and justified as the acquisition of critical thinking skills and individual autonomy. Traditionally, however, the ultimate purpose of liberal education has been leisure. Freedom, it was thought, was not simply the result of critical thinking but also required the cultivation of leisure that involved a vigilant receptivity — a stillness from the busy world of work and the restive probing of a discursive mind. In this essay, Kevin Gary argues that the cultivation of leisure has been and ought to be an essential part of what constitutes a liberal education. Focused on interior freedom, leisure offers a valuable way of learning that ushers in an authentic freedom that a critical approach to learning and liberal education does not. Accordingly, it offers a valuable defense against the hegemonic world of work that defines and appraises one’s value exclusively in terms of one’s doing.
Thanks to his unsurpassed eye and his fearless willingness to take a stand, Clement Greenberg (1909–1994) became one of the giants of 20th-century art criticism, a writer who set the terms of critical discourse from the moment he burst onto the scene with his seminal essays "Avant-Garde and Kitsch" (1939) and "Towards a Newer Laocoon" (1940). In this work, which gathers previously uncollected essays and a series of seminars delivered at Bennington in 1971, Greenberg provides his most expansive statement of his views on taste and quality in art, arguing for an esthetic that flies in the face of current art world fashions. Greenberg insists, despite the attempts from Marcel Duchamp onwards to escape the jurisdiction of taste by producing an art so disjunctive that it cannot be judged, that taste is inexorable. He argues that standards of quality in art, the artist's responsibility to seek out the hardest demands of a medium, and the critic's responsibility to discriminate are essential conditions for great art. The obsession with innovation, the epidemic of newness, leads, in Greenberg's view, to the boringness of so much avant-garde art. He discusses the interplay of expectation and surprise in aesthetic experience, and the exalted consciousness produced by great art. Homemade Esthetics allows us, particularly in the transcribed seminar sessions, never before published, to watch the critic's mind at work, defending (and at times reconsidering) his theories. His views, often controversial, are the record of a lifetime of looking at and thinking about art as intensely as anyone ever has.
Sean Greenberg (University of California, Irvine) reviews Deborah J. Brown, Descartes and the Passionate Mind (Cambridge–New York: Cambridge University Press, 2006. Pp. xi + 231. Cloth, $85.00), in the Journal of the History of Philosophy 45.3: 499–500. In the past two decades, Descartes's last work, The Passions of the Soul, has received considerable attention from Descartes scholars. In the first English-language monograph on the Passions, Deborah Brown mounts a case for the work's philosophical significance. Brown takes Descartes's treatment of the passions to extend the discussion of the...
In “How Facts Make Law” (Greenberg 2004), I argue that non-normative contingent facts are not sufficient to determine the content of the law. In the present paper, I take up a challenge raised by Enrique Villanueva (2005). He suggests that, to put it very briefly, descriptive facts can be reasons of the relevant kind. Therefore, even if the content of the law depends on reasons, it does not follow that law practices cannot themselves determine the content of the law. Villanueva proposes a value-neutral criterion – textualism. In other words, he suggests that the descriptive facts about the meaning of legal texts are themselves reasons that determine the contribution of law practices to the content of the law. This suggestion depends on too shallow a conception of the requirement of reasons. For the law to be rationally determined, it is not enough that there be some value-neutral criterion that specifies that law practices have certain consequences for the content of the law. There have to be reasons that explain why that criterion, as opposed to all others, is the legally correct one – the one that, in the relevant legal system, determines the contribution of law practices to the content of the law. Normative facts are the best candidates for such reasons. And, in fact, Villanueva’s textualist criterion derives its appeal from normative facts. Reasons play a central role in the ontology of law. The determinants of the content of the law, which include law-determining practices such as statutes and judicial decisions, influence the content of the law in a systematic way. But their influence on the content of the law cannot be brute: the determining facts must constitute reasons why particular legal facts obtain. Descriptive facts cannot themselves provide the necessary reasons: for any descriptive fact that is a candidate reason, there are many possible models of its significance for the legal facts. Given the descriptive facts alone, it is arbitrary which of the possible models is correct, and therefore what the legal facts are. Descriptive facts therefore cannot alone determine the content of the law. Normative facts are the best candidates for what needs to be added to the law practices in order for the determining facts to make rationally intelligible why particular legal facts, as opposed to others, obtain.
Here, Greenberg excavates the skeletons of some of our most iconoclastic buildings, spurring on a continued engagement with those intentionally (World Trade Center) and accidentally (Charles de Gaulle Airport Terminal) destroyed that ...
Each year since 1983 the American Council of Learned Societies has invited one of America's leading scholars to deliver the Haskins Lecture, in honor of Charles Homer Haskins, a distinguished scholar and teacher who was instrumental in the founding of the ACLS. In this volume, which commemorates the 75th anniversary of the ACLS, Douglas Greenberg and Stanley Katz bring together the lectures presented by ten of America's most distinguished scholars. Each lecture is a personal and intellectual glimpse into the "life of learning" of such leading scholars as Maynard Mack, Annemarie Schimmel, and John Hope Franklin. The lectures focus on self-reflection of lives dedicated to learning, rather than on scholarship in the usual sense of the term. Ranging from being forced to learn Latin to painful memories of war and racism, the lecturers all recount stories from their eventful lives. Each offers thoughts on the body of work he or she has produced and the forces, personal and intellectual, that have shaped it. The scholars bring something of their disciplines to the lecture, sharing not only personal anecdotes but their love of learning.
I offer a new argument against the legal positivist view that non-normative social facts can themselves determine the content of the law. I argue that the nature of the determination relation in law is rational determination: the contribution of law-determining practices to the content of the law must be based on reasons. That is why it must be possible in principle to explain what makes the law have the content that it does. It follows, I argue, that non-normative facts about statutes, judicial decisions, and other practices cannot themselves determine the content of the law. A full account must appeal to considerations independent of the practices that determine the relevance of the practices to the content of the law. Normative facts are the best candidates.
In this paper, I propose a new way of understanding the space of possibilities in the field of mental content. The resulting map assigns separate locations to theories of content that have generally been lumped together on the more traditional map. Conversely, it clusters together some theories of content that have typically been regarded as occupying opposite poles. I make my points concrete by developing a taxonomy of theories of mental content, but the main points of the paper concern not merely how to classify, but how to understand, the theories. Also, though the paper takes theories of mental content as a case study, much of the discussion is applicable to theories of other phenomena. To a first approximation, the difference between the traditional and the proposed taxonomies turns on whether we classify theories of content by, on the one hand, their implications for a non-redundant supervenience base for content facts (i.e., for facts about what contents thoughts have) or, on the other, by their constitutive accounts of content. By a "constitutive account," I mean the kind of elucidation of the nature of a phenomenon that theorists have tried to give for, for example, knowledge, justice, personal identity, consciousness, convention, heat, and limit. The tendency to taxonomize by supervenience base is encouraged, I suggest, by a failure to keep clearly in view a distinction between constitutive and modal determination. Many philosophers would accept that a constitutive account cannot be captured in purely modal terms. Giving a constitutive account is not the same as specifying modally necessary and sufficient conditions. Nevertheless, philosophers often try to cash constitutive claims in modal terms. A case in point is that theories of content tend to be conceptualized in terms of the theories' implications for a supervenience base for content facts. My thesis goes beyond the by-now somewhat familiar proposition that not all modal determinants of a phenomenon are constitutive determinants. One who has taken that point on board might nevertheless conceive of a philosophical account as an attempt to specify constitutive determinants of the target phenomenon that make up a non-redundant supervenience base for the phenomenon. Shoehorning a philosophical account into this form leaves out elements that are modally redundant, but may be explanatorily or ontologically significant. For example, when a constitutive account has multiple levels, the different levels will typically be modally redundant. Formulating the account as a specification of a supervenience base of constitutive determinants will therefore flatten the account into a single level. Many of my arguments can be illustrated by considering the place of normativity in the theory of content. The new taxonomy gives a distinct niche to normative theories of content - theories that explain a thought's having a certain content at least in part in terms of the obtaining of normative facts. By contrast, on a traditional map, normative theories are invisible as such because normative facts supervene on non-normative ones.
In this paper, I argue that there is a picture of how law works that most legal theorists are implicitly committed to and take to be common ground. This Standard Picture (SP, for short) is generally unacknowledged and unargued for. SP leads to a characteristic set of concerns and problems and yields a distinctive way of thinking about how law is supposed to operate. I suggest that the issue of whether SP is correct is a fundamental one for the philosophy of law, more basic, for example, than the issue that divides legal positivists and anti-positivists, at least as the latter issue is ordinarily understood. The goals of the paper are fourfold: 1) to identify and articulate in some detail the Standard Picture; 2) to show that SP is widely held and has important consequences for other debates in the philosophy of law; 3) to show that SP leads to a serious theoretical problem; 4) to sketch an alternative picture that promises to avoid this problem. I emphasize the modesty of these goals in one respect. I make no claim to refute SP or to fully develop and defend an alternative picture.
Tyler Burge’s influential arguments have convinced most philosophers that a thinker can have a thought involving a particular concept without fully grasping or having mastery of that concept. In Burge’s (1979) famous example, a thinker who lacks mastery of the concept of arthritis nonetheless has thoughts involving that concept. It is generally supposed, however, that this phenomenon – incomplete understanding, for short – does not require us to reconsider in a fundamental way what it is for a thought to involve a particular concept. In this paper, I argue that the real significance of incomplete understanding has not been appreciated. To the extent that theorists of content address the phenomenon of thoughts involving incompletely grasped contents at all, they tend to assume that some hand-waving about deference to other thinkers who fully grasp the relevant concepts will take care of the inconvenient cases of incomplete understanding. The main lesson of Burge’s arguments is often taken to be that the content of language and thought is socially determined. On this picture, we do not need to change our basic view about what it is to have a concept; we just need to recognize that some thinkers can manage to have a concept by piggybacking on others. In contrast, on the view I defend, taking incomplete understanding seriously forces us to rethink some of our most basic assumptions about the nature of mental content. Deference is a red herring. The role of society in determining the content of thought is not the main lesson, but at most a useful clue as to the nature of mental and linguistic content.
The present article attempts to evaluate various tenets of moral philosophy by reviewing empirical data from the field of organizational justice bearing on: (a) people's concerns about fairness in organizations, and (b) the consequences of following or not following rules of justice. With respect to concerns about fairness in organizations, utilitarian claims that people believe that fairness requires distributions of reward based on merit were assessed. Similarly, evidence was reviewed bearing on the claim of psychological egoists that judgments of fairness reflect a self-interested bias. Finally, Kant's view that people should never lie was evaluated in light of evidence describing people's actual attitudes toward lying. In all three cases, the underlying philosophical premises were concluded to be overly simplistic in view of complexities about human nature revealed in empirical research. In addition, evidence was reviewed that shed light on the teleological claims of utilitarians regarding the purported benefits of following two common organizational practices — using merit-based pay, and punishing harmdoers. Here, too, the empirical evidence (findings that these techniques do not consistently yield the greatest good for the greatest number) suggests that the premise underlying utilitarian thought is based on unfounded, overly simplistic assumptions about the effects of following various allegedly moral ideas. It is concluded that our analyses provide a useful first step toward the desired goal of business ethicists of integrating prescriptive and descriptive approaches to justice.
David Chalmers has proposed several principles in his attack on the ‘hard problem’ of consciousness. One of these is the principle of organizational invariance, which he asserts is significantly supported by two thought experiments involving human brains and their functional silicon-based isomorphs. I claim that while the principle is an intelligible hypothesis and could possibly be true, his thought experiments fail to provide support for it.
In this paper, I deploy an argument that I have developed in a number of recent papers in the service of three projects. First, I show that the most influential version of legal positivism – that associated with H.L.A. Hart – fails. The argument’s engine is a requirement that a constitutive account of legal facts must meet. According to this rational-relation requirement, it is not enough for a constitutive account of legal facts to specify non-legal facts that modally determine the legal facts. The constitutive determinants of legal facts must provide reasons for the obtaining of the legal facts (in a sense of “reason” that I develop). I show that the Hartian account is unable to meet this requirement. That officials accept a rule of recognition does not by itself constitute a reason why the standards specified in that rule are part of the law of the community. I argue that it is false that understanding the explanatory significance of officials’ acceptance of a rule is part of our reflective understanding of the nature of law. The second project of the paper is to respond to a family of objections that challenge me to explain why normative facts and descriptive facts together are better placed to provide reasons for legal facts than descriptive facts alone. A unifying theme of the objections is that explanations have to stop somewhere; descriptive facts, it is suggested, are no worse a stopping place than normative facts. Third, the paper spells out a consequence of the rational-relation requirement: if an account of what, at the most basic level, determines legal facts is true in any possible legal system, it is true in all possible legal systems. For example, if a Hartian account of legal facts is true in any possible legal system, it is true in all possible legal systems. I use this all-or-nothing result in my critique of a Hartian account, but the result is of interest in its own right.
The accounting profession is concerned with the ethical beliefs of its members. To this end, the authors surveyed public accountants, questioning them about the AICPA's Code of Professional Conduct and their perceptions of how potentially unethical behaviors impact the firm. The paper focuses on respondents' perceptions of the impact on the firm's practice, image and degree of concern. Public accountants appear to agree with the AICPA's Code of Professional Ethics. Their mean responses indicate they believe the Code components are important and extremely important. Some Code components were significantly more important than others, especially demonstrating professionalism and maintaining independence while performing independent audits. Gender, role and organizational level all had significant effects on the importance of the Code. Males, non-auditors and upper management all expressed stronger beliefs in the importance of the overall Code and its components.
The view (most prominently advocated by Justice Scalia) that original meaning entails the constitutionality of original practices has strong intuitive appeal and has been broadly assumed by originalists and nonoriginalists alike. But the position is mistaken. We suggest that a failure to distinguish between two different notions of meaning accounts for the position's wide currency. According to the first notion, the meaning of a term is roughly what a dictionary definition attempts to convey – the semantic or linguistic understanding necessary to use the term, as opposed to nonlinguistic facts about the objects or activities to which the term applies. In contrast, according to the second, looser notion, the meaning of a term incorporates the objects or activities to which the term is applied. The first notion lies behind originalism's theoretical force; it is untenable that the meaning of the Constitution in the first sense could evolve. In sharp contrast, it is not only tenable but inevitable that changes occur over time in the class of things to which a constitutional provision is applied. Once recognized, the distinction undermines the seemingly natural move from the necessity of interpreting the Constitution in accordance with how it was originally understood to the necessity of upholding practices originally understood to be constitutional. By taking the distinction on board and rejecting the assumption, originalism can readily deflect the challenges based on unacceptable original practices; as a consequence, however, it will not be tenable for originalism, in any case challenging an original practice, simply to rule out the possibility of the practice's invalidity.
A sample survey of members of the Association of Environmental and Resource Economists (AERE) found relatively low rates of obvious ethical misconduct, such as data fabrication and falsification, and higher rates of dubious behaviors, such as deliberate overstatement of positive and understatement of negative results. AERE members reported that job-related pressures – including competition with peers, pressure due to professional implication and on-the-job pressure – were the most important causes. The most effective preventive measures, according to respondents, were discussion of ethics in existing classes, codes of ethics, and short courses at professional meetings. The vast majority of AERE members were against government audits and regulations.
Hume begins the Treatise of Human Nature by announcing the goal of developing a science of man; by the end of Book 1 of the Treatise, the science of man seems to founder in doubt. Underlying the tension between Hume's constructive ambition – his 'naturalism' – and his doubts about that ambition – his 'skepticism' – is the question of whether Hume is justified in continuing his philosophical project. In this paper, I explain how this question emerges in the final section of Book 1 of the Treatise, the 'Conclusion of this Book', then examine Janet Broughton's and Don Garrett's answers to it, and conclude by sketching a different approach to this question.
In this paper, I criticize an influential understanding of naturalization according to which work on traditional problems in the philosophy of law should be replaced with sociological or psychological explanations of how judges decide cases. W.V. Quine famously proposed the “naturalization of epistemology.” Quine argued that we should replace certain traditional philosophical inquiries into the justification of our beliefs with empirical psychological inquiry into how we actually form beliefs. In a prominent series of papers and a forthcoming book, Brian Leiter has raised the intriguing idea that Quine’s naturalization of epistemology is a useful model for philosophy of law. I examine Quine’s naturalization of epistemology and Leiter’s suggested parallel. I argue that the parallel does not hold up. I show that, granting Leiter’s substantive assumption that the law is indeterminate, there is no philosophical confusion or overreaching in the legal case that is parallel to the philosophical overreaching of foundationalism in epistemology. Moreover, if we take seriously Leiter’s analogy between, on the one hand, the justification of belief in scientific theories and, on the other, the justification of decisions in legal cases, the result is almost the opposite of what Leiter suggests. The closest parallel in the legal case to Quine’s position would be the rejection of the philosophical positions that lead to the indeterminacy thesis. Finally, the conclusion that law is indeterminate could not establish the bankruptcy of philosophical investigation into the relation between the grounds of law and the content of the law. After all, the argument for that conclusion depends on a philosophical account of the relation between the grounds of law and the content of law. The argument therefore presupposes that that relation is an appropriate subject for philosophical inquiry.
The cyclic nature of speech production, as manifested in the syllabic organization of spoken language, is likely to reflect general properties of sensori-motor integration rather than merely a phylogenetic progression from mastication, teeth chattering, and lipsmacks. The temporal properties of spontaneous speech reflect the entropy of its underlying constituents and are optimized for rapid transmission and decoding of linguistic information conveyed by a complex constellation of acoustic and visual cues, suggesting that the dawn of human language may have occurred when the articulatory cycle was efficiently yoked to the temporal dynamics of sensory coding and rapid retrieval from referential memory.
Terror management theory and research can rectify shortcomings in Atran & Norenzayan's (A&N's) analysis of religion. (1) Religious and secular worldviews are much more similar than the target article supposes; (2) a propensity for embracing supernatural beliefs is likely to have conferred an adaptive advantage over the course of evolution; and (3) the claim that supernatural agent beliefs serve a terror management function independent of worldview bolstering is not empirically supported.
Bering's analysis is inadequate because it fails to consider past and present adult soul beliefs and the psychological functions they serve. We suggest that a valid folk psychology of souls must consider features of adult soul beliefs, the unique problem engendered by awareness of death, and terror management findings, in addition to cognitive inclinations toward dualistic and teleological thinking.
Fodor’s asymmetric-dependence theory of content is probably the best known and most developed causal or informational theory of mental content. Many writers have attempted to provide counterexamples to Fodor’s theory. In this paper, I offer a more fundamental critique. I begin by attacking Fodor’s view of the dialectical situation. Fodor’s theory is cast in terms of laws covering the occurrence of an individual thinker’s mental symbols. I show that, contrary to Fodor’s view, we cannot restrict consideration to hypothetical cases in which his conditions for content are satisfied, but must consider whether the relevant laws exhibit the specified asymmetric-dependence relations in actual cases. My central argument is that the laws that the theory requires do not in fact exhibit the appropriate asymmetric-dependence relations. I show that, in general, part of the mechanism for the crucial, supposedly content-determining law for a mental symbol is not shared by the mechanisms for the other laws covering the occurrence of the same mental symbol. As a result, the former law can be eliminated (by eliminating the non-overlapping part of the mechanism) without eliminating the latter laws. The latter laws do not asymmetrically depend on the former law.
The role of transcendental idealism in Kant's theory of knowledge has been both deliberately underrated and inadvertently exaggerated. If conceivably not necessary, its role in Kant's explanation of the possibility of a priori knowledge in the Critique of Pure Reason is at least pivotal to the success of the explanation. On the other hand, though transcendental idealism depends on Kant's epistemological criterion of an existing object, or, simply, his criterion of existence, the criterion for its part is actually independent of the idealism. In fact, it may be because this independence has hardly been recognized that commentators have been unaware of the role the criterion may actually be playing in the continuing controversy over the correct interpretation of the idealism. Altogether, this article addresses both shortcomings – the underestimation and the exaggeration of the role of the idealism in Kant's epistemology. While it places the idealism at the centre of the epistemology, it also separates the criterion of existence from the idealism. In highlighting this contrast, the article explains how the criterion may actually be contributing to the persistence of the ongoing dispute over the correct interpretation of the idealism.
In a recent paper, Eckart Förster challenges interpreters to explain why in the first Critique practical reason has a canon but no dialectic, whereas in the second Critique, there is not only a dialectic, but an antinomy of practical reason. In the Groundwork, Kant claims that there is a natural dialectic with respect to morality (4:405), a different claim from those advanced in the first and second Critiques. Förster's challenge may therefore be reformulated as the problem of explaining why practical reason has a canon in the first Critique, a dialectic in the Groundwork, and an antinomy in the second Critique. In this paper, I answer this challenge. I argue that these differences are due to the different aims and scope of the works, and in particular, the different place of the inclinations in their arguments.
Darwinian theories of culture need to show that they improve upon the commonsense view that cultural change is explained by humans' skillful pursuit of their conscious goals. In order for meme theory to pull its weight, it is not enough to show that the development and spread of an idea is, broadly speaking, Darwinian, in the sense that it proceeds by the accumulation of change through the differential survival and transmission of varying elements. It could still be the case that the best explanation of why the idea has developed and spread is the conscious pursuit of human goals. Meme theory has the potential to do explanatory work in diverse ways. It can challenge the goal-based account of cultural change directly. Other possibilities for meme theory include explaining the acquisition of our goals and showing that memes and genes evolve together, each affecting the selective forces acting on the other. Raising the question of meme theory's explanatory payoff brings out the importance of the 'selfish-meme' idea and the idea of non-content biases. Both have the potential to challenge the claim that our goals are in the driver's seat. In order to show that a Darwinian theory of culture is more than an idle redescription, however, it is necessary to make the case that it offers explanatory gain over its competitors, in particular over the common sense goal-based account.