This now-classic work challenges what Ryle calls philosophy's "official theory," the Cartesian "myth" of the separation of mind and matter. Ryle's linguistic analysis remaps the conceptual geography of mind, not so much solving traditional philosophical problems as dissolving them into the mere consequences of misguided language. His plain language and essentially simple purpose place him in the tradition of Locke, Berkeley, Mill, and Russell.
The Concept of Law is the most important and original work of legal philosophy written this century. First published in 1961, it is considered the masterpiece of H.L.A. Hart's enormous contribution to the study of jurisprudence and legal philosophy. Its elegant language and balanced arguments have sparked wide debate and unprecedented growth in the quantity and quality of scholarship in this area--much of it devoted to attacking or defending Hart's theories. Principal among Hart's critics is the renowned lawyer and political philosopher Ronald Dworkin, who in the 1970s and 80s mounted a series of challenges to Hart's Concept of Law. Hart seemed to let these challenges go unanswered until, after his death in 1992, his answer to Dworkin's criticism was discovered among his papers. In this valuable and long-awaited new edition, Hart presents an Epilogue in which he answers Dworkin and some of his other most influential critics, including Fuller and Finnis. Written with the same clarity and candor for which the first edition is famous, the Epilogue offers a sharper interpretation of Hart's own views, rebuts the arguments of critics like Dworkin, and powerfully asserts that they have based their criticisms on a faulty understanding of Hart's work. Hart demonstrates that Dworkin's views are in fact strikingly similar to his own. In the final analysis, Hart's response leaves Dworkin's criticisms considerably weakened and his positions largely in question. Containing Hart's final and powerful response to Dworkin in addition to the revised text of the original Concept of Law, this thought-provoking and persuasively argued volume is essential reading for lawyers and philosophers throughout the world.
The principal aim of this book is to develop and defend an analysis of the concept of moral obligation. The analysis is neutral regarding competing substantive theories of obligation, whether consequentialist or deontological in character. What it seeks to do is generate solutions to a range of philosophical problems concerning obligation and its application. Amongst these problems are deontic paradoxes, the supersession of obligation, conditional obligation, prima facie obligation, actualism and possibilism, dilemmas, supererogation, and cooperation. By virtue of its normative neutrality, the analysis provides a theoretical framework within which competing theories of obligation can be developed and assessed. This study is a major contribution to metaethics that will be of particular interest to all philosophers concerned with normative ethical theory.
This paper explores how the diagnosis of mental disorder may affect the diagnosed subject’s self-concept by supplying an account that emphasizes the influence of autobiographical and social narratives on self-understanding. It focuses primarily on diagnoses made according to the criteria provided by the Diagnostic and Statistical Manual of Mental Disorders (DSM), and suggests that a DSM diagnosis may function as a source of narrative that affects the subject’s self-concept. Engaging in this analysis by appealing to autobiographies and memoirs written by people diagnosed with mental disorder, the paper concludes that a DSM diagnosis is a double-edged sword for self-concept. On the one hand, it sets the subject’s experience in an established classificatory system which can facilitate self-understanding by providing insight into the subject’s condition and guiding her personal growth, as well as treatment and recovery. In this sense, the DSM diagnosis may have positive repercussions on self-development. On the other hand, however, given the DSM’s symptom-based approach and its adoption of the Biomedical Disease model, a diagnosis may force the subject to make sense of her condition divorced from other elements in her life that may be affecting her mental health. It may lead her to frame her experience only as an irreversible imbalance. This form of self-understanding may set limits on the subject’s hopes of recovery and may create impediments to her flourishing.
The Concept of Law is one of the most influential texts in English-language jurisprudence. Fifty years after its first publication its relevance has not diminished, and in this third edition Leslie Green adds an introduction that places the book in a contemporary context, highlighting key questions about Hart's arguments and outlining the main debates it has prompted in the field. The complete text of the second edition is replicated here, including Hart's Postscript, with fully updated notes to include modern references and further reading.
Of course we all know now that mathematics has proved that logic doesn't really make sense, but Etchemendy (philosophy, Stanford Univ.) goes further and challenges the received view of the conceptual underpinnings of modern logic by arguing that Tarski's model-theoretic analysis of logical consequence is wrong. He may have found the soft underbelly of the dead horse. Annotation copyrighted by Book News, Inc., Portland, OR.
The concept of mechanism in biology has three distinct meanings. It may refer to a philosophical thesis about the nature of life and biology (‘mechanicism’), to the internal workings of a machine-like structure (‘machine mechanism’), or to the causal explanation of a particular phenomenon (‘causal mechanism’). In this paper I trace the conceptual evolution of ‘mechanism’ in the history of biology, and I examine how the three meanings of this term have come to be featured in the philosophy of biology, situating the new ‘mechanismic program’ in this context. I argue that the leading advocates of the mechanismic program (i.e., Craver, Darden, Bechtel, etc.) inadvertently conflate the different senses of ‘mechanism’. Specifically, they all inappropriately endow causal mechanisms with the ontic status of machine mechanisms, and this invariably results in problematic accounts of the role played by mechanism-talk in scientific practice. I suggest that for effective analyses of the concept of mechanism, causal mechanisms need to be distinguished from machine mechanisms, and the new mechanismic program in the philosophy of biology needs to be demarcated from the traditional concerns of mechanistic biology.
The Concept of Physical Law is an original and creative defense of the Regularity theory of physical law, the concept that physical laws are nothing more than descriptions of whatever universal truths happen to be instanced in nature. Professor Swartz clearly identifies and analyzes the arguments and intuitions of the opposing Necessitarian theory, and argues that the standard objection to the Regularity theory turns on a mistaken view of what Regularists mean by 'physical impossibility'; that it is impossible to construct an empirical test that can distinguish between events Necessitarians call 'mere accidents' and those they call 'nomologically necessary'; and that the Necessitarian theory cannot account for human beings' free will. Other topics in this important work include: the distinction between instrumental scientific laws and true physical laws; the distinction between failure and doom; potentialities; miracles and marvels; predictability and uniformity; statistical and numerical laws; and necessity-in-praxis.
Abstract: There is a long tradition of trying to analyze art either by providing a definition (essentialism) or by tracing its contours as an indefinable, open concept (anti-essentialism). Both art essentialists and art anti-essentialists share an implicit assumption of art concept monism. This article argues that this assumption is a mistake. Species concept pluralism—a well-explored position in philosophy of biology—provides a model for art concept pluralism. The article explores the conditions under which concept pluralism is appropriate, and argues that they obtain for art. Art concept pluralism allows us to recognize that different art concepts are useful for different purposes, and what were once feuding definitions can be seen as characterizations of specific art concepts.
Abstract: Human rights developed in response to specific violations of human dignity, and can therefore be conceived as specifications of human dignity, their moral source. This internal relationship explains the moral content and moreover the distinguishing feature of human rights: they are designed for an effective implementation of the core moral values of an egalitarian universalism in terms of coercive law. This essay is an attempt to explain this moral-legal Janus face of human rights through the mediating role of the concept of human dignity. This concept is due to a remarkable generalization of the particularistic meanings of those "dignities" that once were attached to specific honorific functions and memberships. In spite of its abstract meaning, "human dignity" still retains from its particularistic precursor concepts the connotation of depending on the social recognition of a status—in this case, the status of democratic citizenship. Only membership in a constitutional political community can protect, by granting equal rights, the equal human dignity of everybody.
In this paper I argue that Frege’s concept horse paradox is not easily avoided. I do so without appealing to Wright’s Reference Principle. I then use this result to show that Hale and Wright’s recent attempts to avoid this paradox by rejecting or otherwise defanging the Reference Principle are unsuccessful.
Contemporary debates about the nature of semantic reference have tended to focus on two competing approaches: theories which emphasize the importance of descriptive information associated with a referring term, and those which emphasize causal facts about the conditions under which the use of the term originated and was passed on. Recent empirical work by Machery and colleagues suggests that both causal and descriptive information can play a role in judgments about the reference of proper names, with findings of cross-cultural variation in judgments that imply differences between individuals with respect to whether they favor causal or descriptive information in making reference judgments. We extend this theoretical and empirical line of inquiry to views of the reference of natural and nominal kind concepts, which face similar challenges to those concerning the reference of proper names. In two experiments, we find evidence that both descriptive and causal factors contribute to judgments of concept reference, with no reliable differences between natural and nominal kinds. Moreover, we find evidence that the same individuals’ judgments can rely on both descriptive and causal information, such that variation between individuals cannot be explained by appeal to a mixed population of “pure descriptive theorists” and “pure causal theorists.” These findings suggest that the contrast between descriptive and causal theories of reference may be inappropriate; intuitions may instead support a hybrid theory of reference that includes both causal and descriptive factors. We propose that future research should focus on the relationship between these factors, and describe several possible frameworks for pursuing these issues. Our findings have implications for theories of semantic reference, as well as for theories of conceptual structure.
We focus on issues of learning assessment from the point of view of an investigation of philosophical elements in teaching. We contend that assessment of concept possession at school based on ordinary multiple-choice tests might be ineffective because it overlooks aspects of human rationality illuminated by Robert Brandom’s inferentialism––the view that conceptual content largely coincides with the inferential role of linguistic expressions used in public discourse. More particularly, we argue that multiple-choice tests at schools might fail to accurately assess the possession of a concept or the lack of it, for they only check the written outputs of the pupils who take them, without detecting the inferences actually endorsed or used by them. We suggest that school tests would acquire reliability if they enabled pupils to make explicit the reasons for their answers or the inferences they use, so as to contribute to what Brandom calls the game of giving and asking for reasons. We explore the possibility of putting this suggestion into practice by deploying two-tier multiple-choice tests.
Radical concept nativism is the thesis that virtually all lexical concepts are innate. Notoriously endorsed by Jerry Fodor (1975, 1981), radical concept nativism has had few supporters. However, it has proven difficult to say exactly what’s wrong with Fodor’s argument. We show that previous responses are inadequate on a number of grounds. Chief among these is that they typically do not achieve sufficient distance from Fodor’s dialectic, and, as a result, they do not illuminate the central question of how new primitive concepts are acquired. To achieve a fully satisfactory response to Fodor’s argument, one has to juxtapose questions about conceptual content with questions about cognitive development. To this end, we formulate a general schema for thinking about how concepts are acquired and then present a detailed illustration.
Concept empiricists are committed to the claim that the vehicles of thought are re-activated perceptual representations. Evidence for empiricism comes from a range of neuroscientific studies showing that perceptual regions of the brain are employed during cognitive tasks such as categorization and inference. I examine the extant neuroscientific evidence and argue that it falls short of establishing this core empiricist claim. During conceptual tasks, the causal structure of the brain produces widespread activity in both perceptual and non-perceptual systems. I lay out several conditions on what is required for a neural state to be a realizer of the functional role played by concepts, and argue that no subset of this activity can be singled out as the unique neural vehicle of conceptual thought. Finally, I suggest that, while the strongest form of empiricism is probably false, the evidence is consistent with several weaker forms of empiricism.
The use of informational terms is widespread in molecular and developmental biology. The usage dates back to Weismann. In both protein synthesis and in later development, genes are symbols, in that there is no necessary connection between their form (sequence) and their effects. The sequence of a gene has been determined, by past natural selection, because of the effects it produces. In biology, the use of informational terms implies intentionality, in that both the form of the signal, and the response to it, have evolved by selection. Where an engineer sees design, a biologist sees natural selection.
This article examines Gilles Deleuze’s concept of the simulacrum, which Deleuze formulated in the context of his reading of Nietzsche’s project of “overturning Platonism.” The essential Platonic distinction, Deleuze argues, is more profound than the speculative distinction between model and copy, original and image. The deeper, practical distinction moves between two kinds of images or eidolon, for which the Platonic Idea is meant to provide a concrete criterion of selection. “Copies” or icons (eikones) are well-grounded claimants to the transcendent Idea, authenticated by their internal resemblance to the Idea, whereas “simulacra” (phantasmata) are like false claimants, built on a dissimilarity and implying an essential perversion or deviation from the Idea. If the goal of Platonism is the triumph of icons over simulacra, the inversion of Platonism would entail an affirmation of the simulacrum as such, which must thus be given its own concept. Deleuze consequently defines the simulacrum in terms of an internal dissimilitude or “disparateness,” which in turn implies a new conception of Ideas, no longer as self-identical qualities (the auto kath’hauto), but rather as constituting a pure concept of difference. An inverted Platonism would necessarily be based on a purely immanent and differential conception of Ideas. Starting from this new conception of the Idea, Deleuze proposes to take up the Platonic project anew, rethinking the fundamental figures of Platonism (selection, repetition, ungrounding, the question-problem complex) on a purely differential basis. In this sense, Deleuze’s inverted Platonism can at the same time be seen as a rejuvenated Platonism and even a completed Platonism.
The central importance of Marx's concept of nature in the formulation of historical materialism has been largely neglected in the extensive literature on Marx. Alfred Schmidt, philosophical successor to Max Horkheimer and Theodor Adorno in Frankfurt, seeks to elucidate it in this original study.
The Concept of Time presents the reconstructed text of a lecture delivered by Martin Heidegger to the Marburg Theological Society in 1924. It offers a fascinating insight into the developmental years leading up to the publication, in 1927, of his magnum opus Being and Time, itself one of the most influential philosophical works this century. In The Concept of Time Heidegger introduces many of the central themes of his analyses of human existence which were subsequently incorporated into Being and Time, themes such as Dasein, Being-in-the-world, everydayness, disposition, care, authenticity, death, uncanniness, temporality and historicity. Starting out by asking: What is time?, Heidegger proceeds to radicalise the concept of time and our relation to it, ending with the question: Are we ourselves time? Am I time?
The science of metrology characterizes the concept of precision in exceptionally loose and open terms. That is because the details of the concept must be filled in—what I call narrowing of the concept—in ways that are sensitive to the details of a particular measurement or measurement system and its use. Since these details can never be filled in completely, the concept of the actual precision of an instrument system must always retain some of the openness of its general characterization. The idea that there is something that counts as the actual precision of a measurement system must therefore always remain an idealization, a conclusion that would appear to hold very broadly for terms and the concepts they express.
Sartori (1970) warned a long time ago of the danger of concept stretching for effective and cumulative theory building. Such concept stretching has happened with regard to deliberation, which has become a very faddish term. For theoretically well-founded empirical research it is better conceptually to distinguish clearly between strategic bargaining and deliberation, although in the empirical political world the two concepts are usually heavily intertwined. Keywords: deliberation; concept stretching; strategic bargaining.
Ralph Johnson's Manifest Rationality (2000) is a major contribution to the field of informal logic, but the concept of argument that is central to its project suffers from a tension between the components that comprise it. This paper explores and addresses that tension by examining the implications of each of five aspects of the definition of ‘argument’.
In feminist theory, intersectionality has become the predominant way of conceptualizing the relation between systems of oppression which construct our multiple identities and our social locations in hierarchies of power and privilege. The aim of this essay is to clarify the origins of intersectionality as a metaphor, and its theorization as a provisional concept in Kimberlé Williams Crenshaw’s work, followed by its uptake and mainstreaming as a paradigm by feminist theorists in a period marked by its widespread and rather unquestioned--if, at times, superficial and inattentive--usage. I adduce four analytic benefits of intersectionality as a research paradigm: simultaneity, complexity, irreducibility and inclusivity. Then, I gesture at, and respond to some critiques of intersectionality advanced in the last few years, during which the concept has increasingly come under scrutiny.
Excerpt from The Concept of Morals: In morals finally we have the doctrine of ethical relativity. It is the same story over again. Morality is doubtless human. It has not descended upon us out of the sky. It has grown out of human nature, and is relative to that nature. Nor could it have, apart from that nature, any meaning whatever. This we must accept. But if this is interpreted to mean that whatever any social group thinks good is good (for that group), that there is no common standard, and that consequently any one moral code is as good as any other, then this relativism in effect denies the difference between good and evil altogether, and makes meaningless the idea of progress in moral conceptions. About the Publisher: Forgotten Books publishes hundreds of thousands of rare and classic books. Find more at www.forgottenbooks.com. This book is a reproduction of an important historical work. Forgotten Books uses state-of-the-art technology to digitally reconstruct the work, preserving the original format whilst repairing imperfections present in the aged copy. In rare cases, an imperfection in the original, such as a blemish or missing page, may be replicated in our edition. We do, however, repair the vast majority of imperfections successfully; any imperfections that remain are intentionally left to preserve the state of such historical works.
According to Ernst Cassirer, the transition from the concept of substance to that of mathematical function as a guide of knowledge coincided with the end of ancient and the beginning of modern theoretical thought. In the first part of this article we argue that a similar transition has also taken place in the practical sphere, where mathematical function occurs in one of its specific forms, which is that of the algorithm concept. In the second part we argue that with the rise of modernity the idea of substance and the related concepts of category and classification, which are deeply embedded in western culture, have not been totally supplanted by that of function. The intertwining of the concepts of substance and function has generated contradictory hybrids. These hybrids are used as a key for the understanding of the different repercussions of algorithmic logic on society in terms of social integration.
Introduced into the philosophical lexicon during the Eighteenth Century, the term ‘aesthetic’ has come to be used to designate, among other things, a kind of object, a kind of judgment, a kind of attitude, a kind of experience, and a kind of value. For the most part, aesthetic theories have divided over questions particular to one or another of these designations: whether artworks are necessarily aesthetic objects; how to square the allegedly perceptual basis of aesthetic judgments with the fact that we give reasons in support of them; how best to capture the elusive contrast between an aesthetic attitude and a practical one; whether to define aesthetic experience according to its phenomenological or representational content; how best to understand the relation between aesthetic value and aesthetic experience. But questions of a more general nature have lately arisen, and these have tended to have a skeptical cast: whether any use of ‘aesthetic’ may be explicated without appeal to some other; whether agreement respecting any use is sufficient to ground meaningful theoretical agreement or disagreement; whether the term ultimately answers to any legitimate philosophical purpose that justifies its inclusion in the lexicon. The skepticism expressed by such general questions did not begin to take hold until the later part of the 20th century, and this fact prompts the question whether (a) the concept of the aesthetic is inherently problematic and it is only recently that we have managed to see that it is, or (b) the concept is fine and it is only recently that we have become muddled enough to imagine otherwise. Adjudicating between these possibilities requires a vantage from which to take in both early and late theorizing on aesthetic matters.
The concept of the dialogical soul proposed by Joseph Ratzinger is a contemporary attempt to describe the anthropology of humanity in terms of basic, fundamental theological concepts. The epistemological approach of the dialogical soul concerns not the division, but the co-existence, within a single concept of humanity, of significantly different anthropological conceptions. Modern neuroscience, although following completely different paths of knowing, is currently addressing the important issue of "the embodied mind". Such holistic efforts to discover the truth about the human being, though carried out on completely different epistemological platforms, have some points in common. The difficulty of finding a common language in this field can be overcome, leading to a dialogue that is extremely difficult but achievable. We must, however, first formulate certain fundamental axioms that define the classes of concepts used in the different areas of scientific activity. Ratzinger’s concept of the dialogical soul transcends the barriers set by the axioms of scientific paradigms. It does not stop at a vision of the human being in isolation, but recognizes the person within a web of relationships, and thereby makes room for a substantial dialogue with the world of modern science.
This article examines the meaning and significance of the concept of constituent power in constitutional thought by showing how it acts as a boundary concept with respect to three types of legal thought: normativism, decisionism and relationalism. The concept can be fully appreciated, it suggests, only by adopting a relationalist method. This relationalist method permits us to deal with the paradoxical aspects of constitutional founding creatively and to grasp how constituent power, as the generative aspect of the political power relationship, works not only at founding moments but also within the dynamics of constitutional development. Relationalism realizes this ambition by exposing the tension between unity and hierarchy in constitutional foundation and the tension between the people-as-one and the people-as-the-governed in the course of constitutional development. It contends, contrary to normativist claims, that constituent power remains a central concept of constitutional thought.
First published in 1949, Gilbert Ryle’s The Concept of Mind is one of the classics of twentieth-century philosophy. Described by Ryle as a ‘sustained piece of analytical hatchet-work’ on Cartesian dualism, The Concept of Mind is a radical and controversial attempt to jettison once and for all what Ryle called ‘the ghost in the machine’: Descartes’ argument that mind and body are two separate entities. This sixtieth anniversary edition includes a substantial commentary by Julia Tanney and is essential reading for new readers interested not only in the history of analytic philosophy but in its power to challenge major currents in philosophy of mind and language today.
In the first section, I consider what several logicians say informally about the notion of logical consequence. There is significant variation among these accounts, they are sometimes poorly explained, and some of them are clearly at odds with the usual technical definition. In the second section, I first argue that a certain kind of informal account—one that includes elements of necessity, generality, and apriority—is approximately correct. Next I refine this account and consider several important questions about it, including the appropriate characterization of necessity, the criterion for selecting logical constants, and the exact role of apriority. I argue, among other things, that there is no need to recognize a special logical sense of necessity and that the selection of terms to serve as logical constants is ultimately a pragmatic matter. In the third section, I consider whether the informal account I have presented and defended is adequately represented by the usual technical definition. I show that it is, and provably so, for certain limited ways of selecting logical constants. In the general case, however, there seems to be no way to be sure that the technical and informal accounts coincide.
Advances in molecular biological research in the latter half of the twentieth century have made the story of the gene vastly complicated: the more we learn about genes, the less sure we are of what a gene really is. Knowledge about the structure and functioning of genes abounds, but the gene has also become curiously intangible. This collection of essays renews the question: what are genes? Philosophers, historians and working scientists re-evaluate the question in this volume, treating the gene as a focal point of interdisciplinary and international research. It will be of interest to professionals and students in the philosophy and history of science, genetics and molecular biology.
In recent years, academics and educators have begun to use software mapping tools for a number of education-related purposes. Typically, the tools are used to help impart critical and analytical skills to students, to enable students to see relationships between concepts, and also as a method of assessment. The common feature of all these tools is the use of diagrammatic relationships of various kinds in preference to written or verbal descriptions. Pictures and structured diagrams are thought to be more comprehensible than just words, and a clearer way to illustrate understanding of complex topics. Variants of these tools are available under different names: “concept mapping”, “mind mapping” and “argument mapping”. Sometimes these terms are used synonymously. However, as this paper will demonstrate, there are clear differences in each of these mapping tools. This paper offers an outline of the various types of tool available and their advantages and disadvantages. It argues that the choice of mapping tool largely depends on the purpose or aim for which the tool is used and that the tools may well be converging to offer educators as yet unrealised and potentially complementary functions.
The creation of transgenic animals by means of modern techniques of genetic manipulation is evaluated in the light of different interpretations of the concept of intrinsic value. The zoocentric interpretation, emphasizing the suffering of individual, sentient animals, is described as an extension of the anthropocentric interpretation. In a biocentric or ecocentric approach the concept of intrinsic value first of all denotes independence of humans and a non-instrumental relation to animals. In the zoocentric approach of Bernard Rollin, genetic engineering is seen as a morally neutral tool, as long as the animal does not suffer as a result of it. Robert Colwell, who defends an ecocentric ethic, makes a sharp distinction between wild animals and domesticated animals. Genetic manipulation of wild species is a serious moral issue, in contrast to genetic manipulation of domesticated species, which is no problem at all for Colwell. Neither author takes the species-specific nature (or telos) of domesticated animals seriously. When domestication is seen as a process between the two poles of the wild animal and the human construct (which can be patented), the technique of genetic manipulation can only be seen as a further encroachment upon the intrinsic value of animals. At the level of molecular biology, the concept of an animal's telos loses its meaning.
In this chapter we argue that our concept of time is a functional concept: time is whatever it is that plays the time role, and we spell out what we take the time role to consist in. We evaluate this proposal against a number of other analyses of our concept of time, and argue that it better explains various features of our dispositions as speakers and our practices as agents.
This book examines the birth of the scientific understanding of motion. It investigates which logical tools and methodological principles had to be in place to give a consistent account of motion, and which mathematical notions were introduced to gain control over conceptual problems of motion. It shows how the idea of motion raised two fundamental problems in the fifth and fourth centuries BCE: bringing together being and non-being, and bringing together time and space. The first problem leads to the exclusion of motion from the realm of rational investigation in Parmenides, the second to Zeno's paradoxes of motion. Methodological and logical developments reacting to these puzzles are shown to be present implicitly in the atomists, and explicitly in Plato, who also employs mathematical structures to make motion intelligible. With Aristotle we finally see the first outline of the fundamental framework with which we conceptualise motion today.
Our primary focus is on analysis of the concept of voluntariness, with a secondary focus on the implications of our analysis for the concept and the requirements of voluntary informed consent. We propose that two necessary and jointly sufficient conditions must be satisfied for an action to be voluntary: intentionality, and substantial freedom from controlling influences. We reject authenticity as a necessary condition of voluntary action, and we note that constraining situations may or may not undermine voluntariness, depending on the circumstances and the psychological capacities of agents. We compare and evaluate several accounts of voluntariness and argue that our view, unlike other treatments in bioethics, is not a value-laden theory. We also discuss the empirical assessment of individuals' perceptions of the degrees of noncontrol and self-control. We propose use of a particular Decision Making Control Instrument. Empirical research using this instrument can provide data that will help establish appropriate policies and procedures for obtaining voluntary consent to research.
This volume makes available in English for the first time Adorno’s lectures on metaphysics. It provides a unique introduction not only to metaphysics but also to Adorno’s own intellectual standpoint, as developed in his major work Negative Dialectics. Metaphysics for Adorno is defined by a central tension between concepts and immediate facts. Adorno traces this dualism back to Aristotle, whom he sees as the founder of metaphysics. In Aristotle it appears as an unresolved tension between form and matter. This basic split, in Adorno’s interpretation, runs right through the history of metaphysics. Perhaps not surprisingly, Adorno finds this tension resolved in the Hegelian dialectic. Underlying this dualism is a further dichotomy, which Adorno sees as essential to metaphysics: while it dissolves belief in transcendental worlds by thought, at the same time it seeks to rescue belief in a reality beyond the empirical, again by thought. It is to this profound ambiguity, for Adorno, that the metaphysical tradition owes its greatness. The major part of these lectures, given by Adorno late in his life, is devoted to a critical exposition of Aristotle’s thought, focusing on its central ambiguities. In the last lectures, Adorno’s attention switches to the question of the relevance of metaphysics today, particularly after the Holocaust. He finds in metaphysical experiences, which transcend rational discourse without lapsing into irrationalism, a last precarious refuge of the humane truth to which his own thought always aspired. This volume will be essential reading for anyone interested in Adorno’s work and will be a valuable text for students and scholars of philosophy and social theory.