Robustness has long been recognized as an important parameter for evaluating game-theoretic results, but talk of ‘robustness’ generally remains vague. What we offer here is a graphic measure for a particular kind of robustness (‘matrix robustness’), using a three-dimensional display of the universe of 2 × 2 game theory. In such a measure specific games appear as specific volumes (Prisoner’s Dilemma, Stag Hunt, etc.), allowing a graphic image of the extent of particular game-theoretic effects in terms of those games. The measure also allows for an easy comparison between different effects in terms of matrix robustness. Here we use the measure to compare the robustness of Tit for Tat’s well-known success in spatialized games (Axelrod, R. (1984). The evolution of cooperation. New York: Basic Books; Grim, P. et al. (1998). The philosophical computer: Exploratory essays in philosophical computer modeling. Cambridge, Mass.: MIT Press) with the robustness of a recent game-theoretic model of the contact hypothesis regarding prejudice reduction (Grim et al. (2005). Public Affairs Quarterly, 19, 95–125).
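The Tit for Tat strategy whose robustness this abstract examines is simple enough to state in a few lines. As a loose illustration only, and not the authors’ matrix-robustness measure, here is a minimal sketch of Tit for Tat in an iterated 2 × 2 Prisoner’s Dilemma; the payoff values (T=5, R=3, P=1, S=0) are the conventional ones, assumed rather than taken from the paper:

```python
# Hypothetical sketch: Tit for Tat in an iterated 2x2 Prisoner's Dilemma.
# Payoff values (T=5, R=3, P=1, S=0) are conventional assumptions.
PAYOFF = {('C', 'C'): (3, 3), ('C', 'D'): (0, 5),
          ('D', 'C'): (5, 0), ('D', 'D'): (1, 1)}

def tit_for_tat(opponent_history):
    """Cooperate first, then copy the opponent's previous move."""
    return 'C' if not opponent_history else opponent_history[-1]

def always_defect(opponent_history):
    return 'D'

def play(strategy_a, strategy_b, rounds=10):
    """Return cumulative payoffs for two strategies over repeated play."""
    hist_a, hist_b = [], []   # each records the *opponent's* past moves
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strategy_a(hist_a)
        move_b = strategy_b(hist_b)
        pay_a, pay_b = PAYOFF[(move_a, move_b)]
        score_a += pay_a
        score_b += pay_b
        hist_a.append(move_b)
        hist_b.append(move_a)
    return score_a, score_b

print(play(tit_for_tat, always_defect))  # TFT concedes only the first round
```

Against an unconditional defector, Tit for Tat loses only the opening round and then matches defection; against itself it cooperates throughout, which is the cooperative success that the spatialized-game results build on.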
Book Symposium on Don Ihde’s Expanding Hermeneutics: Visualism in Science. Journal article (Book Symposium), pp. 1–22, DOI 10.1007/s13347-011-0060-5. Authors: Jan Kyrre Berg Olsen Friis (University of Copenhagen, Denmark); Larry A. Hickman (The Center for Dewey Studies, Southern Illinois University Carbondale, USA); Robert Rosenberger (School of Public Policy, Georgia Institute of Technology, USA); Robert C. Scharff (University of New Hampshire, USA); Don Ihde (Stony Brook University, USA). Journal: Philosophy & Technology. Online ISSN 2210-5441, Print ISSN 2210-5433.
Algebraic/topological descriptions of living processes are indispensable to the understanding of both biological and cognitive functions. This paper presents a fundamental algebraic description of living/cognitive processes and exposes its inherent ambiguity. Since ambiguity is forbidden to computation, no computational description can lend insight to inherently ambiguous processes. The impredicativity of these models is not a flaw but is, rather, their strength. It enables us to reason with ambiguous mathematical representations of ambiguous natural processes. The noncomputability of these structures means computerized simulacra of them are uninformative of their key properties. This leads to the question of how we should reason about them. That question is answered in this paper by presenting an example of such reasoning: the demonstration of a topological strategy for understanding how the fundamental structure can form itself from within itself.
I explicate the crucial role played by efficient cause in Robert Rosen’s characterization of life, by elaborating on the topic of Aristotelian causality, and exploring the many alternate descriptions of causal and inferential entailments. In particular, I discuss the concepts of functional entailment and immanent causation, and examine how they fit into Robert Rosen’s relational-biology universe of living, anticipatory, and complex systems.
Robert Rosen’s (M,R)-systems are a class of relational models that define organisms. The realization of relational models plays a central role in his study of life itself. Biology becomes identified with the class of material realizations of a certain kind of relational organization, exhibited in (M,R)-systems. In this paper I describe several realizations of (M,R)-systems, and in particular alternate realizations of the replication component.
‘The problem with simulations is that they are doomed to succeed.’ So runs a common criticism of simulations—that they can be used to ‘prove’ anything and are thus of little or no scientific value. While this particular objection represents a minority view, especially among those who work with simulations in a scientific context, it raises a difficult question: what standards should we use to differentiate a simulation that fails from one that succeeds? In this paper we build on a structural analysis of simulation developed in previous work to provide an evaluative account of the variety of ways in which simulations do fail. We expand the structural analysis in terms of the relationship between a simulation and its real-world target, emphasizing the important role of aspects intended to correspond and also those specifically intended not to correspond to reality. The result is an outline both of the ways in which simulations can fail and the scientific importance of those various forms of failure.
An emerging philosophical perspective called “postphenomenology,” which offers reflection upon human relations to technology, has the potential to increase our understanding of the functions performed by imaging technologies in scientific practice. In what follows, I review some relevant insights and expand them for use in the concrete analysis of practices of image interpretation in science. As a guiding example, I explore how these insights bear upon a contemporary debate in space science over images of the fossilized remains of a river delta on the surface of Mars. These considerations include an analysis of the ways that the objects of study are transformed by the mediating imaging technologies, such as the Mars Orbiter Camera.
Contemporary scientific research and public policy are not in agreement over what should be done to address the dangers that result from the drop in driving performance that occurs as a driver talks on a cellular phone. One response to this threat to traffic safety has been the banning, in a number of countries and some states in the USA, of handheld cell phone use while driving. However, research shows that the use of hands-free phones (such as headsets and dashboard-mounted speakers) is also accompanied by a drop in performance, leading some to recommend regulation of both kinds of mobile phones. In what follows, I draw out the accounts of the driving impairment associated with phone use implicit in research and policy and develop an alternative account grounded in philosophical considerations. Building on work in a school of thought called postphenomenology, I review and expand concepts useful for articulating human bodily and perceptual relations to technology. By applying these ideas to the case of driving while talking on the phone, I offer an account of the drop in driving performance which focuses on the embodied relationships users develop with the car and the phone, and I consider implications for research and policy.
In a future world of ubiquitous computing, in which humans interact with computerized technologies even more frequently and in even more situations than today, interface design will have increased importance. One feature of interface that I argue will be especially relevant is what I call abstract relational strategies. This refers to an approach (in both a bodily and conceptual sense) toward the use of a technology, an approach that is general enough to be applied in many different concrete scenarios. Such an abstract manner of approach is relevant, for example, when an interface design for a device to which users are already accustomed is applied to an entirely different device (such as a device used for a completely different purpose). To articulate this idea, I explore the history of keyboards, and consider how the habits of interface with one kind (e.g., piano keyboards) have historically enabled some users to approach other technologies fitted with similar keyboard interface (e.g., typewriters, electronic instrumentation). I conclude by brainstorming ways that abstract relational strategies, applicable to a variety of different devices, will have increased importance in a future world in which computing is even more ubiquitous than today.
The philosophical tradition of phenomenology, with its focus on human bodily perception, can be used to explore the ways scientific instrumentation shapes a user’s experience. Building on Don Ihde’s account of technological embodiment, I develop a framework of concepts for articulating the experience of image interpretation in science. These concepts can be of practical value to the analysis of scientific debates over image interpretation for the ways they draw out the relationships between the image-making processes and the rival scientific explanations of image content. As a guiding example, I explore a contemporary debate over images of the surface of Mars which reveal a landmass that resembles river delta formations on Earth, and which thus has important implications for the history of Martian climate and water flow. The phenomenological framework I develop can be used to help evaluate the different interpretations on offer for these images, and to analyze the roles in this discussion played by spacecraft equipped with cameras and laser and thermal imaging devices.
Defenders of educational frog dissection tend to emphasize the claim that computer-simulated alternatives cannot replicate the same exact experience of slicing open a frog, with all its queasy and visceral impact. Without denying that point, I argue that this is not the only educational standard against which computer-simulated dissection should be evaluated. When real-world frog dissection is analyzed as a concrete technological practice rather than an assumed ideal, the particular educational advantages distinct to real-world dissection and virtual dissection can be enumerated and compared. Building on the work of John Dewey and Don Ihde, I explore the still-expanding advantages of computer-simulated dissection, and in this proper context of comparison it becomes clear that virtual alternatives are increasingly the more educationally beneficial option.
Insights from the phenomenological tradition of philosophy can be fruitfully applied to ongoing scientific investigations. In what follows, I review and refine a methodology I have developed for the application of concepts from the phenomenology of technology—concepts which articulate bodily and perceptual relations to technology—to a specific context of scientific practice: debate over the interpretation of laboratory images. As a guiding example, I introduce a case study of a contemporary debate over images of Mars which reveal evidence of fluid movement on the planet’s surface in the last decade. Next, the framework of phenomenological concepts is applied to this example, and contrasts are made with the results of previous case studies. I conclude with reflections on the implications of this perspective for both the use of imaging technologies in scientific research specifically, and for the phenomenology of technology generally.
Ideas developed within the philosophical tradition of phenomenology can be used to describe the experience of talking on the phone. In particular, I build on a contemporary brand of phenomenology called “postphenomenology,” a school of thought which specializes in the analysis of the relationships that form between users and technologies. Three central concepts are reviewed and developed: transparency, sedimentation, and what I call “field composition.” These concepts can be used for the description of the way that the content of a telephone conversation can come to stand forward and capture a user’s overall field of awareness. I suggest that this account of the experience of the telephone can be useful for analyzing issues in scientific research and public policy regarding the topic of using the phone while driving.
The experience of computer use can be productively articulated with concepts developed in the phenomenological tradition of philosophy. Building on the insights of classical phenomenologists, Ihde has advanced a sophisticated view of the ways humans relate to technology. I review and expand on his notions of “technological mediation,” “embodiment,” and “multistability,” and apply them to the experience of computer interface. In particular, I explore the experience of using a computer that fails to work properly. A revealing example is the experience of a user who suddenly and unexpectedly encounters a slowly-loading webpage while using the Internet. This phenomenological framework provides an account of the ways a suddenly failing technology changes our relationships to the device, to the world, and to ourselves, and it also suggests how this experience can be usefully reconceptualized.
This essay is an attempt to construct an artificial dialog loosely modeled after that sought by Robert Maynard Hutchins, who was a significant influence on many of us, including and especially Robert Rosen. The dialog is needed to counter the deep and devastating effects of Cartesian reductionism on today’s world. The success of such a dialog is made more probable thanks to the recent book by A. H. Louie, which provides a rigorous basis for a new paradigm, the one pioneered by the late Robert Rosen. If we are to make such a paradigm shift happen, it has to be in the spirit of the dialog. The relationship between science, economics, technology and politics has to be openly recognized and dealt with. The message that Rosen sent to us has to be told outside small select circles of devotees; the situation has even been described by some as resembling a cult, and that is no way for universal truths like these to be treated. The essay examines why this present situation has happened and identifies the systemic nature of the problem in terms of Rosen’s concepts about systems. The dialog involves works by George Lakoff, W. Brian Arthur, N. Katherine Hayles, Robert Reich and Dorion Sagan. These scholars each have their own approach to identifying the nature of the interacting systems that involve human activity and the importance of identifying levels of abstraction in analyzing systems. Pooling their insights into different facets of a complex system is very useful in constructing a model of the self-referential system that humans and their technology have shaped. The role of the human component in the whole-earth system is the goal of the analysis. The impact of the Cartesian reductionist paradigm on science and the related aspects of human activity is examined to establish an explanation for the isolation of Rosen’s paradigm. A possible way to proceed is examined in the conclusion.
Kinetic models using enzyme kinetics are developed for the three ways in which Louie proved that Rosen’s minimal (M,R)-system can be closed to efficient cause, i.e., how the “replication” component can itself be entailed from within the system. The kinetic models are developed using the techniques of network thermodynamics. As a demonstration, each model is simulated in a SPICE circuit simulator with arbitrarily chosen rate constants. The models are built from SPICE sub-circuits representing the key terms in the chemical rate equations. The models include the addition of an ad hoc semi-permeable membrane, both so that the system can achieve steady-state fluxes and to illustrate the need for all the efficient-cause agents to be continually replaced. Comments are made about exactly what is being simulated.
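The models this abstract describes are SPICE circuit realizations built by network thermodynamics. As a much simpler stand-in, to show what simulating a chemical rate equation with arbitrarily chosen rate constants amounts to, here is a hypothetical mass-action step A → B integrated by forward Euler; the rate constant, time step, and function name are illustrative assumptions, not values or code from the paper:

```python
# Loose analogue only: a single mass-action reaction A -> B,
# dA/dt = -k*A, dB/dt = +k*A, integrated by forward Euler.
# Rate constant k and step size dt are arbitrarily chosen, as in the paper's spirit.
def simulate(k=0.5, a0=1.0, dt=0.01, t_end=10.0):
    """Return concentrations (A, B) at t_end, starting from A=a0, B=0."""
    a, b, t = a0, 0.0, 0.0
    while t < t_end:
        flux = k * a * dt   # amount converted during this step
        a -= flux
        b += flux
        t += dt
    return a, b

a, b = simulate()
# A decays toward 0, B rises toward a0; total mass a + b stays at a0.
```

Note that this closed fragment merely relaxes to equilibrium; sustained steady-state fluxes of the kind the paper’s membrane-augmented models exhibit would additionally require source and sink terms for material exchange across the boundary.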
Drawing on Aristotle’s notion of “ultimate responsibility,” Robert Kane argues that to be exercising a free will an agent must have taken some character-forming decisions for which there were no sufficient conditions or decisive reasons. That is, an agent whose will is free not only had the ability to develop other dispositions, but could have exercised that ability without being irrational. To say it again, a person has a free will just in case her character is the product of decisions that she could have rationally avoided making. That one’s character is the product of such decisions entails ultimate responsibility for its manifestations, engendering a free will.
In this paper I argue that Robert Kane’s defense of event-causal libertarianism, as presented in Responsibility, Luck, and Chance: Reflections on Free Will and Indeterminism, fails because his event-causal reconstruction is incoherent. I focus on the notions of efforts and self-forming actions essential to his defense.
This article discusses the theories of perception of Robert Kilwardby and Peter of John Olivi. Our aim is to show how, in challenging certain assumptions of medieval Aristotelian theories of perception, they drew on Augustine and argued for the active nature of the soul in sense perception. For both Kilwardby and Olivi, the soul is not passive with respect to perceived objects; rather, it causes its own cognitive acts with respect to external objects and thus allows the subject to perceive them. We also show that Kilwardby and Olivi differ substantially regarding where the activity of the soul is directed and the role of the sensible species in the process, and we demonstrate that there are similarities between their ideas of intentionality and the attention of the soul towards the corporeal world.
For the last several decades, philosophers have wrestled with the proper place of religion in liberal societies. Usually, the debates among these philosophers have started with the articulation of various conceptions of liberalism and then proceeded to locate religion in the context of these conceptions. In the process, however, too little attention has been paid to the way religion is conceived. Drawing on the work of Robert Audi and Nicholas Wolterstorff, two scholars who are often read as holding opposing views on these issues, I argue that, for the purposes of their argument about liberalism, both have implicitly accepted a concept of religion that has come under severe attack in recent work on the subject. Namely, they have accepted a concept of religion that identifies religion primarily with belief, ritual practice, and ecclesial institutions. Following recent scholarship, I suggest that religion is better conceived as a kind of culture. To conclude the essay, I gesture toward what the beginnings of a re-visioned debate about religion and liberal society might look like if one started from this revised conception of religion.
This review presents the principal themes of Robert Spaemann's Persons: The Difference between ‘Someone’ and ‘Something.’ To be a person is not to be identical with one's teleological nature, but rather, to have that nature. Personal consciousness is necessarily temporal consciousness. Persons have a range of distinctively personal acts, such as recognizing and respecting one another, understanding their lives as wholes, making judgments of conscience, promising, and forgiving. All members of the human species, whatever their stage of development or limitations, are persons. The present review also briefly considers certain objections that have been raised against Spaemann's position.
A synthesis of the two primary theory structures in Robert Rosen’s relational complexity, (1) relational entailment mapping based on category theory as described by Rosen and Louie, and (2) relational holism based on modeling relations, as described by Kineman, provides an integral foundation for relational complexity theory as a natural science and analytical method. Previous incompatibilities between these theory structures are resolved by re-interpreting Aristotle’s four causes, identifying final and formal causes as relations with context. Category theory is applied to introduce the contextual entailment algebra needed to complete the synthesis. The modeling relation is represented as a recursive four-cause hierarchy, which is a unit of both whole and part analysis (a ‘holon’) that relates realized and contextual domains of nature as complementary inverse entailments between structure and function. Context is a non-localized domain of distributed potentials (models) for existence, as contrasted with the realized domain of localized interactive and measurable events. Synthesis is achieved by giving modeling relations an algebraic form in category theory and by expanding relational analysis to include contextual entailments. The revised form of analysis is applied and demonstrated to examine Rosen’s M-R diagram, showing that structure–function relations imply adaptive interaction with the environment, and that contextual relations imply three forms of the M-R entailment corresponding with the generally known three forms of life, Archaea, Bacteria, and Eukaryota, which can be represented by their holon diagrams. The result of this synthesis is a consistent foundation for relational science that should have important implications in many disciplines.
We review and discuss A. H. Louie’s book “More than Life Itself: A Reflexion on Formal Systems and Biology” from an interdisciplinary viewpoint, involving both biology and mathematics, taking into account new developments and related theories.
This article responds to the question of the ‘implicit and presupposed theological turn of phenomenology’ by providing a close reading of Jacques Derrida’s Le Toucher—Jean-Luc Nancy (2000 French/2005 English translation), particularly concerning what Derrida alludes to as ‘the Christian thinking of the flesh’ in the French phenomenological tradition post-Husserl. In reading Derrida’s own text, the article identifies and then performs a ‘cryptonomy’ of references to the ‘Christian body,’ and of the ‘return of religion.’ The article also focuses on the more recent writings of Jean-Luc Nancy, especially Corpus (2000 French), concerning the body and its relationship to the concept of corporality (Leiblichkeit) from Husserl’s Ideas II.
Jean-Luc Nancy is a contemporary continental philosopher who argues that the hope of fully unifying a community through work is problematic. This is because people cannot be reduced to their function as workers. Thus, community is, at best, inoperative. This article takes Nancy’s ideas of community and applies them to the notion of teamwork in business. It shows how in some literature on business teamwork, there is a desire to build a team through shared work experiences. It then explains Nancy’s view as to why this cannot work, and it enters into Nancy’s positive account of how a community should be seen as a web of people communicating and sharing with each other in a variety of ways. The practical conclusion the study draws is that team members need to be careful about allowing goal orientation to obfuscate the richness of the relationships that occur among team members. People need to explore all of the ways in which people share with each other rather than just those ways that advance a narrow set of goals. If the richness of those relationships is recognized, many new directions for business and for general human development may appear.
An individual is in the lowest phase of moral development if he thinks only of his own personal interest and has only his own selfish agenda in mind as he encounters other humans. This lowest phase corresponds well with sixteenth-century British moral egoism, which reflects the rise of the new economic order. Adam Smith (1723–1790) wanted to defend this new economic order, which is based on economic exchange between egoistic individuals. Nevertheless, he surely did not want to support the moral theory of British egoism. His book The Wealth of Nations fits well with the worldview of British moral egoism, but in The Theory of Moral Sentiments he presents a moral theory that is the total opposite of moral egoism. Contemporary German intellectuals saw a contradiction in Adam Smith’s moral (social) philosophy, which they called Das Adam-Smith-Problem. Smith himself did not think that there is any contradiction in a situation where in the economic sphere (civil society) the individual acts egoistically while in the ethical sphere (the encounter with the imagined Other) he feels humanity and compassion toward his fellow men. Hegel was a passionate reader of Adam Smith, and he acknowledged Das Adam-Smith-Problem. He set his social philosophy the task of overcoming this paradox: he wanted to create a theory of a social totality in which economic egoism and feelings of humanity are not in contradiction. At the same time, Hegel wanted to create a theory of the Bildung process, in which the human spirit develops from moral un-freedom (heteronomy) to moral freedom and maturity (autonomy), attending to both love and reason. In certain of Hegel’s texts the notion of recognition plays a crucial role. That is why the modern Hegelians Ludwig Siep, Axel Honneth and Robert Williams consider the notion of recognition to be elementary in Hegel’s threefold theory of the development of the human spirit from family via civil society to the sittliche state.
For Hegel the family is a sphere where people love their “concrete other” and where feeling surpasses reason. Civil society is a sphere of private contracts and economic exchanges where cold, egoistic, calculative reason surpasses feelings. In the sphere of the State the contradiction between family and civil society (Das Adam-Smith-Problem) is resolved by “rational feeling.” According to Hegel, the State should protect citizens from the alienating effects of the egoistic reason of civil society and cultivate “family feelings” into rational feelings that integrate the citizen into the “sittliche community” through a reciprocal process of recognition. In this article I consider the relevance of the Hegelians Honneth and Williams to the theory of moral development.
The aim of this paper is to describe and analyze the epistemological justification of a proposal initially made by the bio-mathematician Robert Rosen in 1958. In this theoretical proposal, Rosen suggests using the mathematical concept of “category” and the correlative concept of “natural equivalence” in mathematical modeling applied to living beings. Our questions are the following: according to Rosen, to what extent does the mathematical notion of category give access to more “natural” formalisms in the modeling of living beings? Is the so-called “naturalness” of some kinds of equivalences (which the mathematical notion of category makes it possible to generalize and to put at the forefront) analogous to the naturalness of living systems? Rosen appears to answer “yes” and to ground this transfer of the concept of “natural equivalence” to biology on such an analogy. But this hypothesis, although fertile, remains debatable. Finally, this paper gives a brief account of the later evolution of Rosen’s arguments about this topic. In particular, it sheds light on the new role played by the notion of “category” in his more recent objections against the computational models that since the 1990s have been pervading almost every domain of biology.
This paper is an extended discussion of Robert Ulanowicz’s critique of the mechanistic and reductionistic metaphysics of science. He proposes “process ecology” as an alternative. In this paper I discuss four sets of questions arising from Ulanowicz’s proposal. First, I argue that universality remains one of the hallmarks of the scientific enterprise even with his new process metaphysics. I then discuss the Second Law of Thermodynamics in the interpretation of the history of the universe. I question Ulanowicz’s use of the terms “random” and “chance” in his definition of process. Finally, I discuss what difference a relational and process metaphysics might make in addressing the political and practical problems of the twenty-first century.
In this article, using the recent work by Charles Taylor in A Secular Age as my point of departure, I will argue that Jean-Luc Nancy enables us to think past the competing binary of atheistic and religious experience and allows us to surpass the present narratives of secularism. In A Secular Age, Taylor himself seeks a middle ground between atheism and religion, arguing that it is possible to open ourselves to the cross-pressures of modern existence that find us caught between scientific atheism and a need for spiritual and religious guidance. Here, Taylor finds a way of picturing ourselves within a secular age, remaining faithful to scientific rationalism, but still open to religion and a sense of a higher good. However, as I shall demonstrate, in his thesis Taylor misrepresents the Continental philosophical tradition (particularly Nietzsche and post-structuralism) that has itself sought to understand these cross-pressures of existence. Taking this misrepresentation, and specifically his reductive and colloquial analysis of Nietzsche, Camus, and Derrida, as my point of departure, I provide an alternative manner of thinking through the work of these writers, one that leads to a detailed analysis of Jean-Luc Nancy and his project of the deconstruction of Christianity. In this analysis I argue that Nancy provides a manner of thinking that remains open and allows an experience of freedom, without seeking to close that sense of openness with explanation, nor maintaining that sense of openness with a conception of the divine.
We commonly identify something seriously defective in a human life that is lived in ignorance of important but unpalatable truths. At the same time, some degree of misapprehension of reality may be necessary for individual health and success. Morally speaking, it is unclear just how insistent we should be about seeking the truth. Robert Sparrow has considered such issues in discussing the manufacture and marketing of robot ‘pets’, such as Sony’s doglike ‘AIBO’ toy and whatever more advanced devices may supersede it. Though it is not his only concern, Sparrow particularly criticizes such robot pets for their illusory appearance of being living things. He fears that some individuals will subconsciously buy into the illusion, and come to sentimentalize interactions that fail to constitute genuine relationships. In replying to Sparrow, I emphasize that this would be continuous with much of the minor sentimentality that we already indulge in from day to day. Although a disposition to seek the truth is morally virtuous, the virtue concerned must allow for at least some categories of exceptions. Despite Sparrow’s concerns about robot pets (and robotics more generally), we should be lenient about familiar, relatively benign, kinds of self-indulgence in forming beliefs about reality. Sentimentality about robot pets seems to fall within these categories. Such limited self-indulgence can co-exist with ordinary honesty and commitment to truth.
Robert Adams’s Finite and Infinite Goods is one of the most important and innovative contributions to theistic ethics in recent memory. This article identifies two major flaws at the heart of Adams’s theory: his notion of intrinsic value and his claim that ‘excellence’ or finite goodness is constituted by resemblance to God. I first elucidate Adams’s complex, frequently misunderstood claims concerning intrinsic value and Godlikeness. I then contend that Adams’s notion of intrinsic value cannot explain what it could mean for countless finite goods to be intrinsically valuable. Next, I articulate a criticism of his Godlikeness thesis altogether unlike those he has previously addressed: I show that, on Adams’s own account of Godlikeness, a diverse myriad of excellences could not possibly count as resembling God. His theory thus fails to account for a whole world of finite goods. I defend my two criticisms against objections and briefly sketch a more Aristotelian and Christian way forward.
Abstract Two distinguishing marks of voluntaristic conceptions of human action can be found as early as the 12th century, and not only in the work of Bonaventura's successors: 1. the will is free to act against reason's dictates; 2. moral responsibility depends on this conception of the will's freedom. A number of theologians from the 1130s to the 1170s accepted these claims, which were originally formulated by Bernard of Clairvaux. Robert of Melun elaborated them systematically and coined the terminological distinctions that were controversially discussed in the following centuries. The paper edits and interprets some of his texts on voluntary action. Furthermore, it shows that Bernard's and Robert's ideas were transmitted by their intellectualist critics in the 13th century.
On the publication of Robert Lowell’s Life Studies in 1959, some critics were shocked by the poet’s use of seemingly frank autobiographical material, in particular the portrayal of his hospitalizations for bipolar disorder. During the late fifties and throughout the sixties, a rich vein, influenced by Lowell, developed in American poetry. Also during this time, the nascent science of psychopharmacology competed with and complemented the more established somatic treatments, such as psychosurgery, shock treatments, and psychoanalytical therapies. The development of Thorazine was a remarkable breakthrough allowing patients previously thought incurable to leave hospital. In 1955, the release of Miltown, the first ‘minor’ tranquilizer, was heralded with a media fanfare promising a new dawn of psychological cure-all. These two events blurred the boundary between ‘normality’ and madness by making treatment in the community more widely possible and by medicalizing more commonplace distress. Lowell’s early depictions of madness situate it as emblematic of the cultural malaise of ‘the tranquilized fifties.’ By his final collection, Day by Day (1977), mental illness had lost its symbolic power. These late poems explore the power of art as a way of representing and remedying suffering in a culture where psychopharmacology has normalized madness.
This essay proposes to read Jean-Luc Nancy’s references to creation ex nihilo as both an intervention in the French debate concerning eventness, and as a transformative rethinking of the status of phenomenality. Nancy’s position is roughly triangulated relative to key remarks from other thinkers and, above all, its distinctive components (temporality, negativity, spatiality) are elucidated through historical glosses. Articulating the overall architecture of this theory serves to illustrate the Heideggerian access to the event debate. It also deepens aspects only elliptically alluded to in Nancy’s own writing.
In 1974, Robert Nozick published *Anarchy, State, and Utopia*, a work that, for the first time, gave theoretical status to one of the currents of neoliberal thought: libertarianism. To a large extent, Nozick’s text presents itself as a rereading, in the key of analytic philosophy, of John Locke’s political theory. This article offers some arguments to show that, although Nozick’s perspective displays certain rhetorical similarities with the work of the English philosopher, on each of the fundamental points (such as the idea of right, the notion of the person, the role of politics, and the concepts of justice and the public good) Nozick clearly departs from Lockean premises. In conclusion, it is argued that, in distancing himself from the Lockean conception, Nozick defends a society in which politics is absent and in which the State appears, paradoxically, less limited than in classical liberal conceptions.
In one of the essays in his recent book on Christianity, La déclosion (2005), Nancy discusses the relationship between Judaism and Christianity. Nancy opens this discussion with a reference to Lyotard’s book on this relationship: Un trait d’union (1993). Both Lyotard and Nancy examine a very early figure in the emergence of Christianity from Judaism—whereas Lyotard focuses on the epistles of Paul, Nancy reads the epistle of James. Lyotard concludes that the hyphen in the expression ‘Judeo-Christian’ actually conceals ‘the most impenetrable abyss within Western thought’. With this abyss, Lyotard refers to the point of departure of Judaism: the event in which a Voice has left behind letters, inaugurating an interminable work of interpretation. For Nancy, however, it is rather Christianity, and therefore, Western culture, which is deconstructive in nature. Its composition is co-original with a decomposition, and therefore, with an openness. In James, Nancy finds an emphasis on praxis, in such a way that existence is to be understood as transcendent within itself. With this reading of James, Nancy seems to deny that there is a fundamental difference between Judaism and Christianity. In order to clarify the differences between Lyotard and Nancy, it is shown that, in Lyotard’s view, an unsublatable alterity comes with aisthèsis, whereas in Nancy’s view, alterity comes with existence as such.
Current sociology of knowledge tends to take for granted Robert K. Merton’s theory of cumulative advantage: successful ideas bring recognition to their authors, and successful authors have their ideas recognized more easily than unknown ones. This article argues that this theory should be revised via the introduction of the differential between the status of an idea and that of its creator: when an idea is more important than its creator, the latter becomes identified with the former, and this will hinder recognition of the intellectual’s new ideas insofar as they differ from the old ones in content or style. Robert N. Bellah’s performance during the “civil religion debate” of the 1970s is reconstructed as an example of how this mechanism may work. Implications for further research are considered in the concluding section.
This article presents the central arguments of Jürgen Habermas’s deliberative politics (1), along with the critical perspectives of Axel Honneth (2) and Nancy Fraser (3), with a view to giving Habermasian politics a more realistic dimension, a political content more concretely bound to the emancipatory orientation of praxis, and a greater capacity to deal with difference, diversity, and conflict.