New perspectives on Pierre Duhem’s The Aim and Structure of Physical Theory. Journal article, Metascience, Volume 20, Number 1. DOI: 10.1007/s11016-010-9467-3. Online ISSN 1467-9981; Print ISSN 0815-0796. Authors: Anastasios Brenner, Department of Philosophy, Paul Valéry University-Montpellier III, Route De Mende, 34199 Montpellier cedex 5, France; Paul Needham, Department of Philosophy, University of Stockholm, 10691 Stockholm, Sweden; David J. Stump, Department of Philosophy, University of San Francisco, 2130 Fulton Street, San Francisco, CA 94117, USA; Robert Deltete, Department of Philosophy, Seattle University, 901 12th Avenue, Seattle, WA 98122-1090, USA.
Drawing on Aristotle’s notion of “ultimate responsibility,” Robert Kane argues that to be exercising a free will an agent must have taken some character-forming decisions for which there were no sufficient conditions or decisive reasons. That is, an agent whose will is free not only had the ability to develop other dispositions, but could have exercised that ability without being irrational. To say it again, a person has a free will just in case her character is the product of decisions that she could have rationally avoided making. That one’s character is the product of such decisions entails ultimate responsibility for its manifestations, engendering a free will.
This article discusses the theories of perception of Robert Kilwardby and Peter of John Olivi. Our aim is to show how, in challenging certain assumptions of medieval Aristotelian theories of perception, they drew on Augustine and argued for the active nature of the soul in sense perception. For both Kilwardby and Olivi, the soul is not passive with respect to perceived objects; rather, it causes its own cognitive acts with respect to external objects and thus allows the subject to perceive them. We also show that Kilwardby and Olivi differ substantially regarding where the activity of the soul is directed and the role of the sensible species in the process, and we demonstrate that there are similarities between their ideas of intentionality and the attention of the soul towards the corporeal world.
The aim of this paper is to describe and analyze the epistemological justification of a proposal initially made by the biomathematician Robert Rosen in 1958. In this theoretical proposal, Rosen suggests using the mathematical concept of “category” and the correlative concept of “natural equivalence” in mathematical modeling applied to living beings. Our questions are the following: According to Rosen, to what extent does the mathematical notion of category give access to more “natural” formalisms in the modeling of living beings? Is the so-called “naturalness” of some kinds of equivalences (which the mathematical notion of category makes it possible to generalize and to put at the forefront) analogous to the naturalness of living systems? Rosen appears to answer “yes” and to ground this transfer of the concept of “natural equivalence” in biology on such an analogy. But this hypothesis, although fertile, remains debatable. Finally, this paper gives a brief account of the later evolution of Rosen’s arguments on this topic. In particular, it sheds light on the new role played by the notion of “category” in his more recent objections to the computational models that have pervaded almost every domain of biology since the 1990s.
Quentin Skinner’s appropriation of speech act theory for intellectual history has been extremely influential. Even as the model continues to be important for historians, however, philosophers now regard the original speech act theory paradigm as dated. Are there more recent initiatives that might reignite theoretical work in this area? This article argues that the inferentialism of Robert Brandom is one of the most interesting contemporary philosophical projects with historical implications. It shows how Brandom’s work emerged out of the broad shift in the philosophy of language from semantics to pragmatics that also informed speech act theory. The article then goes on to unpack the rich implications of Brandom’s inferentialism for the theory and practice of intellectual history. It contends that inferentialism clarifies, legitimizes, and informs intellectual historical practice, and it concludes with a consideration of the challenges faced by inferentialist intellectual history, together with an argument for the broader implications of Brandom’s work.
In his new book, "The Romantic Conception of Life: Science and Philosophy in the Age of Goethe," Robert J. Richards argues that Charles Darwin's true evolutionary roots lie in the German Romantic biology that flourished around the beginning of the nineteenth century. It is argued that Richards is quite wrong in this claim and that Darwin's roots are in the British society within which he was born, educated, and lived.
Traditional eschatology clashes with the theory of entropy. Trying to bridge the gap, Robert John Russell assumes that theology and science are based on contradictory, yet equally valid, metaphysical assumptions, each one capable of questioning and impacting the other. The author doubts that Russell's proposal will convince empirically oriented scientists and attempts to provide a viable alternative. Historical‐critical analysis suggests that biblical future expectations were redemptive responses to changing human needs. Apocalyptic visions were occasioned by heavy suffering in postexilic times. Interpreted in realistic terms, they have since proved to be untenable. The expectation of a new creation without evil, suffering, and death is not constitutive for the substantive content of the biblical message as such. Biblical future expectations must be reconceptualized in terms of best contemporary insight and in line with a dynamic reading of the biblical witness as God's vision of comprehensive optimal well‐being that operates like a shifting horizon and opens up ever new vistas, challenges, and opportunities.
We commonly identify something seriously defective in a human life that is lived in ignorance of important but unpalatable truths. At the same time, some degree of misapprehension of reality may be necessary for individual health and success. Morally speaking, it is unclear just how insistent we should be about seeking the truth. Robert Sparrow has considered such issues in discussing the manufacture and marketing of robot ‘pets’, such as Sony’s doglike ‘AIBO’ toy and whatever more advanced devices may supersede it. Though it is not his only concern, Sparrow particularly criticizes such robot pets for their illusory appearance of being living things. He fears that some individuals will subconsciously buy into the illusion, and come to sentimentalize interactions that fail to constitute genuine relationships. In replying to Sparrow, I emphasize that this would be continuous with much of the minor sentimentality that we already indulge in from day to day. Although a disposition to seek the truth is morally virtuous, the virtue concerned must allow for at least some categories of exceptions. Despite Sparrow’s concerns about robot pets (and robotics more generally), we should be lenient about familiar, relatively benign, kinds of self-indulgence in forming beliefs about reality. Sentimentality about robot pets seems to fall within these categories. Such limited self-indulgence can co-exist with ordinary honesty and commitment to truth.
In this paper I argue that Robert Kane’s defense of event-causal libertarianism, as presented in Responsibility, Luck, and Chance: Reflections on Free Will and Indeterminism, fails because his event-causal reconstruction is incoherent. I focus on the notions of efforts and self-forming actions essential to his defense.
Robert Adams’s Finite and Infinite Goods is one of the most important and innovative contributions to theistic ethics in recent memory. This article identifies two major flaws at the heart of Adams’s theory: his notion of intrinsic value and his claim that ‘excellence’ or finite goodness is constituted by resemblance to God. I first elucidate Adams’s complex, frequently misunderstood claims concerning intrinsic value and Godlikeness. I then contend that Adams’s notion of intrinsic value cannot explain what it could mean for countless finite goods to be intrinsically valuable. Next, I articulate a criticism of his Godlikeness thesis altogether unlike those he has previously addressed: I show that, on Adams’s own account of Godlikeness, a diverse myriad of excellences could not possibly count as resembling God. His theory thus fails to account for a whole world of finite goods. I defend my two criticisms against objections and briefly sketch a more Aristotelian and Christian way forward.
This paper is an extended discussion of Robert Ulanowicz’s critique of mechanistic and reductionistic metaphysics of science. He proposes “process ecology” as an alternative. In this paper I discuss four sets of questions coming out of Ulanowicz’s proposal. First, I argue that universality remains one of the hallmarks of the scientific enterprise even with his new process metaphysics. I then discuss the Second Law of Thermodynamics in the interpretation of the history of the universe. I question Ulanowicz’s use of the terms “random” and “chance” in his definition of process. Finally, I discuss what difference a relational and process metaphysics might make in addressing the political and practical problems of the twenty-first century.
On the publication of Robert Lowell’s Life Studies in 1959, some critics were shocked by the poet’s use of seemingly frank autobiographical material, in particular the portrayal of his hospitalizations for bipolar disorder. During the late fifties and throughout the sixties, a rich vein, influenced by Lowell, developed in American poetry. Also during this time, the nascent science of psychopharmacology competed with and complemented the more established somatic treatments, such as psychosurgery, shock treatments, and psychoanalytical therapies. The development of Thorazine was a remarkable breakthrough, allowing patients previously thought incurable to leave hospital. In 1955, the release of Miltown, the first ‘minor’ tranquilizer, was heralded with a media fanfare promising a new dawn of psychological cure-all. These two events blurred the boundary between ‘normality’ and madness by making treatment in the community more widely possible and by medicalizing more commonplace distress. Lowell’s early depictions of madness situate it as emblematic of the cultural malaise of ‘the tranquilized fifties.’ By his final collection, Day by Day (1977), mental illness had lost its symbolic power. These late poems explore the power of art as a way of representing and remedying suffering in a culture where psychopharmacology has normalized madness.
This article examines the nature of Robert Grosseteste's commentary on Aristotle's Posterior Analytics, with particular reference to his “conclusions”. It is argued that the simple demonstrative appearance of the commentary, which is very much the result of the 64 conclusions, is in part an illusion. Thus, the exposition in the commentary is not simply based on the strict principles of the Posterior Analytics and on the proof-procedures of Euclidean geometry; rather, the commentary is a complicated mixture of different elements of twelfth-century texts and the scholarship of Grosseteste's day.
There is widespread agreement among historians that the writings of Robert Boyle (1627-1691) constitute a valuable archive for understanding the concerns of seventeenth-century British natural philosophers. His writings have often been seen as representing, in one fashion or another, all of the leading intellectual currents of his day. There is somewhat less consensus, however, on the proper historiographic method for interpreting these writings, as well as on the specific details of the beliefs expressed in them. Studies seeking to explicate Boyle's thought have been, roughly speaking, of two general sorts. On the one hand there are those studies of a broadly "intellectualist" orientation which situate his natural philosophy within the intellectual context provided by metaphysics, religion, and early modern science. In this connection his corpuscularianism has been shown to be motivated by specific epistemological, theological, as well as empirical concerns. One of the central aims of such studies has been to show that apparently discordant elements in his scientific thought are rendered coherent by referring them to such "non-scientific" commitments. Among studies of this sort might be mentioned the works of John Hedley Brooke, E. A. Burtt, Gary B. Deason, J. E. McGuire, R. Hooykaas, Robert H. Kargon, Eugene M. Klaaren, P. M. Rattansi, and Richard S. Westfall.
I summarise Robert Audi's 'Moral Perception.' I concede that there is such a thing as moral perception. However, moral perceptions are culturally relative, which refutes Audi’s claims that moral perception may ground moral knowledge and that it provides inter-subjectively accessible grounds which make ethical objectivity possible. Audi's attempt to avoid the refutation tends to convert rational disputes into ad hominem ones. I illustrate this with the example of the ethics of prostitution.
For the last several decades, philosophers have wrestled with the proper place of religion in liberal societies. Usually, the debates among these philosophers have started with the articulation of various conceptions of liberalism and then proceeded to locate religion in the context of these conceptions. In the process, however, too little attention has been paid to the way religion is conceived. Drawing on the work of Robert Audi and Nicholas Wolterstorff, two scholars who are often read as holding opposing views on these issues, I argue that, for the purposes of their argument about liberalism, both have implicitly accepted a concept of religion that has come under severe attack in recent work on the subject. Namely, they have accepted a concept of religion that identifies religion primarily with belief, ritual practice, and ecclesial institutions. Following recent scholarship, I suggest that religion is better conceived as a kind of culture. To conclude the essay, I gesture toward what the beginnings of a re-visioned debate about religion and liberal society might look like if one started from this revised conception of religion.
Current sociology of knowledge tends to take for granted Robert K. Merton’s theory of cumulative advantage: successful ideas bring recognition to their authors, successful authors have their ideas recognized more easily than unknown ones. This article argues that this theory should be revised via the introduction of the differential between the status of an idea and that of its creator: when an idea is more important than its creator, the latter becomes identified with the former, and this will hinder recognition of the intellectual’s new ideas as they differ from old ones in their content or style. Robert N. Bellah’s performance during the “civil religion debate” of the 1970s is reconstructed as an example of how this mechanism may work. Implications for further research are considered in the concluding section.
Abstract Two distinguishing marks of voluntaristic conceptions of human action can be found already in the 12th century, and not only in the work of Bonaventura's successors: 1. the will is free to act against reason's dictates; 2. moral responsibility depends on this conception of the will's freedom. A number of theologians from the 1130s to the 1170s accepted these claims, which were originally formulated by Bernard of Clairvaux. Robert of Melun elaborated them systematically and coined the terminological distinctions that were controversially discussed in the following centuries. The paper edits and interprets some of his texts on voluntary action. Furthermore, it shows that Bernard's and Robert's ideas were transmitted by their intellectualist critics in the 13th century.
This paper asks questions about 'trauma' and its cultural representation, specifically trauma's representation in the cinema. In this respect, it compares and contrasts the work of Robert Bresson, in particular his 1967 masterpiece, Mouchette, with contemporary Hollywood film. James Mangold's 1999 'Oscar-winning' Girl, Interrupted offers an interesting example for cultural comparison. In both Mouchette and Girl, Interrupted the subject matter includes, amongst other traumatic experiences, rape, childhood abuse and suicide. The paper ponders the question of whether such aspects of trauma can ever be authentically represented on film; or, whether, on the contrary, through the deployment of cultural stereotypes, cinematic representation tends rather to reproduce the very forms of structural power which are, in the first place, trauma's primary cause. Bresson emerges from this analysis in a favourable light for, whereas Mangold stereotypes victims of trauma and represents traumatic experience itself as inevitable and over-determined, Bresson always retains for the victim a sense of critical agency. By contrasting key scenes from both films, the paper suggests that contemporary popular cinema (the 'Hollywood-ized' form), working in tandem with institutions of social control, such as psychiatry, does not subvert but, in fact, reproduces patterns of structural power. This argument has particular significance for the cultural representation of women. The paper is theoretically framed by Simone Weil's reflections on both 'representation' and 'structural power'.
An individual is in the lowest phase of moral development if he thinks only of his own personal interest and pursues only his own selfish agenda as he encounters other humans. This lowest phase corresponds well with sixteenth-century British moral egoism, which reflected the rise of the new economic order. Adam Smith (1723–1790) wanted to defend this new economic order, which is based on economic exchange between egoistic individuals. Nevertheless, he surely did not want to support the moral theory of British egoism. His book The Wealth of Nations fits well into the worldview of British moral egoism, but in The Theory of Moral Sentiments he presents a moral theory that is the total opposite of moral egoism. Contemporary German intellectuals saw a contradiction in Adam Smith’s moral (social) philosophy, which they called Das Adam-Smith-Problem. Smith himself did not think that there is any contradiction in a situation where, in the economic sphere (civil society), the individual acts egoistically while, in the ethical sphere (the encounter with the imagined Other), he feels humanity and compassion toward his fellow men. Hegel was a passionate reader of Adam Smith and acknowledged Das Adam-Smith-Problem. He set his social philosophy the task of overcoming this paradox. He wanted to create a theory of a social totality in which economic egoism and feelings of humanity are not in contradiction. At the same time, Hegel wanted to create a theory of the Bildung process, in which the human spirit develops from moral unfreedom (heteronomy) to moral freedom and maturity (autonomy), taking care of both aspects, love and reason. In certain of Hegel’s texts the notion of recognition plays a crucial role. That is why modern Hegelians Ludwig Siep, Axel Honneth and Robert Williams consider the notion of recognition to be elementary in Hegel’s threefold theory of the development of the human spirit from family via civil society to the sittliche state.
For Hegel, the family is a sphere where people love their “concrete other” and where feeling surpasses reason. Civil Society is a sphere of private contracts and economic exchanges, where cold, egoistic, calculative reason surpasses feeling. In the sphere of the State, the contradiction between family and Civil Society (Das Adam-Smith-Problem) is resolved by “rational feeling”. According to Hegel, the State should protect citizens from the alienating effect of the egoistic reason of Civil Society and cultivate “family feelings” into rational feelings that integrate the citizen into the “sittliche community” through a reciprocal process of recognition. In this article I want to consider the relevance of the Hegelians Honneth and Williams to the theory of moral development.
This review presents the principal themes of Robert Spaemann's Persons: The Difference between ‘Someone’ and ‘Something.’ To be a person is not to be identical with one's teleological nature, but rather, to have that nature. Personal consciousness is necessarily temporal consciousness. Persons have a range of distinctively personal acts, such as recognizing and respecting one another, understanding their lives as wholes, making judgments of conscience, promising, and forgiving. All members of the human species, whatever their stage of development or limitations, are persons. The present review also briefly considers certain objections that have been raised against Spaemann's position.
In 1974, Robert Nozick published Anarchy, State, and Utopia, a work that, for the first time, gave theoretical status to one of the currents of neoliberal thought: libertarianism. To a large extent, Nozick's text presents itself as a rereading of John Locke's political theory in the key of analytic philosophy. This article offers some arguments to show that, although Nozick's perspective bears certain rhetorical similarities to the work of the English philosopher, on each of the fundamental points (for example, the idea of right, the notion of the person, the role of politics, and the concepts of justice and the public good) Nozick clearly departs from Lockean premises. In conclusion, it is argued that, in moving away from the Lockean conception, Nozick defends a society in which politics is absent and in which the state appears, paradoxically, less limited than in classical liberal conceptions.
Although Darwinian concepts were largely banned from the social sciences of the last century, they have recently seen a revival in several disciplines, such as sociology, anthropology, and economics. Most of the current proponents of evolutionary theorizing in the social sciences avoid references to the older literature on social evolution. Against that background, this article presents a contribution to Darwinist thinking in early American sociology that has mainly been overlooked in the literature. As the leading figure of the Human Ecology Approach, which was established during the 1920s and 1930s, Robert Ezra Park drew heavily on evolutionary concepts to explain human evolution. A systematic presentation of these concepts in the light of the modern discussion on sociocultural evolution is given, followed by a conclusion about what can be learned from Park today.
This text presents two proposals for updating the Marxian critique of political economy: those of Moishe Postone and Robert Kurz. Their approaches, developed from the 1980s onward, offer keys to overcoming the shortcomings of traditional Marxism and open fruitful perspectives for renewing critical theory. Starting from a shared reinterpretation of Marx's categories, the two authors nevertheless present different diagnoses: while Postone emphasizes how capitalism gives rise to the possibility of a new social order, Kurz argues that contemporary capitalism has reached its internal limit and entered an irreversible phase of decline and disintegration.
Hegel’s aesthetic ideal is the perfect integration of form and content within a work of art. This ideal is incompatible with the predominant 20th-century principle of formalist criticism, that form is the sole important factor in a work of art. Although the formalist dichotomy between form and content has been criticized on philosophical grounds, that does not suffice to justify Hegel’s ideal. Justifying Hegel’s ideal requires detailed art criticism that shows how form and content are, and why they should be, integrated in good works of art. This essay provides some of this criticism. By focusing on the work of the contemporary artist, Robert Turner, this criticism further suggests that Hegel’s aesthetic ideal is still relevant. Moreover, the nature of Turner’s work suggests that art is still relevant in our day in ways Hegel did not expect.
Ethical and political problems in Anglo-Saxon philosophy: John Rawls and Robert Nozick. Otfried Höffe. PREFACE: For some time now, a growing interest in questions of ethics has been apparent in philosophical circles ...
Robert Grosseteste was the initiator of the English scientific tradition, one of the first chancellors of Oxford University, and a famous teacher and commentator on the newly discovered works of Aristotle. In this book, James McEvoy provides the first general, inclusive overview of the entire range of Grosseteste's massive intellectual achievement.
[Robert Stalnaker] Saul Kripke made a convincing case that there are necessary truths that are knowable only a posteriori as well as contingent truths that are knowable a priori. A number of philosophers have used a two-dimensional modal semantic apparatus to represent and clarify the phenomena that Kripke pointed to. According to this analysis, statements have truth-conditions in two different ways depending on whether one considers a possible world 'as actual' or 'as counterfactual' in determining the truth-value of the statement relative to that possible world. There are no necessary a posteriori or contingent a priori propositions: rather, contingent a priori and necessary a posteriori statements are statements that are necessary when evaluated one way, and contingent when evaluated the other way. This paper distinguishes two ways that the two-dimensional framework can be interpreted, and argues that one of them gives the better account of what it means to 'consider a world as actual', but that it provides no support for any notion of purely conceptual a priori truth. /// [Thomas Baldwin] Two-dimensional possible world semantic theory suggests that Kripke's examples of the necessary a posteriori and contingent a priori should be handled by interpreting names as implicitly indexical. Like Stalnaker, I reject this account of names and accept that Kripke's examples have to be accommodated within a metasemantic theory. But whereas Stalnaker maintains that a metasemantic approach undermines the conception of a priori truth, I argue that it offers the opportunity to develop a conception of the a priori aspect of stipulations, conceived as linguistic performances. The resulting position accommodates Kripke's examples in a way which is both intrinsically plausible and fits with Kripke's actual discussion of them.
[Philip Percival] I aim to illuminate foundational epistemological issues by reflecting on 'epistemic consequentialism'-the epistemic analogue of ethical consequentialism. Epistemic consequentialism employs a concept of cognitive value playing a role in epistemic norms governing belief-like states that is analogous to the role goodness plays in act-governing moral norms. A distinction between 'direct' and 'indirect' versions of epistemic consequentialism is held to be as important as the familiar ethical distinction on which it is based. These versions are illustrated, respectively, by cognitive decision-theory and reliabilism. Cognitive decision-theory is defended, and various conceptual issues concerning it explored. A simple dilemma suggests that epistemic consequentialism has radical consequences. /// [Robert Stalnaker] After reviewing the general ideas of the consequentialist framework, I take a critical look at two of the epistemic consequentialist projects that Philip Percival considers in his paper: the first assumes that there is a notion of acceptance that contrasts with belief and that can be evaluated by its expected epistemic utility. The second uses epistemic utility to evaluate beliefs and partial beliefs themselves, as well as actions, such as gathering information in the course of an inquiry. I express scepticism about the notion of acceptance required for the first project, and argue that the second kind of project can be fruitful only with a richer notion of epistemic utility than has yet been developed.
Algebraic/topological descriptions of living processes are indispensable to the understanding of both biological and cognitive functions. This paper presents a fundamental algebraic description of living/cognitive processes and exposes its inherent ambiguity. Since ambiguity is forbidden to computation, no computational description can lend insight to inherently ambiguous processes. The impredicativity of these models is not a flaw, but is, rather, their strength. It enables us to reason with ambiguous mathematical representations of ambiguous natural processes. The noncomputability of these structures means computerized simulacra of them are uninformative of their key properties. This leads to the question of how we should reason about them. That question is answered in this paper by presenting an example of such reasoning, the demonstration of a topological strategy for understanding how the fundamental structure can form itself from within itself.