With a title like Luther and Philosophy, one might have expected, as in the eighteenth century and in the "liberal" circles of the nineteenth, a complete exposition of the Reformer's philosophy. The expression appears, for example, in the analytical tables of L'Encyclopédie, under the entry "Lutheranism". Although Philippe Büttgen has taken as his object, in other works, "the confessionalization of philosophy ..
Jean-Luc Nancy discusses his life's work with Pierre-Philippe Jandin. As Nancy looks back on his philosophical texts, he thinks anew about democracy, community, jouissance, love, Christianity, and the arts.
In this paper we introduce the concept of d-fuzzy function which generalizes the concept of fuzzy subalgebra to a much larger class of functions in a natural way. In addition we discuss a method of fuzzification of a wide class of algebraic systems onto [0, 1] along with some consequences.
The concept of the cortical column refers to vertical cell bands with similar response properties, which were initially observed in Vernon Mountcastle's mapping of single cell recordings in the cat somatic cortex. It has subsequently guided over 50 years of neuroscientific research, in which fundamental questions about the modularity of the cortex and basic principles of sensory information processing were empirically investigated. Nevertheless, the status of the column remains controversial today, as skeptical commentators proclaim that the vertical cell bands are a functionally insignificant by-product of ontogenetic development. This paper inquires how the column came to be viewed as an elementary unit of the cortex from Mountcastle's discovery in 1955 until David Hubel and Torsten Wiesel's reception of the Nobel Prize in 1981. I first argue that Mountcastle's vertical electrode recordings served as criteria for applying the column concept to electrophysiological data. In contrast to previous authors, I claim that this move from electrophysiological data to the phenomenon of columnar responses was concept-laden, but not theory-laden. In the second part of the paper, I argue that Mountcastle's criteria provided Hubel and Wiesel with a conceptual outlook, i.e. it allowed them to anticipate columnar patterns in the cat and macaque visual cortex. I argue that in the late 1970s, this outlook only briefly took a form that one could call a 'theory' of the cerebral cortex, before new experimental techniques started to diversify column research. I end by showing how this account of early column research fits into a larger project that follows the conceptual development of the column into the present.
In 1981, David Hubel and Torsten Wiesel received the Nobel Prize for their research on cortical columns—vertical bands of neurons with similar functional properties. This success led to the view that "cortical column" refers to the basic building block of the mammalian neocortex. Since the 1990s, however, critics have questioned this building block picture of "cortical column" and debated whether this concept is useless and should be replaced with successor concepts. This paper inquires which experimental results after 1981 challenged the building block picture and whether these challenges warrant the elimination of "cortical column" from neuroscientific discourse. I argue that the proliferation of experimental techniques led to a patchwork of locally adapted uses of the column concept. Each use refers to a different kind of cortical structure, rather than a neocortical building block. Once we acknowledge this diverse-kinds picture of "cortical column", the elimination of the column concept becomes unnecessary. Rather, I suggest that "cortical column" has reached conceptual retirement: although it cannot be used to identify a neocortical building block, column research is still useful as a guide and cautionary tale for ongoing research. At the same time, neuroscientists should search for alternative concepts when studying the functional architecture of the neocortex. Keywords: cortical column, conceptual development, history of neuroscience, patchwork, eliminativism, conceptual retirement.
is an extension of public announcement logic. It is based on a modal operator that expresses what is true after any arbitrary announcement. An incorrect Truth Lemma has been stated and 'demonstrated' in Balbiani et al. In this paper, we put right the wording and the proof of the Truth Lemma for.
We construct a model for the level by level equivalence between strong compactness and supercompactness in which below the least supercompact cardinal κ, there is a stationary set of cardinals on which SCH fails. In this model, the structure of the class of supercompact cardinals can be arbitrary.
Ariès traces Western man's attitudes toward mortality from the early medieval conception of death as the familiar collective destiny of the human race to the modern tendency, so pronounced in industrial societies, to hide death as if it were an embarrassing family secret.
I will present and criticise the two theories of truthmaking David Armstrong offers us in Truth and Truthmakers (Armstrong 2004), show to what extent they are incompatible, and identify troublemakers for both of them: a notorious one – Factualism, the view that the world is a world of states of affairs – and a more recent one – the view that every predication is necessary. Factualism, combined with truthmaker necessitarianism – 'truthmaking is necessitation' – leads Armstrong to an all-embracing totality state of affairs that necessitates not only everything that is the case but also everything else – that which is not the case, that which is merely possible or even impossible. All the things so dear to realists – rocks, natural properties, real persons – become mere abstractions from this ontological monster. The view that every predication is necessary does in some sense the opposite: it does away with totality states of affairs and, arguably, also with states of affairs. We have particulars and universals, partially identical and necessarily connected to everything else. Just by the existence of anything, everything is necessitated – the whole world mirrored in every monad. Faced with the choice between these two equally unappealing alternatives, I suggest returning to Armstrong's more empiricist past: the world is not an all-inclusive One, nor necessitated by every single particular and every single universal, but a plurality of particulars and universals, interconnected by a contingent and internal relation of exemplification. While a close variant, truthmaker essentialism, can perhaps be saved, this means giving up on truthmaker necessitarianism. This, I think, is what it takes to steer a clear empiricist course between the Scylla of Spinozist general factness and the Charybdis of a Leibnizian overdose of brute necessities.
What kind of politics can one practice when one is an idiot? Far from being absurd, this is the question one is inevitably led to ask when reading the work of Gilles Deleuze. The "idiot" indeed plays an essential and unavoidable role in Deleuze's philosophy. He is the conceptual persona that holds this philosophy together in its own consistency. He stands at the hinge between the image of thought that the philosopher invokes and more or less implicitly presupposes, and the creation of concepts that he explicitly produces. Thus, doing philosophy, like acting and thinking politically, is always a way of playing the idiot. The consequences of this approach are crucial, both because of the questions it poses to classical political reflection centered on Law and the State, and because of the problematics that are thereby invalidated in this field. The place given to control and biopower, to zones of indetermination and smooth spaces, allows us to take a healthy distance from "majoritarian" politics and to expose, as already old clichés, many of the themes and concepts claimed by recent alternative, globalist, and subversive organizations. It is an entirely different idea of politics, centered on becomings, that the Deleuzian politics of the idiot invites us to contemplate. It opens us to other, new hopes, linked to a "missing people" that would be born with each becoming.
Philippe Descola has become one of the most important anthropologists working today, and Beyond Nature and Culture has been a major influence in European intellectual life since its French publication in 2005. Here, finally, it is brought to English-language readers. At its heart is a question central to both anthropology and philosophy: what is the relationship between nature and culture? Culture—as a collective human making, of art, language, and so forth—is often seen as essentially different from nature, which is portrayed as a collective of the nonhuman world, of plants, animals, geology, and natural forces. Descola shows this essential difference to be, however, not only a specifically Western notion, but also a very recent one. Drawing on ethnographic examples from around the world and theoretical understandings from cognitive science, structural analysis, and phenomenology, he formulates a sophisticated new framework, the "four ontologies"—animism, totemism, naturalism, and analogism—to account for all the ways we relate ourselves to nature. By thinking beyond nature and culture as a simple dichotomy, Descola offers nothing short of a fundamental reformulation by which anthropologists and philosophers can see the world afresh.
This paper concerns Aristotle's kind‐crossing prohibition. My aim is twofold. I argue that the traditional accounts of the prohibition are subject to serious internal difficulties and should be questioned. According to these accounts, Aristotle's prohibition is based on the individuation of scientific disciplines and the general kind that a discipline is about, and it says that scientific demonstrations must not cross from one discipline, and corresponding kind, to another. I propose a very different account of the prohibition. The prohibition is based on Aristotle's scientific and metaphysical essentialism, according to which a scientific demonstration must take as its starting point a set of per se properties of a subject, if these make up a single, unitary definition. The subject of demonstration here is a kind, although not the general kind associated with a discipline, but rather the particular kind that the particular demonstration is about.
This book aims to make the pragmatist intellectual framework accessible to organization and management scholars. It presents some fundamental concepts of Pragmatism, their potential application to the study of organizations and the resulting theoretical, methodological, and practical issues.
The focus of this paper is the ethical, legal and social challenges of ensuring the responsible use of "big brain data"—the recording, collection and analysis of individuals' brain data on a large scale with clinical and consumer-directed neurotechnological devices. First, I highlight the benefits of big data and machine learning analytics in neuroscience for basic and translational research. Then, I describe some of the technological, social and psychological barriers to securing brain data from unwarranted access. In this context, I then examine ways in which safeguards at the hardware and software level, as well as increasing "data literacy" in society, may enhance the security of neurotechnological devices and protect the privacy of personal brain data. Regarding the ethical and legal ramifications of big brain data, I first discuss effects on autonomy, the sense of agency and authenticity, as well as the self, that may result from the interaction between users and intelligent, particularly closed-loop, neurotechnological devices. I then discuss the impact of "datafication" in basic and clinical neuroscience research on the just distribution of resources and access to these transformative technologies. In the legal realm, I examine possible legal consequences that arise from the increasing ability to decode brain states and their corresponding subjective phenomenological experiences, for the hitherto inaccessible privacy of this information. Finally, I discuss the implications of big brain data for national and international regulatory policies and models of good data governance.
The aim of this paper is to discuss the "Austro-American" logical empiricism proposed by physicist and philosopher Philipp Frank, particularly his interpretation of Carnap's Aufbau, which he considered the charter of logical empiricism as a scientific world conception. According to Frank, the Aufbau was to be read as an integration of the ideas of Mach and Poincaré, leading eventually to a pragmatism quite similar to that of the American pragmatist William James. Relying on this peculiar interpretation, Frank intended to bring about a rapprochement between the logical empiricism of the Vienna Circle in exile and American pragmatism. In the course of this project, in the last years of his career, Frank outlined a comprehensive, socially engaged philosophy of science that could serve as a "link between science and philosophy".
Polysemous concepts with multiple related meanings pervade natural languages, yet some philosophers argue that we should eliminate them to avoid miscommunication and pointless debates in scientific discourse. This paper defends the legitimacy of polysemous concepts in science against this eliminativist challenge. My approach analyses such concepts as patchworks with multiple scale-dependent, technique-involving, domain-specific and property-targeting uses (patches). I demonstrate the generality of my approach by applying it to "hardness" in materials science, "homology" in evolutionary biology, "gold" in chemistry and "cortical column" in neuroscience. Such patchwork concepts are legitimate if the techniques used to apply them produce reliable results, the domains to which they are applied are homogeneous, and the properties they refer to are significant for describing, classifying or explaining the behavior of entities in the extension of the concept. By following these normative constraints, researchers can avoid miscommunication and pointless debates without having to eliminate polysemous patchwork concepts from scientific discourse.
A great mathematician and teacher, and a physicist and philosopher in his own right, bridges the gap between science and the humanities in this exposition of the philosophy of science. He traces the history of science from Aristotle to Einstein to illustrate philosophy's ongoing role in the scientific process. In this volume he explains modern technology's gradual erosion of the rapport between physical theories and philosophical systems, and offers suggestions for restoring the link between these related areas. This book is suitable for undergraduate students and other readers. 1962 ed. Index. 36 figures.
This paper proposes an analysis of the discursive dynamics of high-impact concepts in the humanities. These are concepts whose formation and development have a lasting and wide-ranging effect on research and on our understanding of discursive reality in general. The notion of a conceptual practice, based on a normative conception of practice, is introduced, and practices are identified, on this perspective, according to the way their respective performances are held mutually accountable. This normative conception of practices is then combined with recent work from philosophy of science that characterizes concepts in terms of conceptual capacities that are productive, open-ended, and applicable beyond the original context in which they were developed. It is shown that the formation of concepts can be identified by changes in how practitioners hold the exercise of their conceptual capacities accountable when producing knowledge about a phenomenon. In a manner similar to the use of operational definitions in scientific practices, such concepts can also be used to intervene in various discourses within or outside the conceptual practice. Using the formation of the concepts "mechanism" and "performative" as examples, the paper shows how high-impact concepts reconfigure what is at issue and at stake in conceptual practices. As philosophy and other humanities disciplines are its domain of interest, it is a contribution to the methodology of the humanities.