New concepts may prove necessary to profit from the avalanche of sequence data on the genome, transcriptome, proteome and interactome and to relate this information to cell physiology. Here, we focus on the concept of large activity-based structures, or hyperstructures, in which a variety of types of molecules are brought together to perform a function. We review the evidence for the existence of hyperstructures responsible for the initiation of DNA replication, the sequestration of newly replicated origins of replication, cell division and for metabolism. The processes responsible for hyperstructure formation include changes in enzyme affinities due to metabolite-induction, lipid-protein affinities, elevated local concentrations of proteins and their binding sites on DNA and RNA, and transertion. Experimental techniques exist that can be used to study hyperstructures and we review some of the ones less familiar to biologists. Finally, we speculate on how a variety of in silico approaches involving cellular automata and multi-agent systems could be combined to develop new concepts in the form of an Integrated cell (I-cell) which would undergo selection for growth and survival in a world of artificial microbiology.
The Empowered Self: Law and Society in the Age of Individualism examines the gradual emancipation of the individual in national and international law and the changing social attitudes towards personal choice in constituting identity. It demonstrates that this desire of persons for choice is not limited to Western industrial society but is a historical development powered by such independent variables as urbanization, the communications revolution, education, and economic development. These factors are changing the way persons affiliate: their attitudes towards nationality, religion, careers, sexuality, and gender roles. In the new climate of personal freedom, individuals increasingly select the components of their identity, choosing one or several from among multiple possible affiliations and questioning---even sometimes rejecting---the imposed or inherited forms of socialization. Despite such resistance, Thomas Franck demonstrates that we are now entering the age of the individual.
The abilities to attribute an action to its proper agent and to understand its meaning when it is produced by someone else are basic aspects of human social communication. Several psychiatric syndromes, such as schizophrenia, seem to lead to a dysfunction of the awareness of one’s own action as well as of the recognition of actions performed by others. Such syndromes offer a framework for studying the determinants of agency, the ability to correctly attribute actions to their veridical source. Thirty normal subjects and thirty schizophrenic patients with and without hallucinations and/or delusional experiences were required to execute simple finger and wrist movements, without direct visual control of their hand. The image of either their own hand or an alien hand executing the same or a different movement was presented on a TV screen in real time. The task for the subjects was to discriminate whether the hand presented on the screen was their own or not. Hallucinating and deluded schizophrenic patients were more impaired than non-hallucinating ones in discriminating their own hand from the alien one, and tended to misattribute the alien hand to themselves. Results are discussed according to a model of action control. A tentative description of the mechanisms leading to action consciousness is proposed.
Panpsychism is the doctrine that mind is a fundamental feature of the world, existing throughout the universe. One problem with panpsychism is that it has so far remained a purely theoretical concept. As progress towards an operationalization of the idea, this paper suggests making use of an ontological difference involved in the mind-matter distinction. The mode in which mental phenomena exist is called presence. The mode in which matter and radiation exist is called reality. Physical theory disregards presence in both the form of mental presence and the form of the temporal present. In contrast to mental presence, the temporal present is objective in the perspective of the third person. This relative kind of objectivity waits to be utilized for a hypothesis of how the mental and the physical are interrelated. In order to do so, this paper translates the mind-matter distinction into the distinction between mental and physical time and addresses, in these temporal terms, the problem that panpsychism tries to attack head-on. Two issues in particular thus get involved: discussions about a time observable and the quantum Zeno effect.
Pierre Maquet, Steven Laureys, Philippe Peigneux, Sonia Fuchs, Christophe Petiau, Christophe Phillips, Joel Aerts, Guy Del Fiore, Christian Degueldre, Thierry Meulemans, André Luxen, Georges Franck, Martial Van Der Linden, Carlyle Smith and Axel Cleeremans.
The nature of the relationship between language and thought has been quite elusive. We believe that its understanding is crucially dependent on the available notions of language and thought. Foundations of Language offers an unusually clear and complete account of both, providing a fruitful and much needed framework for future research. No doubt it will help us think better about these elusive complexities.
Whoever paid the bill at the restaurant last night will clearly remember doing it. Independently of the type of action, it is a common experience that being the agent lends a special strength to our memories. Even if it is generally agreed that personal memories (episodic memory) rely on separate neural substrates from general knowledge (semantic memory), little is known about the nature of the link between memory and the sense of agency. In the present paper, we review results from two experiments investigating the effects of agency on both explicit and implicit memory traces. The performance of normal subjects is compared to that of schizophrenic patients in order to explore the role of awareness of action in memory. It is proposed that reliable first-person information is necessary to create a stable and coherent motor memory trace.
This study aimed at evaluating the role of proprioception in the process of matching the final position of one's limbs with an intentional movement. Two experiments were carried out with the same paradigm of conscious recognition of one's own limb position from a distorted position. In the first experiment, 22 healthy subjects performed the task in an active and in a passive condition. In the latter condition, proprioception was the only available information, since the central signals related to the motor command were likely to be absent. The second experiment was carried out with a deafferented patient who suffers from a complete haptic deafferentation, including loss of proprioception. The results first argue in favour of a dominant role of proprioception in action recognition, but they also stress the possible role of central signals. The process of matching the final position of one's limbs with an intended movement, and thus of action recognition, would be achieved through a comparison between the predicted sensory consequences of the action, which are stored in its internal model, and the actual sensory consequences of that action.
Summary. It is proposed to translate the mind-matter distinction into terms of mental and physical time. In the spirit of this idea, we hypothesize a relation between the intensity of mental presence and a crucial time scale (some seconds) often referred to as a measure for the duration of nowness. This duration is experimentally accessible and might, thus, offer a suitable way to characterize the intensity of mental presence. Interesting consequences with respect to the idea of a generalized notion of mental presence, with human consciousness as a special case, are outlined. Our approach includes some features consistent with other, related ideas, which are indicated.
European animal disease policy seems to find its justification in a “harm to others” principle. Limiting the freedom of animal keepers—e.g., by culling their animals—is justified by the aim to prevent harm, i.e., the spreading of the disease. The picture, however, is more complicated. Both during the control of outbreaks and in the prevention of notifiable animal diseases, the government is confronted with conflicting claims of stakeholders who anticipate the risk of being harmed by each other, and who ask for government intervention. In this paper, we first argue that in a policy that aims to prevent animal diseases, the focus shifts from limiting “harm” to weighing conflicting claims with respect to “risks of harm.” Therefore, we claim that the harm principle is no longer a sufficient justification for governmental intervention in animal disease prevention. A policy that has to deal with and distribute conflicting risks of harm needs additional value assumptions that guide this process of assessment and distribution. We show that current policies are based on assumptions that are mainly economic considerations. In order to show the limitations of these considerations, we use the interests and position of keepers of backyard animals as an example. Based on the problems they faced during and after the recent outbreaks, we defend the thesis that a sustainable animal disease policy must take into account assumptions other than economic ones.
First, the importance of language in cognition is recognized. Nevertheless, this does not necessarily imply that the locus of thought is natural language (words, syntax, phonology). Then, difficulties with some of Carruthers’ hypotheses are stated: Is an account based on LFs capable of dealing with the complexities involved in what we call thought? Finally, the issue of language production is mentioned.
From the resurrection of body to eternal recurrence -- The shadow of God -- The guiding thread -- The logic of the body -- The system of identical cases -- From eternal recurrence to the resurrection of body.
Rosmini’s philosophy is a comprehensive effort toward the renovation of Christian thought in modern times. An intense discussion of the problem of knowledge led him to reformulate Augustine’s theory of illumination in terms of the ideal presence of universal being to the mind. Universal being is the lumen intellectus and our mind’s first object: it is implied in all our thoughts and makes them possible. Although devoid of reality, it shows remarkable features, such as infinity, necessity, and eternity. Without being God, it may be called “divine,” and confers a special value to intelligent creatures, whose dignity comes from their being enlightened by universal being. The “divine” is also the seal of God’s presence in nature. The present article supports the logic of this argument.
In an inspirational act of faith and hope, nearly one hundred contributors--social activists, thinkers, artists and spiritual leaders--reflect with poignant candor on our shared human condition and attempt to define a core set of human values in our rapidly changing society. Contributors include: * The Dalai Lama * Wilma Mankiller * Oscar Arias * Jimmy Carter * Cornel West * Jack Miles * Mother Teresa * Nancy Willard * Elie Wiesel * James Earl Jones * Joan Chittister * Mary Evelyn Tucker * Vaclav Havel * Archbishop Desmond Tutu. What Does It Mean To Be Human? is a vital meditation on the endless possibilities of our humanity.
This article aims to present some ideas toward a theoretical perspective on negation that allows the approaches of philosophers, logicians, and linguists to this topic to be compared. The introduction presents some of the questions discussed concerning negation. The first part draws the distinction between sentence, utterance, and proposition. This analysis will allow us, on the one hand, to differentiate linguistic entities (sentences, utterances) from logical entities (propositions), and on the other, to separate three planes in approaching questions about negation: the plane of the linguistic code, that of the use of the linguistic code or plane of enunciation, and the plane of logic. In the second part we distinguish between the concepts of opposition, of negation, and of intrinsically negative expression. In the third part we address the Fregean theory of negation on the basis of the distinctions made earlier. In the fourth part we discuss Plato's and Aristotle's conceptions of negation. In the fifth part we distinguish between the negation of an utterance, the negation of a component of an utterance, propositional negation, and predicative negation. In the sixth part we discuss the question of whether there is an illocutionary act of denying opposed to the act of affirming. In the last part we present our conclusions.
When a great thinker of the Spanish Golden Age, such as Vitoria, Molina or Suárez, inquires about the fundamental cause which justifies a licit declaration of war, “injury” is included as one of these causes. Here, “injury” is understood as an infringement of a right, an injustice committed and for which restitution has not been made. Among the injuries which may licitly be considered a justification for war, there is the “insult to honor”, especially to the honor of the Nation and the honor of the Sovereign. The difficulty in accepting this justification for war lies in making a demarcation between the interest of the Sovereign and the interest of the People, i.e. the men who make up the community, who are the immediate depositaries of power. For these thinkers, honor is always owed to the People. This is because a State, a Nation, a People, has the right to enjoy respect for its institutions, laws and customs, as an integral part of its own life.
A leading figure in sixteenth-century Iberian scholasticism, Molina was one of the most controversial thinkers in the history of Catholic thought. In keeping with the strongly libertarian account of human free choice that marked the early Jesuit theologians, Molina held that God's causal influence on free human acts does not by its intrinsic nature uniquely determine what those acts will be or whether they will be good or evil. Because of this, Molina asserted against his Dominican rivals that God's comprehensive providential plan for the created world and infallible foreknowledge of future contingents do not derive just from the combination of his antecedent "natural" knowledge of metaphysically necessary truths and his "free" knowledge of the causal influence - both natural (general concurrence) and supernatural (grace) - by which he wills to cooperate with free human acts. Rather, in addition to God's natural knowledge, Molina posited a distinct kind of antecedent divine knowledge, dubbed "middle" knowledge, by which God knows pre-volitionally, i.e., prior to any free decree of his own will regarding contingent beings, how any possible rational creature would in fact freely choose to act in any possible circumstances in which it had the power to act freely. And on this basis Molina proceeded to forge his controversial reconciliation of free choice with the Catholic doctrines of grace, divine foreknowledge, providence, and predestination. In addition to his work in dogmatic theology, Molina was also an accomplished moral and political philosopher who wrote extensive and empirically well-informed tracts on political authority, slavery, war, and economics.
Franck Grammont, Dorothée Legrand, and Pierre Livet (eds.): Naturalizing Intention in Action. Book review by Brian W. Dunst, University of South Florida, Tampa, FL, USA. Human Studies, pp. 1-6, DOI 10.1007/s10746-012-9217-1 (Online ISSN 1572-851X, Print ISSN 0163-8548).
The doctrine of inerrant divine “middle knowledge” of future contingent events, first developed by the sixteenth-century Jesuit theologian Luis de Molina, has resurfaced as a prominent position within contemporary debates over divine foreknowledge, creaturely freedom, and the ontological status of possibilities. As yet, the only substantive response to the new Molinism from a process perspective has come in a brief section on “Hartshorne and the Challenge of Molinism,” in an essay on Hartshorne’s view of “The Logic of Future Contingents” by George W. Shields and Donald W. Viney, in Shields’ edited anthology Process and Analysis. Shields and Viney offer an insightful critique of Molinism. However, their use of Hartshorne’s understanding of possibility presents problems for those, like me, who prefer Whitehead’s more robustly realist notion of eternal objects. Here, I defend Whitehead’s Platonism from the main lines of criticism leveled against it by Hartshorne, while demonstrating that a “thick” conception of the objective content of the possible within the context of the divine understanding need not cross over into a deterministic conception of God’s foreknowledge, à la Molina.
Franck L. B. Meijboom: Problems of Trust: A Question of Trustworthiness. Journal article by Martha L. Henderson, Master of Environmental Studies Program, The Evergreen State College, Olympia, WA 98505, USA. Journal of Agricultural and Environmental Ethics, DOI 10.1007/s10806-010-9300-4 (Online ISSN 1573-322X, Print ISSN 1187-7863).
Contributors: Maria Aloni, Berit Brogaard, Paul Egré, Pascal Engel, Stephen Hetherington, Christopher Hookway, Franck Lihoreau, Martin Montminy, Duncan Pritchard, Ian Rumfitt, Daniele Sgaravatti, Claudine Tiercelin, Elia Zardini.
Ethics and Sustainability: Guest or Guide? On Sustainability as a Moral Ideal. Journal article by Franck L. B. Meijboom and Frans W. A. Brom, Ethics Institute, Utrecht University, Janskerkhof 13a, 3512 BL Utrecht, The Netherlands. Journal of Agricultural and Environmental Ethics, pp. 1-5, DOI 10.1007/s10806-011-9322-6 (Online ISSN 1573-322X, Print ISSN 1187-7863).
The essays collected in this volume are all concerned with the connection between fiction and truth. This question is of utmost importance to metaphysics, philosophy of language, philosophical logic and epistemology, raising in each of these areas and at their intersections a large number of issues related to creation, existence, reference, identity, modality, belief, assertion, imagination, pretense, etc. All these topics and many more are addressed in this collection, which brings together original essays written from various points of view by philosophers of diverse trends. These essays constitute major contributions to the current debates that the connection between truth and fiction continually enlivens, and give a sense of the directions in which research on this question is heading. Contributors: Fred Adams, Frederick Kroon, Robert Howell, Brendan Murday, Terence Parsons, Graham Priest, Erich Rast, Manuel Rebuschi, Marion Renauld, R.M. Sainsbury, Grant Tavinor, Alberto Voltolini.
Molinism, named after Luis de Molina, is a theological system for reconciling human freedom with God's grace and providence. Presupposing a strongly libertarian account of freedom, Molinists assert against their rivals that the grace whereby God cooperates with supernaturally salvific acts is not intrinsically efficacious. To preserve divine providence and foreknowledge, they then posit "middle knowledge", through which God knows, prior to his own free decrees, how any possible rational agent would freely act in any possible situation. Beyond this, they differ among themselves regarding the ground for middle knowledge and the doctrines of efficacious grace and predestination.
A knowledge-how attributing sentence of the form ' S knows how to F ' may yield an 'ability-entailing' reading as well as an 'ability-neutral' reading. The present paper offers an epistemological account of the availability of both readings, based on two conceptual distinctions: first, a distinction between a 'practical' and a 'theoretical' kind of knowledge of how to do something; second, a distinction between an 'intrinsic' and an 'extrinsic' kind of ability to do something. The first part of the paper presents the double distinction that constitutes the proposed account; the second part presents a number of theoretical, mainly epistemological motivations for accepting the account.
The traditional doctrine of the Incarnation maintains that God became man. But was it necessary that God become the particular man He in fact became? Could some man or woman other than the man born in Bethlehem roughly two thousand years ago have been assumed by the Son to effect our salvation? This essay addresses such questions from the perspective of one embracing Molina's picture of divine providence. After showing how Molina thought his theory of middle knowledge helps alleviate a traditional Christological puzzle, the essay turns to the aforementioned questions concerning God's incarnational alternatives and suggests some fairly radical answers. Finally, the essay presents two substantial objections to these radical answers and argues that these objections fail.
In a series of recent papers Stephen Kearns and Daniel Star argue that normative reasons to ϕ simply are evidence that one ought to ϕ, and suggest that “evidence” in this context is best understood in standard Bayesian terms. I contest this suggestion.
According to orthodox Christianity, salvation depends on faith in Christ. If, however, God eternally punishes those who die ignorant of Christ, it appears that we have a special instance of the problem of evil: the punishment of the religiously innocent. This is called the soteriological problem of evil. Using Molina's concept of middle knowledge, William Lane Craig develops a solution to this problem which he considers a theodicy. As developed by Craig, the Molinist theodicy rests on the problematic assumption that all informed persons who would freely reject Christ are culpable. Using an informed Muslim as a counter-example, I try to show that Craig's Molinist solution begs the question.
The many well-publicized food scandals in recent years have resulted in a general state of vulnerable trust. As a result, building consumer trust has become an important goal in agri-food policy. In their efforts to protect trust in the agricultural and food sector, governments and industries have tended to treat the problem of trust as merely a matter of informing consumers about risks. In this article, we argue that the food sector should instead address the problem of trust from the perspective of the trustworthiness of the food sector itself. Behind this broad idea of changing the focus of trust is the assumption that if you want to be trusted, you should be trustworthy. To provide a clear understanding of what being trustworthy means within the food sector, we elaborate on the concepts of both trust and responsibility. In this way we show that policy focused on enhancing transparency and providing information to consumers is crucial, but not sufficient for dealing with the problem of consumer trust in the current agri-food context.
Using the philosophy of Jean-Luc Nancy as an anchoring point, Jacques Derrida in this book conducts a profound review of the philosophy of the sense of touch, from Plato and Aristotle to Jean-Luc Nancy, whose ground-breaking book Corpus he discusses in detail. Emmanuel Levinas, Maurice Merleau-Ponty, Edmund Husserl, Didier Franck, Martin Heidegger, Françoise Dastur, and Jean-Louis Chrétien are discussed, as are René Descartes, Diderot, Maine de Biran, Félix Ravaisson, Immanuel Kant, Sigmund Freud, and others. The scope of Derrida’s deliberations makes this book a virtual encyclopedia of the philosophy of touch (and the body). Derrida gives special consideration to the thinking of touch in Christianity and, in discussing Jean-Luc Nancy’s essay “Deconstruction of Christianity,” devotes a section of the book to the sense of touch in the Gospels. Another section concentrates on “the flesh,” as treated by Merleau-Ponty and others in his wake. Derrida’s critique of intuitionism, notably in the phenomenological tradition, is one of the guiding threads of the book. On Touching includes a wealth of notes that provide an extremely useful bibliographical resource. Personal and detached all at once, this book, one of the first published in English translation after Jacques Derrida’s death, serves as a useful and poignant retrospective on the work of the philosopher. A tribute by Jean-Luc Nancy, written a day after Jacques Derrida’s death, is an added feature.
This paper proposes an extensionalist analysis of computer simulations (CSs). It puts the emphasis neither on languages nor on models, but on symbols, on their extensions, and on their various ways of referring. It shows that chains of reference of symbols in CSs are multiple and of different kinds. As they are distinct and diverse, these chains enable different kinds of remoteness of reference and different kinds of validation for CSs. Although some methodological papers have already underlined the role of these various relationships of reference in CSs and of cross-validations, this diversity is still overlooked in the epistemological literature on CSs. As a consequence, a particular outcome of this analytical view is an ability to classify existing epistemological theses on CSs according to what their authors choose to select and put at the forefront: either the extensions of symbols, or the symbol-types, or the symbol-tokens, or the internal denotational hierarchies of the CS, or the reference of these hierarchies to external denotational hierarchies. Through the adoption of this extensionalist view, it also becomes possible to explain more precisely why a complete reduction of CSs to classical epistemic paradigms such as “experiment” or “theoretical argument” remains doubtful. On this last point, in particular, this paper agrees with what many epistemologists have already acknowledged.
We discuss the justification of Bickle's “ruthless” reductionism. Bickle intends to show that we know enough about neurons to draw conclusions about the “whole” brain and about the mind. However, his reductionism does not take into account the complexity of the nervous system and the fact that new properties emerge at each significant level of integration from the coupled functioning of elementary components. From a methodological point of view, we argue that neuronal and cognitive models have to exert a mutual constraint (MC) on each other. This approach refuses to award any priority to cognitive approaches over neuroscience and, reciprocally, any priority to neuroscience over cognitive approaches. MC thus argues against radical reductionism at the methodological level.
Thomas Franck believes that the strict constraints imposed by the UN Charter on military intervention in other countries have become too restrictive and that, so long as the Charter text remains unrevised, we should condone violations of these rules as legitimated by a jurying process. The relevant UN Charter constraints he seeks to subvert are two in particular. First, the Charter suggests that, outside the UN system, military force may be used across national borders only in “individual or collective self-defense if an armed attack occurs against a Member of the United Nations, until the Security Council has taken measures necessary to maintain international peace and security” (Article 51). Apart from this, “All Members shall refrain in their international relations from the threat or use of force against the territorial integrity or political independence of any state, or in any other manner inconsistent with the Purposes of the United Nations” (Article 2(4)). Second, regarding the use of force by the UN itself, the Charter proclaims that “Nothing contained in the present Charter shall authorize the United Nations to intervene in matters which are essentially within the domestic jurisdiction of any state or shall require the Members to submit such matters to settlement under the present Charter; but this principle shall not prejudice the application of enforcement measures under Chapter VII” (Article 2(7)).
Thomas & Karmiloff-Smith (T&K-S) claim that “Residual Normality” is a priori unlikely, that is, that specific cognitive deficits should not exist in developmental disorders. Here I review evidence that a specific cognitive deficit is at the core of developmental dyslexia and I provide a possible neurological account thereof.
The aim of this paper is to discuss the “Framework for M&S with Agents” (FMSA) proposed by Zeigler et al. [2000, 2009] in regard to the diverse epistemological aims of agent simulations in the social sciences. We first show that there are indeed great similarities, hence that the aim to emulate a universal “automated modeler agent” opens new ways of interaction between these two domains of M&S with agents. E.g., it can be shown that the multi-level conception at the core of the FMSA is similar in both contexts: the notions of “levels of system specification”, “behavior of models”, “simulator” and “endomorphic agents” can be partially translated into the terms of the “denotational hierarchy” (DH) recently introduced in a multi-level centered epistemology of M&S. Second, we suggest considering the question of the “credibility” of agent M&S in the social sciences when we do not try to emulate but only to simulate target systems. Whereas a stringent and standardized treatment of the heterogeneous internal relations (in the DH) between systems of formalisms is the key problem and the essential challenge in the scope of agent M&S driven engineering, it is also urgent to address the problem of the external relations (and of the external validity, hence of the epistemic power and credibility) of such levels of formalisms in the specific domains of agent M&S in the social sciences, especially when we intend to introduce the concepts of activity tracking.
This article is not intended as a running commentary on Wittgenstein's reflections on rules. Nor is it a commentary on Kripke's interpretation of "rule-following" in Wittgenstein. Neither is it an application of Wittgenstein's theses, nor an attempt at a direct application of an interpretation of those theses to the epistemology of the simulation of living systems, which would in itself be questionable. This work aims only to deepen the reflection on the cognitive status of computer simulation of living systems. As such, being essentially epistemological and targeted, it offers a suggested conceptual interpretation of certain forms of computer simulation of living systems, a suggestion itself grounded in an extension of certain distinctions already drawn by Wittgenstein and his commentators concerning rules and rule-following. The aim is to see whether, by this means, computer simulation of living systems, in contrast with more traditional modeling practices, might be more precisely explained and legitimated, in its epistemological contributions as well as in its limits.
Surging from the ontopoietic vital timing of life, human self-consciousness prompts the innermost desire to rise above its brute facts. Imaginatio creatrix inspires us to fabulate these facts into events and plots with personal significance, attempting to delineate a life-course in life-stories within the ever-flowing stream – existence. Seeking their deep motivations, causes and concatenations, we fabulate relatively stabilized networks of interconnecting meaning – history. But understanding the meaning and sense of these networks' reconfigurations calls for the purpose and telos of our endless undertaking; they remain always incomplete, carried onwards with the current of life, while fluctuating with personal experience in the play of memory. Facts and life stories, subjective desires and propensities, the circumambient world in its historical moves, creative logos and mythos, personal freedom and inward stirrings thrown into an enigmatic interplay prompt our imperative thirst for the meaning of this course, its purpose and its fulfillment – the sense of it all. To disentangle all this animates the passions of the literary genius. The focus of this collection is to isolate the main arteries running through the intermingled forces prompting our quest to endow life with meaning. Papers by: Jadwiga Smith, Lawrence Kimmel, Alira Ashvo-Munoz, William D. Melaney, Imafedia Okhamafe, Michel Dion, Franck Dalmas, Ludmila Molodkina, Victor Gerald Rivas, Rebecca M. Painter, Matti Itkonen, Raymond J. Wilson III, Christopher S. Schreiner, Bruce Ross, Bernadette Prochaska, Tsung-I Dow, Jerre Collins, Cezary Jozef Olbromski, Victor Kocay, Roberto Verolini.
The problems of the social responsibility of the scientist became a subject of public debate after World War II in Japan, thanks to the activities and publications of Yukawa and Tomonaga. Such authors as J. Karaki, M. Taketani, Y. Murakami, and S. Fujinaga continued the discussion in their books. However, many people seem to be still unaware of the most important source of these problems. As I see it, one of the most important treatments of these problems was the Franck Report (June 11, 1945), submitted to the US government by James Franck (chairman) toward the end of the war. This Report contains many important ideas and suggestions as regards the responsibility of the scientist, the morality of the use of atomic bombs, the prospective nuclear armaments race, and the possibility of international control of nuclear power. However, I should like to concentrate only on the first topic in this paper. Why did Franck and his committee at the Metallurgical Laboratory of the University of Chicago feel the urgent need for writing and submitting this Report? According to the Report, "in the past, scientists could disclaim direct responsibility for the use to which mankind had put their disinterested discoveries. We cannot take the same attitude now because the success which we have achieved in the development of nuclear power is fraught with infinitely greater dangers than were all the inventions of the past." (I. Preamble) This passage seems to contain the crux of our problem: these scientists clearly recognized their "new" responsibility, and the ground of this responsibility is also clear enough; i.e., when a new scientific discovery or invention turns out to have grave bearings on human interests, the scientists who become aware of that are responsible for notifying people of this and advising them to look for suitable means for avoiding prospective dangers.
In the rest of the paper, I elaborate the reasoning behind the preceding passage, and confirm that basically the same idea and reasoning has been repeated and developed in the Russell-Einstein Manifesto (1955), by the Pugwash Conferences (first held in 1957), by Tomonaga, and by Rotblat (the long-time Secretary of Pugwash, who received the Nobel Peace Prize together with the Conferences in 1995).
Managers’ commitment to contribute to sustainable development holds the key to their long-term business success and may be a source of competitive advantage. The managerial perception of business ethics is influenced by managers’ level of moral development and personal characteristics. These perceptions are also shaped by forces in the environment of the firm, including available resources, societal expectations, sector, and regulations. The resource-based perspective can thus contribute to the analysis of ethical issues, offering important insights into how they can influence the environmental strategy of the firm. The findings of this study show that firm resources have a strong influence on business managers’ ethical attitudes. In addition, the application of resource-based rationales to ethical issues can be justified in several ways: it influences the managerial perception of the natural environment as a competitive opportunity; it requires investments of financial and human resources, as well as flexibility and speed in adapting to environmental changes; and it creates new resource-based opportunities through changes in pollution prevention technology, the policy process, and market forces.
Since the 1990s, the social sciences have been undergoing their computational turn. This paper aims to clarify the epistemological meaning of this turn. To do so, we have to discriminate between the different epistemic functions of computation among the diverse uses of computers for modeling and simulating in the social sciences. Because of the introduction of a new, and often more user-friendly, way of formalizing and computing, the question of the realism of formalisms and of the proof value of computational treatments reemerges. Facing the spread of computational simulations across all disciplines, some enthusiastic observers claim that we are entering a new era of unity for the social sciences. Finally, the article shows that the conceptual and epistemological distinctions presented in the first sections lead to a more qualified position: the transdisciplinary computational turn is a great one, but it is of a methodological nature.
Through a procedure of objections and replies, we first review some of the arguments for and against the empirical character of computer simulation. At the end of this clarifying path, we propose arguments in favor of the concrete character of the objects simulated in science, which legitimates speaking of an experiment in their regard, more specifically a concrete experiment of the second kind.
The credibility of digital computer simulations has always been a problem. Today, through the debate on verification and validation, it has become a key issue. I review the existing theses on this question and show that, due to the role of epistemological beliefs in science, no general agreement can be found on the matter. Hence, the complexity of the construction of the sciences must be acknowledged. I illustrate these claims with a recent historical example. Finally, I temper this diversity by pointing to recent trends in the environmental and industrial sciences.