In this article, we document the growing influence of non-governmental organizations (NGOs) in the realm of socially responsible investing (SRI). Drawing from ethical and economic perspectives on stakeholder management and agency theory, we develop a framework to understand how and when NGOs will be most influential in shaping the ethical and social responsibility orientations of business using the emergence of SRI as the primary influencing vehicle. We find that NGOs have opportunities to influence corporate conduct via direct, indirect, and interactive influences on the investment community, and that the overall influence of NGOs as major actors in socially responsible investment is growing, with attendant consequences for corporate strategy, governance, and social performance.
We argue that differences in the institutional settings of Europe and the US are the critical factor in understanding policymaking in the two regions, and particularly the influence of nongovernmental organizations (NGOs). To test this relationship between institutional differences, corporate social responsibility (CSR), and NGO activism, we investigate 12 cases involving US and European companies in each of three industries. We conclude that different institutional structures and political legacies in the US and Europe are important factors in explaining the influence of NGOs on business and in the policymaking process, regardless of the timeliness of corporate strategy or NGO influence.
What are individuals? How can they be identified? These are crucial questions for philosophers and scientists alike. Criteria of individuality seem to differ markedly between metaphysics and the empirical sciences - and this might well explain why no work has hitherto attempted to relate the contributions of metaphysics, physics and biology on this question. This timely volume brings together various strands of research into 'individuality', examining how different sciences handle the issue, and reflecting on how this scientific work relates to metaphysical concerns. The collection makes a major contribution to clarifying and overcoming obstacles to the construction of a general conception of the individual adequate for both physics and biology, and perhaps even beyond.
The paper describes the approach by which ethics is integrated into the undergraduate curriculum at Northern Illinois University's College of Business. Literature is reviewed to identify conceptual frameworks for, and issues associated with, the teaching of business ethics. From the review, a set of guidelines for teaching ethics is developed and proposed. The objectives and strategies implemented for teaching ethics are discussed. Foundation and follow-up coursework, measurement issues and ancillary programs are also discussed.
By recourse to the fundamentals of preference orderings and their numerical representations through linear utility, we address certain questions raised in Nover and Hájek 2004, Hájek and Nover 2006, and Colyvan 2006. In brief, the Pasadena and Altadena games are well-defined and can be assigned any finite utility values while remaining consistent with preferences between those games having well-defined finite expected value. This is also true for the St Petersburg game. Furthermore, the dominance claimed for the Altadena game over the Pasadena game, and that would have been claimed for the St Petersburg game over the Altadena, can be contradicted without fear of inconsistency with the axioms of utility theory. However, insistence upon dominance can be made to yield a contradiction of the Archimedean axiom of utility theory.
Theses on the semiotic study of life as presented here provide a collectively formulated set of statements on what biology needs to be focused on in order to describe life as a process based on semiosis, or sign action. An aim of the biosemiotic approach is to explain how life evolves through all varieties of forms of communication and signification (including cellular adaptive behavior, animal communication, and human intellect) and to provide tools for grounding sign theories. We introduce the concept of semiotic threshold zone and analyze the concepts of semiosis, function, umwelt, and the like as the basic concepts for theoretical biology.
Terrence Malick and the Thought of Film explores how the experience of viewing Terrence Malick's films enables imaginative acts of philosophical interpretation. Useful for both professional philosophers interested in film and scholars of cinema intrigued by philosophy, this book shows the ways Malick's films cast philosophy in new cinematic light.
The transference theory reduces causation to the transmission of physical conserved quantities, like energy or momentum. Although this theory aims to apply to all fields of physics, we claim that it fails to account for a quantum electrodynamic effect, viz. the Aharonov-Bohm effect. After arguing that the Aharonov-Bohm effect is a genuine counter-example to the transference theory, we offer a new physicalist approach to causation, ontic and modal, in which this effect is embedded.
In this paper, we show that it is not a conceptual truth about laws of nature that they are immutable. In order to do so, we survey three popular accounts of lawhood (necessitarianism, dispositionalism, and the 'best system analysis') and expose the extent, as well as the philosophical cost, of the amendments required to leave room for the possibility of changing laws.
I. Introduction This paper aims to explain Nietzsche’s understanding of tragedy, and in particular his self-characterization as the “tragic philosopher.” What I shall claim is that, according to Nietzsche, to recognize the self-determining or self-creating character of our agency is to reveal it as tragic. Tragedy accordingly illuminates the most fundamental issue in Nietzsche’s mature philosophy: the possibility of affirmation.
Shared views regarding the moral respect which is owed to children in family life are used as a guide in determining the moral permissibility of nontherapeutic clinical research procedures involving children. The comparison suggests that it is not appropriate to seek assent from the preadolescent child. The analogy with interventions used in family life is similarly employed to specify the permissible limit of risk to which children may be exposed in nontherapeutic research procedures. The analysis indicates that recent writers misconceive how certain moral principles, such as respect for personal autonomy, require us to act toward children. The results are also used to assess proposed federal regulations on research with children.
The thesis of this paper is that consequentialism does not work as a comprehensive theory of right action. This paper does not offer a typical refutation, in that I do not claim that consequentialism is self-contradictory. One can with perfect consistency claim that the good is prior to the right and that the right consists in maximizing the good. What I claim, however, is that it is senseless to make such a claim. In particular, I attempt to show that the notion of what course of action maximizes the good has no content within a consequentialist framework. Since the problem that I identify rests with maximization, this refutation does not cut across the act/rule distinction. If rule consequentialism holds that there are occasions on which one should follow a rule rather than violate the rule in an optimific way, then it is not maximizing and my arguments do not apply; if not, then it collapses into act consequentialism. I have nothing to say about nonmaximizing forms of consequentialism. This refutation does, however, cut across the direct/indirect distinction. It makes no difference whether we take consequentialism as offering a principle of decision, or a standard of right. Presumably the former would be parasitic upon the latter for its legitimacy.
In this paper, we put forward a new account of emergence called “transformational emergence”. This account captures a variety of emergence that can be considered diachronic and weakly ontological. We motivate the claim that transformational emergence constitutes a genuine form of emergence. Moreover, the account is free of the traditional problems surrounding more usual, synchronic versions of emergence, and it finds strong empirical support in a specific physical phenomenon, the fractional quantum Hall effect, which has long been touted as a paradigmatic case of emergence.
A simple molecular system is described consisting of the reciprocal linkage between an autocatalytic cycle and a self-assembling encapsulation process where the molecular constituents for the capsule are products of the autocatalysis. In a molecular environment sufficiently rich in the substrates, capsule growth will also occur with high predictability. Growth to closure will be most probable in the vicinity of the most prolific autocatalysis and will thus tend to spontaneously enclose supportive catalysts within the capsule interior. If subsequently disrupted in the presence of new substrates, the released components will initiate production of additional catalytic and capsule components that will spontaneously re-assemble into one or more autocell replicas, thereby reconstituting and sometimes reproducing the original. In a diverse molecular environment, cycles of disruption and enclosure will cause autocells to incidentally encapsulate other molecules as well as reactive substrates. To the extent that any captured molecule can be incorporated into the autocatalytic process by virtue of structural degeneracy of the catalytic binding sites, the altered autocell will incorporate the new type of component into subsequent replications. Such altered autocells will be progenitors of “lineages” with variant characteristics that will differentially propagate with respect to the availability of commonly required substrates. Autocells are susceptible to a limited form of evolution, capable of leading to more efficient, more environmentally fitted, and more complex forms. This provides a simple demonstration of the plausibility of open-ended reproduction and evolvability without self-replicating template molecules or maintenance of persistent nonequilibrium chemistry. This model identifies an intermediate domain between prebiotic and biotic systems and bridges the gap from nonequilibrium thermodynamics to life.
How do minds emerge from developing brains? According to neural constructivism, the representational features of cortex are built from the dynamic interaction between neural growth mechanisms and environmentally derived neural activity. Contrary to popular selectionist models that emphasize regressive mechanisms, the neurobiological evidence suggests that this growth is a progressive increase in the representational properties of cortex. The interaction between the environment and neural growth results in a flexible type of learning that minimizes the need for prespecification, in accordance with recent neurobiological evidence that the developing cerebral cortex is largely free of domain-specific structure. Instead, the representational properties of cortex are built by the nature of the problem domain confronting it. This uniquely powerful and general learning strategy undermines the central assumption of classical learnability theory, that the learning properties of a system can be deduced from a fixed computational architecture. Neural constructivism suggests that the evolutionary emergence of neocortex in mammals is a progression toward more flexible representational structures, in contrast to the popular view of cortical evolution as an increase in innate, specialized circuits. Human cortical postnatal development is also more extensive and protracted than generally supposed, suggesting that cortex has evolved so as to maximize the capacity of environmental structure to shape its structure and function through constructive learning.
The concept of genidentity has been proposed as a way to better understand identity through time, especially in physics and biology. The genidentity view is utterly anti-substantialist in so far as it suggests that the identity of X through time does not in any way presuppose the existence of a permanent “core” or “substrate” of X. Yet applications of this concept to real science have been scarce and unsatisfying. In this paper, our aim is to show that a well-defined concept of functional genidentity can be crucial in shedding light on identity through time in classical physics and especially in biology. Finally, we show that understanding identity on the basis of continuity suggests a move towards an ontology of processes.
Quantum probability (QP) theory can be seen as a type of vector symbolic architecture (VSA): mental states are vectors storing structured information and manipulated using algebraic operations. Furthermore, the operations needed by QP match those in other VSAs. This allows existing biologically realistic neural models to be adapted to provide a mechanistic explanation of the cognitive phenomena described in the target article by Pothos & Busemeyer (P&B).
Christian hope of resurrection requires that the one raised be the same person who died. Philosophers and theologians alike seek to understand the coherence of bodily resurrection and what accounts for numerical identity between the earthly and risen person. I address this question from the perspective of disability. Is a person with a disability raised in the age to come with that disability? Many theologians argue that disability is essential to one's identity such that it could not be eliminated in the resurrection. What anthropology undergirds these claims is not often explicated. I argue that Thomistic hylemorphic anthropology provides the best context to understand the human person such that disability is not essential to identity. In the resurrection, we shall become truly ourselves. The marks of disability may remain, but Thomistic anthropology expresses the coherence of bodily resurrection in which one may hope for healing which eliminates the disability but not numerical identity.
Several advocates of the lively field of “metaphysics of science” have recently argued that a naturalistic metaphysics should be based solely on current science, and that it should replace more traditional, intuition-based, forms of metaphysics. The aim of the present paper is to assess that claim by examining the relations between metaphysics of science and general metaphysics. We show that the current metaphysical battlefield is richer and more complex than a simple dichotomy between “metaphysics of science” and “traditional metaphysics”, and that it should instead be understood as a three-dimensional “box”, with one axis distinguishing “descriptive metaphysics” from “revisionary metaphysics”, a second axis distinguishing a priori from a posteriori metaphysics, and a third axis distinguishing “commonsense metaphysics”, “traditional metaphysics” and “metaphysics of science”. We use this three-dimensional figure to shed light on the project of current metaphysics of science, and to demonstrate that, in many instances, the target of that project is not defined with enough precision and clarity.
We propose a unified theory of intentions as neural processes that integrate representations of states of affairs, actions, and emotional evaluation. We show how this theory provides answers to philosophical questions about the concept of intention, psychological questions about human behavior, computational questions about the relations between belief and action, and neuroscientific questions about how the brain produces actions. Our theory of intention ties together biologically plausible mechanisms for belief, planning, and motor control. The computational feasibility of these mechanisms is shown by a model that simulates psychologically important cases of intention.
Among the very architects of the recent re-emergence of emergentism in the physical sciences, Robert B. Laughlin certainly occupies a prominent place. Through a series of works beginning as early as his Nobel lecture in 1998, a lecture given after having been awarded, together with Störmer and Tsui, the Nobel prize in physics for their contribution to the elucidation of the fractional quantum Hall effect, Laughlin openly and relentlessly advocated a strongly anti-reductionistic view of physics - and, more particularly, of the interface between condensed matter and particle physics - which culminated in what can be considered his emergentist manifesto: A Different Universe. Reinventing Physics from the Bottom Down (2005). In spite of this prominent role in the vindication of emergentism, few philosophers, even among those sympathetic to the idea of emergence, have paid serious attention to Laughlin's insights. The subtleties of his view - often concealed, it is true, in many technicalities - have accordingly, and somewhat unfortunately, largely gone unnoticed.
Like many critics of Rawls, Habermas believes that the Original Position (OP) implicitly utilizes normative (and unargued-for) assumptions. The author defends the OP by arguing that its basic concepts are the product of a rational reconstruction of the everyday know-how, or common sense, employed by citizens in democratic practices. The author identifies this reconstruction in Rawls's work but suggests that while this answers the charge of circularity, it raises the problem of contextual relativism. It is concluded that Rawls can avoid such relativism only through a stronger commitment to social scientific research in support of a more transcendental form of rational reconstruction.
Contemporary textbooks often define evolution in terms of the replication, mutation, and selective retention of DNA sequences, ignoring the contribution of the physical processes involved. In the closing line of The Origin of Species, however, Darwin recognized that natural selection depends on prior, more basic living functions, which he merely described as life's “several powers.” For Darwin these involved the organism's capacity to maintain itself and to reproduce offspring that preserve its critical functional organization. In modern terms we have come to recognize that this involves the continual generation of complex organic molecules in complex configurations accomplished with the aid of persistent far-from-equilibrium chemical self-organizing and self-assembling processes. But reliable persistence and replication of these processes also requires constantly available constraints and boundary conditions. Organism autonomy further requires that these constraints and co-dependent dynamics are reciprocally produced, each by the other. In this paper I argue that the different constraint-amplifying dynamics of two or more self-organizing processes can be coupled so that they reciprocally generate each other's critical supportive boundary conditions. This coupling is a higher-order constraint that effectively constitutes a sign vehicle “interpreted” by the synergistic dynamics of these co-dependent self-organizing processes so that they reconstitute this same semiotic-dynamic relationship and its self-reconstituting potential in new substrates. This dynamical co-dependence constitutes Darwin's “several powers” and is the basis of the biosemiosis that enables evolution.
This chapter introduces the main issues and themes of the volume. Approaches to individuality from metaphysics and philosophy of science are contrasted. Recent philosophical developments regarding concepts of biological and physical individuality are presented. These research trends show how philosophy of physics and philosophy of biology address the question of what an individual is in different ways. Five main divergences are identified: the centrality of part-whole questions, the issue of identical individuals, the importance of the Principle of the Identity of Indiscernibles and, finally, the importance of structuralist concerns. At the end of the chapter, the structure of the book is explained in detail.
The elucidation of the gauge principle “is the most pressing problem in current philosophy of physics”, said Michael Redhead in 2003. This paper argues for two points that contribute to this elucidation in the context of Yang–Mills theories. (1) Yang–Mills theories, including quantum electrodynamics, form a class. They should be interpreted together. To focus on electrodynamics is potentially misleading. (2) The essential role of gauge and BRST symmetries is to provide a local field theory that can be quantized and would be equivalent to the quantization of the non-local reduced theory. If this is correct, the gauge symmetry is significant, not so much because it implies ontological consequences, but because it allows us to quantize theories that we would not be able to quantize otherwise. Thus, in the context of Yang–Mills theories, it is essentially a pragmatic principle. This does not seem to be the case for the gauge symmetry in general relativity.
Accounts of what it is for an agent to be justified in holding a belief commonly carry commitments concerning what cognitive processes can and should be like. A concern for the plausibility of such commitments leads to a multi-faceted epistemology in which elements of traditionally conflicting epistemologies are vindicated within a single epistemological account. The accessible and articulable states that have been the exclusive focus of much epistemology must constitute only a proper subset of epistemologically relevant processing. The interaction of such states looks rather contextualist. It might also be called quasi-foundationalist. However, in attending to our epistemological tasks we must rely on processing that is sensitive to information that we could not articulate, that is not accessible in the standard internalist sense. When focusing on the full range of epistemologically important processes, the structure of what makes for justification is rather more like that envisioned by some coherentists.
In Science, Perception and Reality, Sellars distinguishes the manifest image of man from the scientific image of man. The former is obtained from the way in which we become aware of ourselves as humans in the world. The latter corresponds to what the various sciences lead us to postulate about how man is constituted. Van Fraassen, for his part, extends these concepts to the world...