This volume introduces readers to emergence theory, outlines the major arguments in its defence, and summarizes the most powerful objections against it. It provides the clearest explication yet of this exciting new theory of science, which challenges the reductionist approach by proposing the continuous emergence of novel phenomena.
Being human while trying to scientifically study human nature confronts us with our most vexing problem. Efforts to explicate the human mind are thwarted by our cultural biases and entrenched infirmities; our first-person experiences as practical agents convince us that we have capacities beyond the reach of scientific explanation. What we need to move forward in our understanding of human agency, Paul Sheldon Davies argues, is a reform in the way we study ourselves and a long overdue break with traditional humanist thinking. Davies locates a model for change in the rhetorical strategies employed by Charles Darwin in _On the Origin of Species_. Darwin worked hard to anticipate and diminish the anxieties and biases that his radically historical view of life was bound to provoke. Likewise, Davies draws from the history of science and contemporary psychology and neuroscience to build a framework for the study of human agency that identifies and diminishes outdated and limiting biases. The result is a heady, philosophically wide-ranging argument in favor of recognizing that humans are, like everything else, subjects of the natural world—an acknowledgement that may free us to see the world the way it actually is.
A persistent boast of the historical approach to functions is that functional properties are normative. The claim is that a token trait retains its functional status even when it is defective, diseased, or damaged and consequently unable to perform the relevant task. This is because historical functional categories are defined in terms of some sort of historical success -- success in natural selection, typically -- which imposes a norm upon the performance of descendant tokens. Descendants thus are supposed to perform the associated task even when they cannot. The conceit, then, is that malfunctions are explicable in terms of historical success. The aim of this paper is to challenge this conceit. My thesis is that the historical approach to functions lacks the resources with which to account for the possibility of malfunctions. If functional types are defined in terms of historical success, then tokens that lack the defining property due to defect, and tokens that have lost the defining property due to disease or damage, are excluded from the functional category. Historically based malfunctions, in consequence, are impossible. The historical approach is no better than its non-historical competitors in accounting for the presumed normativity of functional properties.
The term emergence is used to describe the appearance of new properties that arise when a system exceeds a certain level of size or complexity, properties that are absent from the constituents of the system. It is a concept often summed up by the phrase that “the whole is greater than the sum of its parts,” and it is a key notion in the burgeoning field of complexity science. Life is often cited as a classic example of an emergent phenomenon: no atoms of my body are living, yet I am living (see, for example, Morowitz). Biological organisms depend on the processes of their constituent parts, yet they nevertheless exhibit a degree of autonomy from their parts (see, for example, Kauffman). How can this be? These seem to be contradictory properties.
The aim of this paper is to clarify and critically assess the methods of evolutionary psychology, and offer a sketch of an alternative methodology. My thesis is threefold. (1) The methods of inquiry unique to evolutionary psychology rest upon the claim that the discovery of the adaptive functions of ancestral psychological capacities leads to the discovery of the psychological functions of those ancestral capacities. (2) But this claim is false; in fact, just the opposite is true. We first must discover the psychological functions of our psychological capacities in order to discover their adaptive functions. Hence the methods distinctive of evolutionary psychology are idle in our search for the mechanisms of the mind. (3) There are good reasons for preferring an alternative to the methods of evolutionary psychology, an alternative that aims to discover the functions of our psychological capacities by appeal to the concept of a whole psychology.
The social exchange theory of reasoning, which is championed by Leda Cosmides and John Tooby, falls under the general rubric of evolutionary psychology and asserts that human reasoning is governed by content-dependent, domain-specific, evolutionarily-derived algorithms. According to Cosmides and Tooby, the presumptive existence of what they call cheater-detection algorithms disconfirms the claim that we reason via general-purpose mechanisms or via inductively acquired principles. We contend that the Cosmides/Tooby arguments in favor of domain-specific algorithms or evolutionarily-derived mechanisms fail and that the notion of a social exchange rule, which is central to their theory, is not correctly characterized. As a consequence, whether or not their conclusion is true cannot be established on the basis of the arguments they have presented.
The finite age of the universe and the existence of cosmological horizons provide a strong argument that the observable universe represents a finite causal region with finite material and informational resources. A similar conclusion follows from the holographic principle. In this paper I address the question of whether the cosmological information bound has implications for fundamental physics. Orthodox physics is based on Platonism: the laws are treated as infinitely precise, perfect, immutable mathematical relationships that transcend the physical universe and remain totally unchanged by physical processes, however extreme. If instead the laws of physics are regarded as akin to computer software, with the physical universe as the corresponding hardware, then the finite computational capacity of the universe imposes a fundamental limit on the precision of the laws and the specifiability of physical states. That limit depends on the age of the universe. I examine how the imprecision of the laws impacts on the evolution of highly entangled states and on the problem of dark energy.
Sober (1992) has recently evaluated Brandon's (1982, 1990; see also 1985, 1988) use of Salmon's (1971) concept of screening-off in the philosophy of biology. He critiques three particular issues, each of which will be considered in this discussion.
Teleosemantics asserts that mental content is determined by natural selection. The thesis is that content is fixed by the historical conditions under which certain cognitive mechanisms—those that produce and those that interpret representational states—were selectively successful. Content is fixed by conditions of selective success. The thesis of this paper is that teleosemantics is mistaken, that content cannot be fixed by conditions of selective success, because those conditions typically outnumber the intentional objects within a given representational state. To defend against this excess, advocates of teleosemantics must attempt to privilege some conditions of success while ignoring others. This results in selective explanations that are ad hoc, thereby depriving teleosemantics of the virtues it hoped to inherit from the theory of evolution by natural selection, including its alleged naturalistic credentials.
For decades most scientists assumed that life emerged billions of years ago in a “primordial soup” somewhere on the Earth’s surface. Evidence is mounting, however, that life may have begun deep beneath the surface, perhaps near a volcanic ocean vent or even inside the hot crust itself. Since there are hints that life’s history on Earth extends back through the phase of massive cosmic bombardment, it may be that life started on Mars and came here later, perhaps inside rocks ejected from the Red Planet by large impacts. The traffic of intact rocks between Mars and Earth is now an established fact, and experiments confirm that microbes could survive the rigours of the journey through space if cocooned within such material. Unfortunately, this planetary cross-contamination compromises astrobiologists’ hope of finding a second genesis in the solar system.
One of the most influential physics books of the twentieth century was actually about biology. In a series of lectures, Erwin Schrödinger described how he believed that quantum mechanics, or some variant of it, would soon solve the riddle of life. These lectures were published in 1944 under the title What is life? and are credited by some as ushering in the age of molecular biology. In the nineteenth century, many scientists thought they knew the answer to Schrödinger’s rhetorical question. Life, they maintained, was some sort of magic matter. The continued use of the term ‘organic chemistry’ is a hangover from that era. The belief that there is a chemical recipe for life led to the hope that, if only we knew what it was, we could mix up the right stuff in a test tube and make life in the lab. Most research on biogenesis has followed that tradition, by assuming that chemistry was a bridge — and a long one at that — linking matter with life. Elucidating this chemical pathway has been a tantalizing goal, spurred on by the famous Miller–Urey experiment of 1952, in which amino acids were made by sparking electricity through a mixture of water and common gases. But the concept has turned out to be something of a blind alley, and further progress with prebiotic chemical synthesis has been frustratingly slow. The origin of life remains one of the great outstanding mysteries of science. To take up Schrödinger’s suggestion, a radical solution to the problem, ‘What is life?’ could be that quantum mechanics enabled life to emerge directly from the atomic world, without the need for complex intermediate chemistry. Life must have a chemical basis: organic molecules provide the hardware for biology. But what about the software? When Schrödinger asked, ‘What is life?’ he could already glimpse the central significance of the cell’s information storage and replication processes, even though the role of DNA and the genetic code was yet to be discovered.
Astrobiologists are aware that extraterrestrial life might differ from known life, and considerable thought has been given to possible signatures associated with weird forms of life on other planets. So far, however, very little attention has been paid to the possibility that our own planet might also host communities of weird life. If life arises readily in Earth-like conditions, as many astrobiologists contend, then it may well have formed many times on Earth itself, which raises the question whether one or more shadow biospheres have existed in the past or still exist today. In this paper, we discuss possible signatures of weird life and outline some simple strategies for seeking evidence of a shadow biosphere. Key Words: Weird life—Multiple origins of life—Biogenesis—Biomarkers—Extremophiles—Alternative biochemistry. Astrobiology 9, 241–249.
In the atavistic model of cancer progression, tumor cell dedifferentiation is interpreted as a reversion to phylogenetically earlier capabilities. The more recently evolved capabilities are compromised first during cancer progression. This suggests a therapeutic strategy for targeting cancer: design challenges to cancer that can only be met by the recently evolved capabilities no longer functional in cancer cells. We describe several examples of this target-the-weakness strategy. Our most detailed example involves the immune system. The absence of adaptive immunity in immunosuppressed tumor environments is an irreversible weakness of cancer that can be exploited by creating a challenge that only the presence of adaptive immunity can meet. This leaves tumor cells more vulnerable than healthy tissue to pathogenic attack. Such a target-the-weakness therapeutic strategy has broad applications, and contrasts with current therapies that target the main strength of cancer: cell proliferation.
The oft-repeated claim that life is ‘written into’ the laws of nature is examined and criticised. Arguments are given in favour of life spreading between near-neighbour planets in rocky impact ejecta (transpermia), but against panspermia, leading to the conclusion that if life is indeed found to be widespread in the universe, some form of life principle or biological determinism must be at work in the process of biogenesis. Criteria for what would constitute a credible life principle are elucidated. I argue that the key property of life is its information content, and speculate that the emergence of the requisite information-processing machinery might require quantum information theory for a satisfactory explanation. Some clues about how decoherence might be evaded are discussed. The implications of some of these ideas for ‘fine-tuning’ are discussed.
All known life requires phosphorus (P) in the form of inorganic phosphate (PO₄³⁻ or Pi) and phosphate-containing organic molecules. Pi serves as the backbone of the nucleic acids that constitute genetic material and as the major repository of chemical energy for metabolism in polyphosphate bonds. Arsenic (As) lies directly below P on the periodic table and so the two elements share many chemical properties, although their chemistries are sufficiently dissimilar that As cannot directly replace P in modern biochemistry. Arsenic is toxic because As and P are similar enough that organisms attempt this substitution. We hypothesize that ancient biochemical systems, analogous to but distinct from those known today, could have utilized arsenate in a biological role equivalent to that of phosphate. Organisms utilizing such ‘weird life’ biochemical pathways may have supported a ‘shadow biosphere’ at the time of the origin and early evolution of life on Earth or on other planets. Such organisms may even persist on Earth today, undetected, in unusual niches.
This chapter contains sections titled: * 1 The Universe Is Weirdly Fine-Tuned for Life * 2 The Cosmic Code * 3 The Concept of Laws * 4 Are the Laws Real? * 5 Does a Multiverse Explain the Goldilocks Enigma? * 6 Many Scientists Hate the Multiverse Idea * 7 Who Designed the Multiverse? * 8 If There Were a Unique Final Theory, God Would Be Redundant * 9 What Exists and What Doesn’t: Who or What Gets to Decide? * 10 The Origin of the Rule That Separates What Exists From What Doesn’t * 11 Why Mind Matters * 12 The Universe as a Finite Computer Exposes the Fiction of Idealized Laws * 13 Quantum Mechanics Could Permit the Feedback Loop Between Mind and the Laws Of Physics * Notes.
Much of the modern period was dominated by a ‘reductionist’ theory of science. On this view, to explain any event in the world is to reduce it down to fundamental particles, laws, and forces. In recent years reductionism has been dramatically challenged by a radically new paradigm called ‘emergence’. According to this new theory, natural history reveals the continuous emergence of novel phenomena: new structures and new organisms with new causal powers. Consciousness is yet one more emergent level in the natural hierarchy. Many theologians and religious scholars believe that this new paradigm may offer new insights into the nature of God and God's relation to the world. This volume introduces readers to emergence theory, outlines the major arguments in its defence, and summarizes the most powerful objections against it. Written by experts but suitable as an introductory text, these essays provide the best available presentation of this exciting new field and its potentially momentous implications.
We study the response of switched particle detectors to static negative energy densities and negative energy fluxes. It is demonstrated how the switching leads to excitation even in the vacuum and how negative energy can lead to a suppression of this excitation. We obtain quantum inequalities on the detection similar to those obtained for the energy density by Ford and co-workers and in an “operational” context by Helfer. We reexamine the question “Is there a quantum equivalence principle?” in terms of our model. Finally, we briefly address the issue of negative energy and the second law of thermodynamics.
Andrews et al. subscribe to the view that distinguishing selectionist from nonselectionist hypotheses – or, distinguishing adaptations from mere spandrels or exaptations – is important to the study of psychology. I offer three reasons for thinking that this view is false and that considerations of past selective efficacy have little to contribute to inquiry in psychology.
Fred Dretske asserts that the conscious or phenomenal experiences associated with our perceptual states—e.g. the qualitative or subjective features involved in visual or auditory states—are identical to properties that things have according to our representations of them. This is Dretske's version of the currently popular representational theory of consciousness. After explicating the core of Dretske's representational thesis, I offer two criticisms. I suggest that Dretske's view fails to apply to a broad range of mental phenomena that have rather distinctive subjective or qualitative features. I also suggest that Dretske's view, in identifying conscious experiences with features of our perceptual states, casts its aim too low. It deflates further than it should and, in consequence, fails to capture what are arguably some of the most important phenomena associated with our conscious lives.
In attempting to re-think the notion of asymmetry and its relations with 'first philosophy' and to see how that notion is tracked by the provocation of scepticism, the paper demonstrates something about the implications of Levinas' ethical asymmetry. The paper considers Levinas' tendency to introduce the topic of scepticism when confronted by the logical and textual difficulties that necessarily befall his account of the ethical relation. It argues that such an introduction commits Levinas to the claim: first philosophy entails a fundamental (first person) asymmetry and its attendant scepticism. If, for Levinas, scepticism and first person asymmetry are implicated in all attempts at first philosophy, the paper suggests that an intriguing place to set it to work is those pages of Being and Time which prioritize being-towards-death, the polemical focus of Levinas' being-for-beyond-my-death.
Jacob Bekenstein's identification of black hole event horizon area with entropy proved to be a landmark in theoretical physics. In this paper we trace the subsequent development of the resulting generalized second law of thermodynamics (GSL), especially its extension to incorporate cosmological event horizons. Although cosmological horizons do not generally have well-defined thermal properties, we find that the GSL is satisfied for a wide range of models. We explore in particular the case of an asymptotically de Sitter universe filled with a gas of small black holes as a means of casting light on the relative entropic ‘worth’ of black hole versus cosmological horizon area. We present some numerical solutions of the generalized total entropy as a function of time for certain cosmological models, in all cases confirming the validity of the GSL.
The importance of applying game theory to the evolution of information in the presence of noise has recently become widely recognized. This Special Issue addresses the theme of spontaneously emergent order in both classical and quantum systems subject to external noise, and includes papers directly related to game theory or the development of supporting techniques. In the following editorial overview we examine the broader context of the subject, including the tension between the destructive and creative aspects of noise, and foreshadow the significance of some of the subsequent papers in the volume.
We derive conditions for rotating particle detectors to respond in a variety of bounded spacetimes and compare the results with the folklore that particle detectors do not respond in the vacuum state appropriate to their motion. Applications involving possible violations of the second law of thermodynamics are briefly addressed.
A quantum particle moving in a gravitational field may penetrate the classically forbidden region of the gravitational potential. This raises the question of whether the time of flight of a quantum particle in a gravitational field might deviate systematically from that of a classical particle due to tunnelling delay, representing a violation of the weak equivalence principle. I investigate this using a model quantum clock to measure the time of flight of a quantum particle in a uniform gravitational field, and show that a violation of the equivalence principle does not occur when the measurement is made far from the turning point of the classical trajectory. The results are then confirmed using the so-called dwell time definition of quantum tunnelling. I conclude with some remarks about the strong equivalence principle in quantum mechanics.