The concept of contextual emergence is proposed as a non-reductive, yet well-defined relation between different levels of description of physical and other systems. It is illustrated for the transition from statistical mechanics to thermodynamic properties such as temperature. Stability conditions are crucial for a rigorous implementation of contingent contexts that are required to understand temperature as an emergent property. It is proposed that such stability conditions are meaningful for contextual emergence beyond physics as well.
The causal argument for physicalism is analyzed, and its key premise, the causal closure of physics, is found wanting. A hidden premise must therefore be added to the argument to secure its conclusion, but the hidden premise is indistinguishable from the conclusion of the causal argument itself. The argument thus begs the question on physicalism.
The role of contingent contexts in formulating relations between properties of systems at different descriptive levels is addressed. Based on the distinction between necessary and sufficient conditions for interlevel relations, a comprehensive classification of such relations is proposed, providing a transparent conceptual framework for discussing particular versions of reduction, emergence, and supervenience. One of these versions, contextual emergence, is demonstrated using two physical examples: molecular structure and chirality, and thermal equilibrium and temperature. The concept of stability is emphasized as a basic guiding principle of contextual property emergence.
Kellert (In the Wake of Chaos, University of Chicago Press, Chicago, 1993) has argued that Laplacean determinism in classical physics is actually a layered concept, where various properties or layers composing this form of determinism can be peeled away. Here, I argue that a layered conception of determinism is inappropriate and that we should think in terms of different deterministic models applicable to different kinds of systems. The upshot of this analysis is that the notion of state is more closely tied to the kind of system being investigated than is usually considered in discussions of determinism. So when investigating determinism, corresponding changes to the appropriate notion of state – and, perhaps, the state space itself – also need to be considered.
Determinism is a rich and varied concept. At an abstract level of analysis, Jordan Howard Sobel (1998) identifies at least ninety varieties of what determinism could be like. When it comes to thinking about what deterministic laws and theories in physical sciences might be like, the situation is much clearer. There is a criterion by which to judge whether a law, expressed as some form of equation, is deterministic. A theory would then be deterministic just in case all its laws taken as a whole were deterministic. In contrast, if a law fails this criterion, then it is indeterministic, and any theory whose laws taken as a whole fail this criterion must also be indeterministic. Although it is widely believed that classical physics is deterministic and quantum mechanics is indeterministic, application of this criterion yields some surprises for these standard judgments.
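The criterion the abstract invokes is the standard existence-and-uniqueness condition on solutions of a law's equations. The abstract does not give an example, but a minimal illustration, standard in this literature, of a classical law that fails the uniqueness half of the criterion is:

```latex
% A law is deterministic iff its equation admits exactly one solution
% for each initial condition. The following ODE fails uniqueness:
\[
  \dot{x} = x^{2/3}, \qquad x(0) = 0 .
\]
% Two distinct solutions satisfy this initial-value problem:
\[
  x(t) = 0
  \quad\text{and}\quad
  x(t) = \left(\tfrac{t}{3}\right)^{3},
\]
% so the state at t = 0 does not fix the state at later times:
% the law, though expressed as a differential equation, is indeterministic.
```

The failure occurs because the right-hand side is not Lipschitz continuous at $x = 0$, the hypothesis of the Picard–Lindelöf uniqueness theorem.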
Recent developments in nonlinear dynamics have found wide application in many areas of science from physics to neuroscience. Nonlinear phenomena such as feedback loops, inter-level relations, wholes constraining and modifying the behavior of their parts, and memory effects are interesting candidates for emergence and downward causation. Rayleigh–Bénard convection is an example of a nonlinear system that, I suggest, yields important insights for metaphysics and philosophy of science. In this paper I propose convection as a model for downward causation in classical mechanics, far more robust and less speculative than the examples typically provided in the philosophy of mind literature. Although the physics of Rayleigh–Bénard convection is quite complicated, this model provides a much more realistic and concrete example for examining various assumptions and arguments found in emergence and philosophy of mind debates. After reviewing some key concepts of nonlinear dynamics, complex systems and the basic physics of Rayleigh–Bénard convection, I begin that examination here by (1) assessing a recently proposed definition for emergence and downward causation, (2) discussing some typical objections to downward causation and (3) comparing this model with Sperry's examples.
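The abstract does not reproduce the physics it reviews, but the control parameter governing the onset of the convection rolls it discusses is standard. A sketch of the textbook relation (the symbols are the conventional ones, not drawn from the abstract):

```latex
% Rayleigh number: ratio of buoyancy driving to dissipative damping
% for a fluid layer of depth d heated from below.
\[
  \mathrm{Ra} = \frac{g \,\alpha \,\Delta T \, d^{3}}{\nu \kappa},
\]
% g: gravitational acceleration;  alpha: thermal expansion coefficient;
% Delta T: temperature difference across the layer;  d: layer depth;
% nu: kinematic viscosity;  kappa: thermal diffusivity.
% For rigid-rigid boundaries, the convection rolls -- the macrostructure
% claimed to constrain the motion of the fluid's parts -- set in once
% Ra exceeds a critical value Ra_c of roughly 1708.
```

It is this transition, from featureless conduction below $\mathrm{Ra}_c$ to organized rolls above it, that the paper offers as a concrete classical-mechanical candidate for downward causation.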
The "usual story" regarding molecular chemistry is that it is roughly an application of quantum mechanics. That is to say, quantum mechanics supplies everything necessary and sufficient, both ontologically and epistemologically, to reduce molecular chemistry to quantum mechanics. This is a reductive story, to be sure, but a key explanatory element of molecular chemistry, namely molecular structure, is absent from the quantum realm. On the other hand, typical characterizations of emergence, such as the unpredictability or inexplicability of molecular structure based on quantum mechanics, do not characterize the relationship between molecular chemistry and quantum mechanics well either. A different scheme for characterizing reduction and emergence is proposed that accommodates the relationship between quantum mechanics and molecular chemistry, and some initial objections to the scheme are considered.
There has been a long-standing debate about the relationship of predictability and determinism. Some have maintained that determinism implies predictability, while others have maintained that predictability implies determinism. Many have maintained that there are no implication relations between determinism and predictability. This summary is, of course, somewhat oversimplified and quick, at least in the sense that there are various notions of determinism and predictability at work in the philosophical literature. In this essay I will focus on what I take to be the Laplacean vision for determinism and predictability. While many forms of predictability are inconsistent with this vision, I argue that a suitably restricted notion of predictability, consistent with the practice of physicists, is implied by the Laplacean notion of determinism. In the Appendix, it is argued that limitations on predictability are of an in-principle nature.
Two approaches toward the arrow of time for scattering processes have been proposed in rigged Hilbert space quantum mechanics. One, due to Arno Bohm, involves preparations and registrations in laboratory operations and results in two semigroups oriented in the forward direction of time. The other, employed by the Brussels-Austin group, is more general, involving excitations and de-excitations of systems, and apparently results in two semigroups oriented in opposite directions of time. It turns out that these two time arrows can be related to each other via Wigner's extensions of the spacetime symmetry group. Furthermore, there are subtle differences in causality as well as the possibilities for the existence and creation of time-reversed states depending on which time arrow is chosen.
Arno Bohm and Ilya Prigogine's Brussels-Austin Group have been working on the quantum mechanical arrow of time and irreversibility in rigged Hilbert space quantum mechanics. A crucial notion in Bohm's approach is the so-called preparation/registration arrow. An analysis of this arrow and its role in Bohm's theory of scattering is given. Similarly, the Brussels-Austin Group uses an excitation/de-excitation arrow for ordering events, which is also analyzed. The relationship between the two approaches is discussed, focusing on their semigroup operators and time arrows. Finally, a possible realist interpretation of the rigged Hilbert space formulation of quantum mechanics is considered.