This volume contains invited and contributed papers delivered at a symposium on the occasion of Professor Glauber's 60th birthday. The papers, many of which are authored by world leaders in their fields, contain recent research work in quantum optics, statistical mechanics and high energy physics related to the pioneering work of Professor Roy Glauber; most contain original research material that is previously unpublished. The concepts of coherence, cooperativity and fluctuations in systems with many degrees of freedom are a common base for all of Professor Glauber's research initiatives and, in fact, for much of contemporary physics. His role in shaping these concepts is reflected and honoured in the papers contained in this book.
Recent developments in quantum theory have focused attention on fundamental questions, in particular on whether it might be necessary to modify quantum mechanics to reconcile it with general relativity. This book is based on a conference held in Oxford in the spring of 1984 to discuss quantum gravity. It brings together contributors who examine different aspects of the problem, including the experimental support for quantum mechanics, its strange and apparently paradoxical features, its underlying philosophy, and possible modifications to the theory.
The success of particle detection in high energy physics colliders critically depends on the criteria for selecting a small number of interactions from an overwhelming number that occur in the detector. It also depends on the selection of the exact data to be analyzed and the techniques of analysis. The introduction of automation into the detection process has traded the direct involvement of the physicist at each stage of selection and analysis for the efficient handling of vast amounts of data. This tradeoff, in combination with the organizational changes in laboratories of increasing size and complexity, has resulted in automated and semi-automated systems of detection. Various aspects of the semi-automated regime were greatly diminished in more generic automated systems, but turned out to be essential to a number of surprising discoveries of anomalous processes that led to theoretical breakthroughs, notably the establishment of the Standard Model of particle physics. The automated systems are much more efficient in confirming specific hypotheses in narrow energy domains than in performing broad exploratory searches. Thus, in the main, detection processes relying excessively on automation are more likely to miss potential anomalies and impede potential theoretical advances. I suggest that putting substantially more effort into the study of electron–positron colliders and increasing their funding could minimize the likelihood of missing potential anomalies, because detection in such an environment can be handled by the semi-automated regime—unlike detection in hadron colliders. Despite virtually unavoidable excessive reliance on automated detection in hadron colliders, their development has been deemed a priority because they can operate at the highest currently achievable energy levels.
I suggest, however, that a focus on collisions at the highest achievable energy levels diverts funds from searches for potential anomalies overlooked due to tradeoffs at the previous energy thresholds. I also note that even in the same collision environment, different research strategies will opt for different tradeoffs and thus achieve different experimental outcomes. Finally, I briefly discuss current searches for anomalous processes in the context of the preceding analysis.
The rising interest, in the late 20th century, in the foundations of quantum physics, a subject in which Franco Selleri has excelled, raises a fair question: how did this come about? The usual answer is that experimental advances allowed physicists to bring into the laboratory what had previously been gedanken experiments, beginning with those about EPR and related to Bell’s inequalities. I want to explore an alternative view, by which there would have been, before the experimental tests of Bell’s inequalities, a change in the views shared by physicists concerning the intellectual status of that issue. I will take three cases which will serve as the threads of our story: the connections between Bohm’s causal interpretation and Bell’s inequalities; Wigner’s ideas on the measurement problem; and finally Everett’s relative states formulation. In the end, I will discuss how those threads were gathered together in the creation of the foundations of quantum physics as a field of research.
The problem of how mathematics and physics are related at a foundational level is of interest. The approach taken here is to work towards a coherent theory of physics and mathematics together by examining the theory-experiment connection. The role of an implied theory hierarchy and the use of computers in comparing theory and experiment is described. The main idea of the paper is to tighten the theory-experiment connection by bringing physical theories, as mathematical structures over C, the complex numbers, closer to what is actually done in experimental measurements and computations. The method replaces C by C_n, the set of pairs (R_n, I_n) of n-figure rational numbers in some basis. The properties of these numbers are based on those of numerical measurement outcomes for continuous variables. A model of space and time based on R_n is discussed. The model is scale invariant, with regions of constant step size interrupted by exponential jumps. A method of taking the limit n→∞ to obtain locally flat continuum-based space and time is outlined. R_n-based space is also invariant under scale transformations, which correspond to expansion and contraction of space relative to a flat background. The location of the origin, which is a space and time singularity, does not change under these transformations. Some properties of quantum mechanics based on C_n and on R_n space are briefly investigated.
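The replacement of C by C_n can be illustrated concretely. Below is a minimal, hedged sketch in Python, assuming "n-figure rational numbers in some basis" means rationals rounded to n significant figures in base 10; the function names (`to_n_figures`, `to_Cn`) are my own illustration, not notation from the paper:

```python
import math
from fractions import Fraction

def to_n_figures(x: float, n: int) -> Fraction:
    """Round x to n significant figures in base 10, returning an n-figure rational.

    This models an element of R_n (or I_n): a number carrying only n figures,
    as in a numerical measurement outcome.
    """
    if x == 0:
        return Fraction(0)
    exp = math.floor(math.log10(abs(x)))      # exponent of the leading figure
    scale = Fraction(10) ** (n - 1 - exp)     # exact, even for negative powers
    return round(x * scale) / scale           # int / Fraction -> Fraction

def to_Cn(z: complex, n: int) -> tuple:
    """An element of C_n as a pair (R_n, I_n) of n-figure rationals."""
    return (to_n_figures(z.real, n), to_n_figures(z.imag, n))
```

For example, `to_Cn(1.5 + 2.3j, 2)` yields the pair `(3/2, 23/10)`. Arithmetic on such pairs, unlike arithmetic on C, is sensitive to the finite number of figures, which is the point of the paper's tightened theory-experiment connection.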
A survey is given of the elegant physics of N-particle systems, both classical and quantal, non-relativistic (NR) and relativistic, non-gravitational (SR) and gravitational (GR). Chapter 1 deals exclusively with NR systems; the correspondence between classical and quantal systems is highlighted and summarized in two tables of Sec. 1.3. Chapter 2 generalizes Chapter 1 to the relativistic regime, including Maxwell’s theory of electromagnetism. Chapter 3 follows Einstein in allowing gravity to curve the spacetime arena; its Sec. 3.2 is devoted to the yet-missing theory of elementary particles, which should determine their properties and interactions. If completed, it would replace QFT; the ‘metron’ approach is promising.
As an approach to a Theory of Everything, a framework for developing a coherent theory of mathematics and physics together is described. The main characteristic of such a theory is discussed: the theory must be valid and sufficiently strong, and it must maximally describe its own validity and sufficient strength. The mathematical-logical definition of validity is used, and sufficient strength is seen to be a necessary and useful concept. The requirement of maximal description of its own validity and sufficient strength may be useful to reject candidate coherent theories for which the description is less than maximal. Other aspects of a coherent theory discussed include universal applicability, the relation to the anthropic principle, and possible uniqueness. It is suggested that the basic properties of the physical and mathematical universes are entwined with and emerge with a coherent theory. Support for this includes the indirect reality status of properties of very small, or very large and far away, systems compared to moderate-sized nearby systems. Discussion of the necessary physical nature of language includes physical models of language and an argument that the meaning content of expressions of any axiomatizable theory is independent of the algorithmic complexity of the theory. Gödel maps seem to be less useful for a coherent theory than for purely mathematical theories because all symbols and words of any language must have representations as states of physical systems already in the domain of a coherent theory.
In a period of over 50 years, Peter Mittelstaedt has made substantial and lasting contributions to several fields in theoretical physics as well as the foundations and philosophy of physics. Here we present an overview of his achievements in physics and its foundations which may serve as a guide to the bibliography (printed in this Festschrift) of his publications. An appraisal of Peter Mittelstaedt’s work in the philosophy of physics is given in a separate contribution by B. Falkenburg.
The point of departure for this article is Werner Heisenberg’s remark, made in 1929: “It is not surprising that our language [or conceptuality] should be incapable of describing processes occurring within atoms, for … it was invented to describe the experiences of daily life, and these consist only of processes involving exceedingly large numbers of atoms. … Fortunately, mathematics is not subject to this limitation, and it has been possible to invent a mathematical scheme—the quantum theory [quantum mechanics]—which seems entirely adequate for the treatment of atomic processes.” The cost of this discovery, at least in Heisenberg’s and related interpretations of quantum mechanics (such as that of Niels Bohr), is that, in contrast to classical mechanics, the mathematical scheme in question no longer offers a description, even an idealized one, of quantum objects and processes. This scheme only enables predictions, in general probabilistic in character, of the outcomes of quantum experiments. As a result, a new type of relationship between mathematics and physics is established, which, in the language of Eugene Wigner adopted in my title, indeed makes the effectiveness of mathematics unreasonable in quantum but, as I shall explain, not in classical physics. The article discusses these new relationships between mathematics and physics in quantum theory and their implications for theoretical physics—past, present, and future.
Constantin Caratheodory offered the first systematic and contradiction-free formulation of thermodynamics, on the basis of his mathematical work on Pfaff forms. Moreover, his work on measure theory provided the basis for later, improved formulations of thermodynamics and the physics of continua, where extensive variables are measures and intensive variables are densities. Caratheodory was the first to see that measure theory, and not topology, is the natural tool for understanding the difficulties (ergodicity, approach to equilibrium, irreversibility) in the foundations of statistical physics. He gave a measure-theoretic proof of Poincaré's recurrence theorem in 1919. This work paved the way for Birkhoff to later identify ergodicity as metric transitivity, and for Koopman and von Neumann to introduce the spectral analysis of dynamical systems in Hilbert spaces. Mixing provided an explanation of the approach to equilibrium, but not of irreversibility. The recent extension of the spectral theory of dynamical systems to locally convex spaces, achieved by the Brussels–Austin groups, gives new nontrivial time-asymmetric spectral decompositions for unstable and/or non-integrable systems. In this way irreversibility is resolved in a natural way.
Physics and chemistry underlie the nature of all the world around us, including human brains. Consequently some suggest that in causal terms, physics is all there is. However, we live in an environment dominated by objects embodying the outcomes of intentional design (buildings, computers, teaspoons). The present-day subject of physics has nothing to say about the intentionality resulting in the existence of such objects, even though this intentionality is clearly causally effective. This paper examines the claim that the underlying physics uniquely causally determines what happens, even though we cannot predict the outcome. It suggests that what occurs is the contextual emergence of complexity: the higher levels in the hierarchy of complexity have autonomous causal powers, functionally independent of lower-level processes. This is possible because top-down causation takes place as well as bottom-up action, with higher-level contexts determining the outcome of lower-level functioning and even modifying the nature of lower-level constituents. Stored information plays a key role, resulting in non-linear dynamics that is non-local in space and time. Brain functioning is causally affected by abstractions such as the value of money and the theory of the laser. These are realised as brain states in individuals, but are not equivalent to them. Consequently physics per se cannot causally determine the outcome of human creativity; rather, it creates the possibility space allowing human intelligence to function autonomously. The challenge to physics is to develop a realistic description of causality in truly complex hierarchical structures, with top-down causation and memory effects allowing autonomous higher levels of order to emerge with genuine causal powers.
The Cartan equations defining simple spinors (renamed “pure” by C. Chevalley) are interpreted as equations of motion in compact momentum spaces, in a constructive approach in which at each step the dimensions of spinor space are doubled while those of momentum space increase by two. The construction is possible only in the frame of the geometry of simple or pure spinors, which imposes constraint equations on spinors with more than four components; the momentum spaces then turn out to be compact, isomorphic to invariant-mass spheres imbedded in each other, since the signatures remain Lorentzian throughout, starting from dimension four (Minkowski) up to dimension ten with Clifford algebra ℂℓ(1, 9), where the construction naturally ends. The equations of motion met in the construction are most of those traditionally postulated ad hoc: from Weyl equations for neutrinos (and Maxwell's) to Majorana ones, to those for the electroweak model and for the nucleons interacting with the pseudoscalar pion, up to those for the 3 baryon-lepton families, steadily progressing from the description of lower-energy phenomena to that of higher ones. The 3 division algebras: complex numbers, quaternions and octonions appear to be strictly correlated with Clifford algebras and then with this spinor-geometrical approach, from which they appear to gradually emerge in the construction, where they play a basic role for the physical interpretation: at the third step complex numbers generate U(1), possible origin of the electric charge and of the existence of charged-neutral fermion pairs, also easily explaining the opposite charges of the proton and electron. Another U(1) appears to generate the strong charge at the fourth step.
Quaternions generate the signature of space-time at the first step, and the SU(2) internal symmetry of isospin and, in the gauge term, the SU(2)_L one of the electroweak model at the third step; they are also at the origin of the 3 families, equal in number to the quaternion imaginary units. At the fifth and last step octonions generate the SU(3) internal symmetry of flavour, with SU(2) isospin subgroup and, in the gauge term, that of color, correlated with the SU(2)_L of the electroweak model. These 3 division algebras seem then to be at the origin of charges, families and the groups of the Standard Model. In this approach there seems to be no need of a higher-dimensional (>4) space-time, here generated by the four Poincaré translations, and dimensional reduction from ℂℓ(1,9) to ℂℓ(1,3) is equivalent to decoupling of the equations of motion. This spinor-geometrical approach is compatible with that based on strings, since these may be expressed bilinearly (as integrals) in terms of the Majorana–Weyl simple or pure spinors which are admitted by ℂℓ(1, 9) = R(32).
A case is made for the project, initiated and promoted by David Deutsch, of excising confusion and obfuscation from contemporary quantum theory. It is argued that at least some theoretical entities which are conventionally labelled as “interpretations” of quantum mechanics are in fact full-blooded physical theories in their own right, and as such are falsifiable, at least in principle. The most pertinent case is that of the so-called “Many-Worlds Interpretation” (MWI) of Everett and others. This set of ideas differs from other “interpretations” in that it does not accept the reality of the collapse of Schrödinger’s wavefunction. A survey is made of several important proposals, appearing from time to time in the literature, for discriminating between quantum theories with and without wavefunction collapse, and the possibilities are discussed in the framework of a wider taxonomy.
A modest proposal concerning laws, counterfactuals, and explanations -- Why be Humean? -- Suggestions from physics for deep metaphysics -- On the passing of time -- Causation, counterfactuals, and the third factor -- The whole ball of wax -- Epilogue: a remark on the method of metaphysics.
In their recent book Every Thing Must Go Ladyman and Ross (Ladyman et al. 2007) claim: (1) Physics is analytically complete since it is the only science that cannot be left incomplete (cf. Ladyman et al. 2007, 283). (2) There might not be an ontologically fundamental level (cf. Ladyman et al. 2007, 178). (3) We should not admit anything into our ontology unless it has explanatory and predictive utility (cf. Ladyman et al. 2007, 179). In this discussion note I aim to show that the ontological commitment in (3) implies that the completeness of no science can be achieved where no fundamental level exists. Therefore, if claim (1) requires a science to actually be complete in order to be considered physics, and if Ladyman and Ross’s “tentative metaphysical hypothesis […] that there is no fundamental level” (178) is true, as (2) allows, then there simply is no physics. Ladyman and Ross can, however, avoid this unwanted result if they merely require physics to ever strive for completeness rather than to already be complete.
I provide a comprehensive metaphysics of causation based on the idea that fundamentally things are governed by the laws of physics, and that derivatively difference-making can be assessed in terms of what fundamental laws of physics imply for hypothesized events. Highlights include a general philosophical methodology, the fundamental/derivative distinction, and my mature account of causal asymmetry.
According to an increasing number of authors, the best, if not the only, argument in favour of physicalism is the so-called 'overdetermination argument'. This argument, if sound, establishes that all the entities that enter into causal interactions with the physical world are physical. One key premise in the overdetermination argument is the principle of the causal closure of the physical world, said to be supported by contemporary physics. In this paper, I examine various ways in which physics may support the principle, either as a methodological guide or as depending on some other laws and principles of physics.
Depending on different positions in the debate on scientific realism, there are various accounts of the phenomena of physics. For scientific realists like Bogen and Woodward, phenomena are matters of fact in nature, i.e., the effects explained and predicted by physical theories. For empiricists like van Fraassen, the phenomena of physics are the appearances observed or perceived by sensory experience. Constructivists, however, regard the phenomena of physics as artificial structures generated by experimental and mathematical methods. My paper investigates the historical background of these different meanings of phenomenon in the traditions of physics and philosophy. In particular, I discuss Newton’s account of the phenomena and Bohr’s view of quantum phenomena, their relation to the philosophical discussion, and to data and evidence in current particle physics and quantum optics.
Using the notorious bridge law “water is H2O” and the relation between molecular structure and quantum mechanics as examples, I argue that it doesn’t make sense to aim for specific definition(s) of intertheoretical or interdiscourse relation(s) between chemistry and physics (reduction, supervenience, what have you). Proposed definitions of interdiscourse and part-whole relations are interesting only if they provide insight into the variegated interconnected patchwork of theories and beliefs. There is “automatically” some sort of interdiscourse relation if different discourses claim to have something to say about the same situation (event, system), which is the basis of (contingent) local supervenience relations, which, provided proper empirical support, can be upgraded to ceteris paribus bridge laws. Because of the ceteris paribus feature, and the discourse dependence of event identification, there is at best only global supervenience of the “special sciences” on the physical (and of parts of physics on other parts of physics).
We briefly describe in this paper the passage from Mendeleev’s chemistry (1869) to atomic physics (in the 1900s), nuclear physics (in 1932) and particle physics (from 1953 to 2006). We show how the consideration of symmetries, widely used in physics since the end of the 1920s, gave rise to a new format of the periodic table in the 1970s. More specifically, this paper is concerned with the application of the group SO(4,2)⊗SU(2) to the periodic table of chemical elements. It is shown how the Madelung rule of the atomic shell model can be used to set up a periodic table that can be further rationalized via the group SO(4,2)⊗SU(2) and some of its subgroups. Qualitative results are obtained from this nonstandard table.
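The Madelung rule referred to here orders atomic subshells by increasing n+l, with ties broken by increasing n. As a short illustrative sketch (the function name `madelung_order` and the cutoff `n_max` are my own, not from the paper), the filling order can be generated in a few lines of Python:

```python
def madelung_order(n_max: int):
    """Subshells (n, l), 1 <= n <= n_max, 0 <= l < n, sorted by the
    Madelung rule: increasing n + l, ties broken by increasing n."""
    shells = [(n, l) for n in range(1, n_max + 1) for l in range(n)]
    return sorted(shells, key=lambda nl: (nl[0] + nl[1], nl[0]))

# Spectroscopic letters for the first few values of l (s, p, d, f, ...).
LETTERS = "spdfgh"
order = [f"{n}{LETTERS[l]}" for n, l in madelung_order(4)]
# The familiar filling order emerges: 1s, 2s, 2p, 3s, 3p, 4s, 3d, 4p, ...
```

It is this (n+l, n) ordering that the paper rationalizes group-theoretically via SO(4,2)⊗SU(2) and its subgroups.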
In this sequence of philosophical essays about natural science, the author argues that fundamental explanatory laws, the deepest and most admired successes of modern physics, do not in fact describe regularities that exist in nature. Cartwright draws from many real-life examples to propound a novel distinction: that theoretical entities, and the complex and localized laws that describe them, can be interpreted realistically, but the simple unifying laws of basic theory cannot.
Final draft (September 2013, going to production) of the book on the notion of fundamental length that I have been writing for the last couple of years, covering issues in the philosophy of mathematics, metaphysics, and the history and philosophy of modern physics, from classical electrodynamics to current theories of quantum gravity. To be published in 2014 by Cambridge University Press.
According to structural realism, in mature science there is structural continuity across theoretical change. A major counterexample to this thesis is the transition from the Eightfold Way to the Standard Model in particle physics. Nevertheless, the notion of structure is of central importance in comprehending the theoretical picture of particle physics, where particles change and undergo transmutations, while the only thing which remains unchanged is the basic structure, i.e. the symmetry group which controls the transmutations. This view agrees with the paradigmatic case where the structure is an internal symmetry and the instantiations are the elementary particles. The metaphysical view which reflects this situation is a version of ontic structuralism.
This paper delves into McTaggart’s metaphysical account of reality without time, and compares and contrasts McTaggart’s account with the account of reality given by modern physics. This comparison is of interest, because there are suggestions from contemporary physics that there is no time at the fundamental level. Physicists and philosophers of physics recognize that we do not have a good understanding of how the world could be such that time is unreal. I argue that, from the perspective of one who is trying to understand modern physics, McTaggart’s metaphysical views do provide some insight into how reality can be timeless at the fundamental level, but the insight that they provide is limited.
Niels Bohr, founding father of modern atomic physics and quantum theory, was as original a philosopher as he was a physicist. This study explores several dimensions of Bohr's vision: the formulation of quantum theory and the problems associated with its interpretation, the notions of complementarity and correspondence, the debates with Einstein about objectivity and realism, and his sense of the infinite harmony of nature. Honner focuses on Bohr's epistemological lesson, the conviction that all our description of nature is dependent on the words we use and the ways we can unambiguously use them.
Although the stated purpose of Physics viii 8 is to prove that only circular locomotion is infinitely continuous, it is generally recognized that a major sub-theme of the chapter has to do with the unity of change and centers on Zeno’s dichotomy paradox. According to one influential account of this sub-theme, Aristotle returns to the dichotomy paradox in Physics viii 8, primarily to engage in a defensive maneuver. In Physics vi, while focused on the infinite divisibility of change instead of its identity conditions, Aristotle left open the possibility that occurrences that are ‘one change’ could have infinitely many parts that are also ‘one change’.1 By Physics viii 8, however, Zeno has brought Aristotle to realize that if this possibility is admitted, then what one chooses to call ‘one change’ is to a large extent arbitrary. But this Aristotle cannot countenance, because his entire theory of change is built upon the concept of a change as a thing uniquely definable as the passage from a particular state to a particular state.
In Physics viii 8, then, Aristotle seeks to avoid this result by ‘refining’ the definition of ‘one change’ so that ‘one change’ can no longer have parts that are also ‘one change’ and by invoking the metaphysical machinery of the act-potency distinction to give a positive characterization of the difference between change parts and change wholes.2 According to Michael White, Aristotle ‘refines’ his definition of ‘one change’ in Physics viii 8 by strengthening the criteria of Physics v 4; criteria which, as White correctly points out, do nothing to prevent this result on their own.3 According to White, this ‘refinement’ consists in adding, to the criteria of Physics v 4 (i.e., the criteria that ‘one change’ must be in a continuous time, have a single subject throughout, and proceed throughout from a terminus of the same species to a contrary terminus of the same species), the additional condition that an occurrence that is ‘one change’ must be bracketed by periods of rest and contain no periods of rest.
Highlighting main issues and controversies, this book brings together current philosophical discussions of symmetry in physics to provide an introduction to the subject for physicists and philosophers. The contributors cover all the fundamental symmetries of modern physics, such as CPT and permutation symmetry, as well as discussing symmetry-breaking and general interpretational issues. Classic texts are followed by new review articles and shorter commentaries for each topic. Suitable for courses on the foundations of physics, philosophy of physics and philosophy of science, the volume is a valuable reference for students and researchers.
In recent years, the ontological similarities between the foundations of quantum mechanics and the emptiness teachings in Madhyamika–Prasangika Buddhism of the Tibetan lineage have attracted some attention. After briefly reviewing this unlikely connection, I examine ideas encountered in condensed-matter physics that resonate with this view on emptiness. Focusing on the particle concept and emergence in condensed-matter physics, I highlight a qualitative correspondence to the major analytical approaches to emptiness.
The ambition of this volume is twofold: to provide a comprehensive overview of the field and to serve as an indispensable reference work for anyone who wants to work in it. For example, any philosopher who hopes to make a contribution to the topic of the classical-quantum correspondence will have to begin by consulting Klaas Landsman’s chapter. The organization of this volume, as well as the choice of topics, is based on the conviction that the important problems in the philosophy of physics arise from studying the foundations of the fundamental theories of physics. It follows that there is no sharp line to be drawn between philosophy of physics and physics itself. Some of the best work in the philosophy of physics is being done by physicists, as witnessed by the fact that several of the contributors to the volume are theoretical physicists: viz., Ellis, Emch, Harvey, Landsman, Rovelli, ‘t Hooft, the last of whom is a Nobel laureate. Key features - Definitive discussions of the philosophical implications of modern physics - Masterly expositions of the fundamental theories of modern physics - Covers all three main pillars of modern physics: relativity theory, quantum theory, and thermal physics - Covers the new sciences that have grown from these theories: for example, cosmology from relativity theory; and quantum information and quantum computing, from quantum theory - Contains special chapters that address crucial topics that arise in several different theories, such as symmetry and determinism - Written by very distinguished theoretical physicists, including a Nobel Laureate, as well as by philosophers.
It is shown that if quantum physics is interpreted according to the philosophy of monistic idealism--that consciousness is the ground of all being--then some of the important dualisms of philosophy can be integrated.
This paper is the first of a two-part reexamination of causation in Descartes's physics. Some scholars, including Gary Hatfield and Daniel Garber, take Descartes to be a 'partial' Occasionalist, who thinks that God alone is the cause of all natural motion. Contra this interpretation, I agree with the literature that links Descartes to the Thomistic theory of divine concurrence. This paper surveys this literature, and argues that it has failed to provide an interpretation of Descartes's view that both distinguishes his position from that of his later, Occasionalist followers and is consistent with his broader metaphysical commitments. I provide an analysis that tries to address these problems with earlier 'Concurrentist' readings of Descartes. On my analysis, Occasionalism entails that created substances do not have intrinsic active causal powers. As I read him, Descartes thinks that bodies have active causal powers that are partly grounded in their intrinsic natures. But I argue, pace a recent account by Tad Schmaltz, that Descartes also thinks that God immediately causes all motion in the created world. On the picture that emerges, Descartes's position is both continuous with, and a subtle departure from, the Thomistic theory of divine concurrence.
In this paper I will argue that if physics is to become a coherent metaphysics of nature it needs an “interpretation”. As I understand it, an interpretation of a physical theory amounts to offering (1) a precise formulation of its ontological claims and (2) a clear account of how such claims are related to the world of our experience. Notably, metaphysics enters importantly in both tasks: in (1), because interpreting our best physical theories requires going beyond a merely instrumentalist view of science and therefore using our best metaphysical theories; in (2), because a philosophical elaboration of the theories of the world that are implicit in our experience is one of the tasks of analytic metaphysics, and bridging possible explanatory gaps or even conflicts between the physical image and the manifest image of the world is a typical philosophical task that involves science and metaphysics.