What are the laws of physics? -- The stuff that kicks back -- Point-of-view invariance -- Gauging the laws of physics -- Forces and broken symmetries -- Playing dice -- After the bang -- Out of the void -- The comprehensible cosmos -- Models of reality.
In this sequence of philosophical essays about natural science, the author argues that fundamental explanatory laws, the deepest and most admired successes of modern physics, do not in fact describe regularities that exist in nature. Cartwright draws from many real-life examples to propound a novel distinction: that theoretical entities, and the complex and localized laws that describe them, can be interpreted realistically, but the simple unifying laws of basic theory cannot.
INTERNATIONAL STUDIES IN THE PHILOSOPHY OF SCIENCE, Vol. 5, No. 1, Autumn 1991, pp. 79-87. R.M. Nugayev, "The fundamental laws of physics can tell the truth." Abstract: Nancy Cartwright’s arguments in favour of phenomenological laws and against fundamental ones are discussed. Her criticisms of the standard covering-law account are extended using Vyacheslav Stepin’s analysis of the structure of fundamental theories. It is argued that Cartwright’s thesis (that the laws of physics lie) is too radical to accept. A model of theory change is proposed which demonstrates how the fundamental laws of physics can, in fact, be confronted with experience.
Does a world that contains chemistry entail the validity of both the standard model of elementary particle physics and general relativity, at least as effective theories? This article shows that the answer may very well be affirmative. It further suggests that the very existence of stable, spatially extended material objects, if not the very existence of the physical world, may require the validity of these theories.
It has been argued that the fundamental laws of physics are deceitful in that they give the impression of greater unity and coherence in our theories than is actually found to be the case. Causal stories and phenomenological relationships are claimed to provide a more acceptable account of the world, and only theoretical entities — not laws — are considered as perhaps corresponding to real features of the world. This paper examines these claims in the light of the author's own field of research: high energy physics. Some of the distinctions upon which the above conclusions are based are found not to be tenable in practice. Examples from experimental particle physics are presented which suggest an important role of the underlying theoretical structure which cannot be overlooked. It is argued that the fundamental theories must, in fact, be treated as being as worthy or unworthy of ontological commitment as the entities they postulate or the phenomenological relationships they inspire. Whilst it is conceded that aspects of the current theoretical formalism belie literal interpretation, it is maintained that revision in these particular areas need not affect the symmetry principles, particle spectra, or coupling strengths that largely determine the empirical content of the theory.
Are the laws of nature real? Do they belong to the world or merely reflect the way we speak about it? And if they are real, what sort of entity are they? These questions have been intensely debated by philosophers. Modern cosmology, however, has given such questions a new twist by introducing a unique perspective on physical reality, the perspective which I shall call the cosmological point of view. In this perspective, the universe as a whole presents itself as a single individual entity that undergoes a radical change with time. Laws of physics, on the other hand, have both local and global significance. They characterize how things behave locally. But they also characterize the entire universe. This suggests an interesting connection between the universe as a whole and what laws of physics hold in this universe. From the cosmological point of view, these two totalities, the laws of physics and the universe, may be related. But how exactly? Are the laws “inscribed” in the fabric of the universe or do they in some sense “precede” it in the order of being? If the latter, what is a “medium,” over and above the physical universe, in which physical laws are “written”? If the former, are they but a consequence of the universe’s very existence? And if so, how could the laws of physics survive the dramatic change the physical state of the universe underwent in the course of time?
The most recent challenge to the covering-law model of explanation (N. Cartwright, How the Laws of Physics Lie) charges that the fundamental explanatory laws are not true. In fact, explanation and truth are alleged to pull in different directions. We hold that this gets its force from confusing issues about the truth of the laws in the explanation and the precision with which those laws can yield an exact description of the event to be explained. In defending this we look at Cartwright's major case studies and sketch an amended covering-law model of explanation.
After a brief survey of the literature on ceteris paribus clauses and ceteris paribus laws (1), the problem of exceptions, which creates the need for cp laws, is discussed (2). It emerges that the so-called skeptical view of laws of nature does not apply to laws of any kind whatever. Only some laws of physics are plagued with exceptions, not the laws (3). Cp clauses promise a remedy, which has to be located among the further reactions to the skeptical view (4). After inspecting various translations of the Latin term “ceteris paribus” (5), the paper arrives at the conclusion that, on the most reasonable translation, there are no such things as cp laws, for reasons of logical form. Cp clauses have an indexical content, so that they need singular propositions as their habitat, not general ones. Cp clauses and the universal generalizations they are supposed to modify are not fit for each other (6).
The status of fundamental laws is an important issue when deciding between the three broad ontological options of fundamentalism (of which the thesis that physics is complete is typically a sub-type), emergentism, and disorder or promiscuous realism. Cartwright's assault on fundamental laws, which argues that such laws do not, and cannot, typically state the facts, and hence cannot be used to support belief in a fundamental ontological order, is discussed in this context. A case is made in defence of a moderate form of fundamentalism, which leaves open the possibility of emergentism, but sets itself against the view that our best ontology is disordered. The argument, taking its cue from Bhaskar, relies on a consideration of the epistemic status of experiments, and the question of the possible generality of knowledge gained in unusual or controlled environments.
The belief that laws of nature are contingent played an important role in the emergence of the empirical method of modern physics. During the scientific revolution, this belief was based on the idea of voluntary creation. Taking up Peter Mittelstaedt’s work on laws of nature, this article explores several alternative answers which do not overtly make use of metaphysics: some laws are laws of mathematics; macroscopic laws can emerge from the interplay of numerous subsystems without any specific microscopic nomic structures (John Wheeler’s “law without law”); laws are the preconditions of scientific experience (Kant); laws are theoretical abstractions which only apply in very limited circumstances (Nancy Cartwright). Whereas Cartwright’s approach is in tension with modern scientific methodology, the first three strategies count as illuminating, though partial answers. It is important for the empirical method of modern physics that these three strategies, even when taken together, do not provide a complete explanation of the order of nature. Thus the question of why laws are valid is still relevant. In the concluding section, I argue that the traditional answer, based on voluntary creation, provides the right balance of contingency and coherence which is in harmony with modern scientific method.
For many decades, the proponents of `artificial intelligence' have maintained that computers will soon be able to do everything that a human can do. In his bestselling work of popular science, Sir Roger Penrose takes us on a fascinating roller-coaster ride through the basic principles of physics, cosmology, mathematics, and philosophy to show that human thinking can never be emulated by a machine.
It is shown that the method of operational definition of theoretical terms applied in physics may well support constructivist ideas in cognitive sciences when extended to observational terms. This leads to unexpected results for the notion of reality, for induction, and for the problem of why mathematics is so successful in physics. A theory of cognitive operators is proposed, which are implemented somewhere in our brain and which transform certain states of our sensory apparatus into what we call perceptions, in the same sense as measurement devices transform the interaction with the object into measurement results. Then, perceived regularities, as well as the laws of nature we would derive from them, can be seen as invariants of the cognitive operators concerned and are by this human-specific constructs rather than ontologically independent elements (e.g., the law of energy conservation can be derived from the homogeneity of time and by this depends on our mental time metric generator). So reality, in so far as it is represented by the laws of nature, no longer has an independent ontological status. This is opposed to Campbell's `natural selection epistemology'. From this it is shown that there holds an incompleteness theorem for physical laws similar to Gödel's incompleteness theorem for mathematical axioms, i.e., there is no definitive or objective `theory of everything'. This constructivist approach to cognition will allow a coherent and consistent model of both cognitive and organic evolution, whereas the classical view sees the two evolutions rather dichotomously (for example, most scientists see cognitive evolution converging towards a definitive world picture, whereas organic evolution obviously has no specific focus such as a `pride of creation').
We address the question of whether it is possible to operate a time machine by manipulating matter and energy so as to manufacture closed timelike curves. This question has received a great deal of attention in the physics literature, with attempts to prove no-go theorems based on classical general relativity and various hybrid theories serving as steps along the way towards quantum gravity. Despite the effort put into these no-go theorems, there is no widely accepted definition of a time machine. We explain the conundrum that must be faced in providing a satisfactory definition and propose a resolution. Roughly, we require that all extensions of the time machine region contain closed timelike curves; the actions of the time machine operator are then sufficiently “potent” to guarantee that closed timelike curves appear. We then review no-go theorems based on classical general relativity, semi-classical quantum gravity, quantum field theory on curved spacetime, and Euclidean quantum gravity. Our verdict on the question of our title is that no result of sufficient generality to underwrite a confident “yes” has been proven. Our review of the no-go results does, however, highlight several foundational problems at the intersection of general relativity and quantum physics that lend substance to the search for an answer.
Is it possible to take the enterprise of physics seriously while also holding the belief that the world contains an order beyond the reach of that physics? Is it possible to simultaneously believe in objective laws of nature and in miracles? Is it possible to search for the truths of physics while also acknowledging the limitations of that search as it is carried out by limited human knowers? As a philosopher, as a Christian, and as a participant in the physics of his day, Leibniz had an interesting view that bears on all of these questions. This paper examines the status of laws of nature in Leibniz's philosophy and how the status of these laws fits into his larger philosophical picture of the limits of human knowledge and the wise and omniscient God who created the actual world.
The paper argues that it is possible for an incompatibilist to accept John Martin Fischer’s plausible insistence that the question whether we are morally responsible agents ought not to depend on whether the laws of physics turn out to be deterministic or merely probabilistic. The incompatibilist should do so by rejecting the fundamentalism which entails that the question whether determinism is true is a question merely about the nature of the basic physical laws. It is argued that this is a better option for ensuring the irrelevance of physics than the embrace of semi-compatibilism, since there are reasons for supposing that alternate possibilities are necessary for moral responsibility, despite Fischer’s claims to the contrary. There are two distinct reasons for supposing that alternate possibilities might be necessary for moral responsibility—one of which is to do with fairness, the other to do with agency itself. It is suggested that if one focuses on the second of these reasons, Fischer’s arguments for supposing that alternate possibilities are unnecessary for moral responsibility can be met by the incompatibilist. Some possible reasons for denying that alternate possibilities are necessary for the existence of agency are then raised and rejected.
"The Emperor's New Mind" by Roger Penrose has received a great deal of both praise and criticism. This review discusses philosophical aspects of the book that form an attack on the "strong" AI thesis. Eight different versions of this thesis are distinguished, and sources of ambiguity diagnosed, including different requirements for relationships between program and behaviour. Excessively strong versions attacked by Penrose (and Searle) are not worth defending or attacking, whereas weaker versions remain problematic. Penrose (like Searle) regards the notion (...) of an algorithm as central to AI, whereas it is argued here that for the purpose of explaining mental capabilities the architecture of an intelligent system is more important than the concept of an algorithm, using the premise that what makes something intelligent is not what it does but how it does it. What needs to be explained is also unclear: Penrose thinks we all know what consciousness is and claims that the ability to judge Go "del's formula to be true depends on it. He also suggests that quantum phenomena underly consciousness. This is rebutted by arguing that our existing concept of "consciousness" is too vague and muddled to be of use in science. This and related concepts will gradually be replaced by a more powerful theory-based taxonomy of types of mental states and processes. The central argument offered by Penrose against the strong AI thesis depends on a tempting but unjustified interpretation of Goedel's incompleteness theorem. Some critics are shown to have missed the point of his argument. A stronger criticism is mounted, and the relevance of mathematical Platonism analysed. Architectural requirements for intelligence are discussed and differences between serial and parallel implementations analysed. (shrink)
It is shown that the following three common understandings of Newton’s laws of motion do not hold for systems of infinitely many components. First, Newton’s third law, or the law of action and reaction, is universally believed to imply that the total sum of internal forces in a system is always zero. Several examples are presented to show that this belief fails to hold for infinite systems. Second, two of these examples are of an infinitely divisible continuous body with finite mass and volume such that the sum of all the internal forces in the body is not zero and the body accelerates due to this non-null net internal force. So the two examples also demonstrate the breakdown of the common understanding that according to Newton’s laws a body under no external force does not accelerate. Finally, these examples also make it clear that the expression ‘impressed force’ in Newton’s formulations of his first and second laws should be understood not as ‘external force’ but as ‘exerted force’ which is the sum of all the internal and external forces acting on a given body, if the body is infinitely divisible.
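The failure of internal-force cancellation for infinite systems can be illustrated with a toy model (a sketch for illustration only, not one of the paper's own examples): a semi-infinite chain of particles 1, 2, 3, ... in which each particle pushes its right neighbour with unit force and receives the unit reaction in return. Every action-reaction pair balances, yet summing the net internal forces particle by particle gives -1, not 0.

```python
# Toy illustration (not from the paper): a semi-infinite chain of particles.
# Particle i pushes particle i+1 with unit force; the reaction acts back on i.
# Each action-reaction pair cancels, but the total of the net internal
# forces on the particles is nonzero in the infinite limit.

def net_force(j):
    """Net internal force on particle j of the infinite chain."""
    from_left = 1.0 if j > 1 else 0.0  # push received from particle j-1
    from_right = -1.0                  # reaction from particle j+1, always present
    return from_left + from_right

# Every partial sum of the net forces equals -1, so the infinite total is -1.
partial = sum(net_force(j) for j in range(1, 1001))
print(partial)  # -1.0
```

Only particle 1 has an uncancelled reaction force; in a finite chain the last particle would supply the balancing term, but an infinite chain has no last particle.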
This paper develops a means–end analysis of an inductive problem that arises in particle physics: how to infer from observed reactions conservation principles that govern all reactions among elementary particles. I show that there is a reliable inference procedure that is guaranteed to arrive at an empirically adequate set of conservation principles as more and more evidence is obtained. An interesting feature of reliable procedures for finding conservation principles is that in certain precisely defined circumstances they must introduce hidden particles. Among the reliable inductive methods there is a unique procedure that minimizes convergence time as well as the number of times that the method revises its conservation principles. Thus the aims of reliable, fast and steady convergence to an empirically adequate theory single out a unique optimal inference for a given set of observed reactions—including prescriptions for when exactly to introduce hidden particles.
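The inductive setup can be sketched in miniature; the particle inventory, reaction, and candidate assignments below are illustrative stand-ins, not the paper's actual procedure or data. Each observed reaction becomes an integer vector of particle-count changes, and a candidate quantum-number assignment is empirically adequate for the data exactly when it is orthogonal to every such vector.

```python
# Minimal sketch of conservation-principle testing (illustrative only).
# A reaction is encoded as products-minus-reactants particle counts;
# an assignment q of quantum numbers is conserved iff q . change == 0
# for every observed reaction.

PARTICLES = ["p", "n", "e-", "anti-nu_e"]  # toy particle inventory

def change_vector(reactants, products):
    """Products-minus-reactants particle counts for one reaction."""
    return [products.count(x) - reactants.count(x) for x in PARTICLES]

def conserves(q, reactions):
    """True iff quantum-number assignment q is conserved in every reaction."""
    return all(sum(qi * di for qi, di in zip(q, change_vector(r, p))) == 0
               for r, p in reactions)

# One observed reaction: neutron beta decay, n -> p + e- + anti-nu_e.
reactions = [(["n"], ["p", "e-", "anti-nu_e"])]

baryon_number = [1, 1, 0, 0]
lepton_number = [0, 0, 1, -1]
print(conserves(baryon_number, reactions))   # True
print(conserves(lepton_number, reactions))   # True
print(conserves([0, 1, 0, 0], reactions))    # False: "neutron number" is ruled out
```

As evidence accumulates, the set of assignments surviving this orthogonality test shrinks, which is the sense in which the procedure converges on an empirically adequate set of conservation principles.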
The beauty of electricity, or of any other force, is not that the power is mysterious and unexpected, touching every sense at unawares in turn, but that it is under law... Michael Faraday, Wheatstone's Electric Telegraph's Relation to Science (being an argument in favour of the full recognition of Science as a branch of Education), 1854.
Social constructionists believe that experimental evidence plays a minimal role in the production of scientific knowledge, while rationalists such as myself believe that experimental evidence is crucial in it. As one historical example in support of the rationalist position, I trace in some detail the theoretical and experimental research that led to our understanding of beta decay, from Enrico Fermi’s pioneering theory of 1934 to George Sudarshan and Robert Marshak’s and Richard Feynman and Murray Gell-Mann’s suggestion in 1957 and 1958, respectively, of the V–A theory of weak interactions. This is not a history of an unbroken string of successes, but one that includes incorrect experimental results, incorrect experiment-theory comparisons, and faulty theoretical analyses. Nevertheless, we shall see that the constraints that Nature imposed made the V–A theory an almost inevitable outcome of this theoretical and experimental research.
In the present paper we face the problem of estimating cell probabilities in the case of a two-dimensional contingency table from a predictive point of view. The solution is given by a double stochastic process. The first subprocess, the unobservable one, is supposed to be exchangeable and invariant. The second subprocess, the observable one, is assumed to be conditionally independent given the first.
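For orientation, the simplest predictive estimate of cell probabilities can be sketched as follows; this is a hedged stand-in using a uniform Dirichlet prior (Laplace smoothing), not the authors' exchangeable double-process model.

```python
# Hedged sketch (not the paper's model): predictive cell probabilities for a
# two-way contingency table as the posterior mean under a uniform Dirichlet
# prior, i.e. add-alpha smoothing of the observed counts.

def predictive_probs(table, alpha=1.0):
    """Posterior-mean cell probabilities for an r x c table of counts."""
    cells = sum(len(row) for row in table)       # number of cells, r * c
    total = sum(sum(row) for row in table)       # total observation count
    denom = total + alpha * cells
    return [[(n + alpha) / denom for n in row] for row in table]

table = [[3, 1], [0, 2]]           # toy 2x2 table of observed counts
probs = predictive_probs(table)
print(probs[0][0])                 # (3+1)/(6+4) = 0.4
```

Note that the empty cell receives a nonzero predictive probability, which is the basic motivation for predictive rather than maximum-likelihood estimation here.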
According to Kant’s Metaphysical Foundations of Natural Science, a proper science is organized according to rational principles and has a pure a priori rational part, its metaphysical foundation. In the second edition Preface to the first Critique, Kant claims that his account of time explains the a priori possibility of Newton’s laws of motion. I argue that Kant’s proof of the law of inertia fails, and that this casts doubt on Kant’s enterprise of providing a priori foundations for Newton’s physics.
Sciences are often regarded as providing the best, or, ideally, exact, knowledge of the world, especially in providing laws of nature. Ilya Prigogine, who was awarded the Nobel Prize for his theory of non-equilibrium chemical processes—this being also an important attempt to bridge the gap between exact and non-exact sciences [mentioned in the Presentation Speech by Professor Stig Claesson (nobelprize.org, The Nobel Prize in Chemistry 1977)]—has had this ideal in mind when trying to formulate a new kind of science. Philosophers of science distinguish theory and reality, examining relations between these two. Nancy Cartwright’s distinction of fundamental and phenomenological laws, Rein Vihalemm’s conception of the peculiarity of the exact sciences, and Ronald Giere’s account of models in science and science as a set of models are deployed in this article to criticise the common view of science and analyse Ilya Prigogine’s view in particular. We will conclude that on a more abstract, philosophical level, Prigogine’s understanding of science doesn’t differ from the common understanding.
Philosophy of physics is a small but thriving research field situated at the intersection between the natural sciences and the humanities. However, what exactly distinguishes philosophy of physics from physics is rarely made explicit in much depth. We provide a detailed analysis in the form of eleven theses, delineating both the nature of the questions asked in philosophy of physics and the methodology with which they are addressed.
We discuss the following problems, plaguing the present search for the “final theory”: (1) How to find a mathematical structure rich enough to be suitably approximated by the mathematical structures of general relativity and quantum mechanics? (2) How to reconcile nonlocal phenomena of quantum mechanics with time honored causality and reality postulates? (3) Does the collapse of the wave function contain some hints concerning the future quantum gravity theory? (4) It seems that the final theory cannot avoid the problem of dynamics, and consequently the problem of time. What kind of time, if this theory is supposed to be background free? (5) Will the dynamics of the “final theory” be probabilistic? Quantum probability exhibits some essential differences as compared with classical probability; are they but variations of some more general probabilistic measure theory? (6) Do we need a radically new interpretation of quantum mechanics, or rather an entirely new theory of which the present quantum mechanics is an approximation? (7) If the final theory is to be background free, it should provide a mechanism of space-time generation. Should we try to explain not only the generation of space-time, but also the generation of its material content? (8) As far as the existence of the initial singularity is concerned, one usually expects either “yes” or “no” answers from the final theory. However, if the mathematical structure of the future theory is supposed to be truly more general than the mathematical structures of the present general relativity and quantum mechanics, is a “third answer” possible? Could this third answer be related to the probabilistic character of the final theory? We discuss these questions in the framework of a working model unifying gravity and quanta. The analysis reveals unexpected aspects of these rather widely discussed issues.
The rising interest, in the late 20th century, in the foundations of quantum physics, a subject in which Franco Selleri has excelled, has suggested the fair question: how did it become so? The current answer says that experiments have made it possible to bring into the laboratories some previous gedanken experiments, beginning with those about EPR and related to Bell’s inequalities. I want to explore an alternative view, by which there would have been, before Bell’s inequalities were put to experimental test, a change in the views shared by physicists concerning the intellectual status of that issue. I will take three cases which will serve as the threads of our story: the connections between Bohm’s causal interpretation and Bell’s inequalities; Wigner’s ideas on the measurement problem; and finally Everett’s relative states formulation. In the end, I will discuss how those threads were gathered together by creating foundations of quantum physics as a field of research.
According to the great discovery by E. Noether in 1918, there exists an intrinsic connection between the mathematical symmetries of the laws of nature and the conservation laws. The two kinds of symmetries, namely the continuous and the discrete ones, are discussed. The physical background of these symmetries is illustrated. Finally, we sketch some topical conservation problems in elementary particle physics.
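A standard textbook instance of Noether's connection (an illustration, not drawn from the article itself): invariance of a Lagrangian under time translation yields conservation of the energy function,

```latex
% Time-translation symmetry implies energy conservation (textbook case)
\frac{\partial L}{\partial t} = 0
\;\Longrightarrow\;
\frac{d}{dt}\Bigl(\sum_i \dot{q}_i \,\frac{\partial L}{\partial \dot{q}_i} - L\Bigr) = 0 .
```

Spatial translation and rotation invariance similarly yield conservation of linear and angular momentum, the other classic continuous cases.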
This article attacks “open systems” arguments that, because constant conjunctions are not generally observed in the real world of open systems, we should be highly skeptical that universal laws exist. This work differs from other critiques of open system arguments against laws of nature by not focusing on laws themselves, but rather on the inference from open systems. We argue that open system arguments fail for two related reasons: (1) they cannot account for the “systems” central to their argument (nor the implied systems labeled “exogenous factors” in relation to the system of interest), and (2) they are nomocentric, fixated on laws while ignoring initial and antecedent conditions that are able to account for systems and exogenous factors within a fundamentalist framework.
The success of particle detection in high energy physics colliders critically depends on the criteria for selecting a small number of interactions from an overwhelming number that occur in the detector. It also depends on the selection of the exact data to be analyzed and the techniques of analysis. The introduction of automation into the detection process has traded the direct involvement of the physicist at each stage of selection and analysis for the efficient handling of vast amounts of data. This tradeoff, in combination with the organizational changes in laboratories of increasing size and complexity, has resulted in automated and semi-automated systems of detection. Various aspects of the semi-automated regime were greatly diminished in more generic automated systems, but turned out to be essential to a number of surprising discoveries of anomalous processes that led to theoretical breakthroughs, notably the establishment of the Standard Model of particle physics. The automated systems are much more efficient in confirming specific hypotheses in narrow energy domains than in performing broad exploratory searches. Thus, in the main, detection processes relying excessively on automation are more likely to miss potential anomalies and impede potential theoretical advances. I suggest that putting substantially more effort into the study of electron–positron colliders and increasing its funding could minimize the likelihood of missing potential anomalies, because detection in such an environment can be handled by the semi-automated regime—unlike detection in hadron colliders. Despite virtually unavoidable excessive reliance on automated detection in hadron colliders, their development has been deemed a priority because they can operate at currently highest energy levels.
I suggest, however, that a focus on collisions at the highest achievable energy levels diverts funds from searches for potential anomalies overlooked due to tradeoffs at the previous energy thresholds. I also note that even in the same collision environment, different research strategies will opt for different tradeoffs and thus achieve different experimental outcomes. Finally, I briefly discuss current searches for anomalous processes in the context of the previous analysis.
This paper is the first of a two-part reexamination of causation in Descartes's physics. Some scholars, including Gary Hatfield and Daniel Garber, take Descartes to be a `partial' Occasionalist, who thinks that God alone is the cause of all natural motion. Contra this interpretation, I agree with literature that links Descartes to the Thomistic theory of divine concurrence. This paper surveys this literature, and argues that it has failed to provide an interpretation of Descartes's view that both distinguishes his position from that of his later, Occasionalist followers and is consistent with his broader metaphysical commitments. I provide an analysis that tries to address these problems with earlier `Concurrentist' readings of Descartes. On my analysis, Occasionalism entails that created substances do not have intrinsic active causal powers. As I read him, Descartes thinks that bodies have active causal powers that are partly grounded in their intrinsic natures. But I argue, pace a recent account by Tad Schmaltz, that Descartes also thinks that God immediately causes all motion in the created world. On the picture that emerges, Descartes's position is both continuous with, and a subtle departure from, the Thomistic theory of divine concurrence.
This paper examines E. W. Beth's work in the philosophy of physics, both from a historical and a systematic point of view. Beth saw the philosophy of physics first of all as an opportunity to illustrate and promulgate a new and modern general approach to the philosophy of nature and to philosophy tout court: an approach characterized negatively by its rejection of all traditional metaphysics and positively by its firm orientation towards science. Beth was successful in defending this new ideology, and became its leading Dutch representative in the first two decades after the second world war. Beth also contributed importantly to the method of the philosophy of physics in a narrower sense, by proposing and promoting the semantic approach in the formal analysis of physical theories. Finally, he worked on several specific foundational questions; but he was probably too much of a logician to leave his mark in this area.
The paper takes issue with a widely accepted view of mental causation. This is the view that mental causation is either reducible to physical causation or ultimately untenable, because incompatible with the causal completeness of physics. The paper examines, first, why recent attempts to save the phenomena of mental causation by way of the notion of supervenient causation fail. The result of this examination is the claim that any attempted specification of the most basic causal factors which supposedly underlie a causal transaction cannot account for the counterfactually necessary connections with the effect in question. By contrast, the specification of these factors at a higher level would allow establishing such connections. The paper closes with a discussion of how this view of autonomous higher-level causation grounded on counterfactual relations can be made compatible with the physicalistic commitment to a complete specification of the particular causes of any physical effect exclusively in physical terms.
A case has been made for the project, initiated and promoted by David Deutsch, of excising confusion and obfuscation from contemporary quantum theory. It has been argued that at least some theoretical entities which are conventionally labelled as “interpretations” of quantum mechanics are in fact full-blooded physical theories in their own right, and as such are falsifiable, at least in principle. The most pertinent case is that of the so-called “Many-Worlds Interpretation” (MWI) of Everett and others. This set of ideas differs from other “interpretations” since it does not accept the reality of the collapse of Schrödinger’s wavefunction. A survey has been made of several important proposals, appearing from time to time in the literature, for discriminating between quantum theories with and without wavefunction collapse, and the possibilities are discussed in the framework of a wider taxonomy.
Purpose: To examine the role of reductionism in the theoretical development of modern physics -- more specifically, in the quest for a complete unification of physical theory -- from the perspective of radical constructivism (RC). Approach: Some central features of the impact of RC on philosophy of physics are pointed out: its position of scientific relativism, with important implications for the validation of scientific propositions; and the notion of sharing constructed knowledge among individual knowers and its consequences for science teaching. The issue of reductionism is then discussed with regard to (a) the hierarchical explanatory ordering of physical phenomena; (b) the idea of a "theory of everything" (TOE); and (c) some of its implications for the methodology and sociology of science. Findings: It is argued that the ontological status of the hierarchical structuring inherent in the sought-after TOE will depend on the individual knower's epistemic position concerning the notion of truth in science. In the relativist epistemology of RC, any true/false dichotomy of theories is without meaning. A hierarchical ordering is just one of many possible strategies that may be chosen for the construction of physical theories; and such a strategy may then be considered successful only to the extent that it yields a theory that is viable. Implications: The paper serves as an illustration of the impact of RC on the ongoing search in physics for a "final theory".