This volume contains invited and contributed papers delivered at a symposium on the occasion of Professor Glauber's 60th birthday. The papers, many of which are authored by world leaders in their fields, contain recent research work in quantum optics, statistical mechanics and high energy physics related to the pioneering work of Professor Roy Glauber; most contain original research material that is previously unpublished. The concepts of coherence, cooperativity and fluctuations in systems with many degrees of freedom are a common base for all of Professor Glauber's research initiatives and, in fact, for much of contemporary physics. His role in shaping these concepts is reflected and honoured in the papers contained in this book.
Recent developments in quantum theory have focused attention on fundamental questions, in particular on whether it might be necessary to modify quantum mechanics to reconcile it with general relativity. This book is based on a conference held in Oxford in the spring of 1984 to discuss quantum gravity. It brings together contributors who examine different aspects of the problem, including the experimental support for quantum mechanics, its strange and apparently paradoxical features, its underlying philosophy, and possible modifications to the theory.
In this sequence of philosophical essays about natural science, the author argues that fundamental explanatory laws, the deepest and most admired successes of modern physics, do not in fact describe regularities that exist in nature. Cartwright draws from many real-life examples to propound a novel distinction: that theoretical entities, and the complex and localized laws that describe them, can be interpreted realistically, but the simple unifying laws of basic theory cannot.
A modest proposal concerning laws, counterfactuals, and explanations -- Why be Humean? -- Suggestions from physics for deep metaphysics -- On the passing of time -- Causation, counterfactuals, and the third factor -- The whole ball of wax -- Epilogue: a remark on the method of metaphysics.
Steven French and Decio Krause examine the metaphysical foundations of quantum physics. They draw together historical, logical, and philosophical perspectives on the fundamental nature of quantum particles and offer new insights on a range of important issues. Focusing on the concepts of identity and individuality, the authors explore two alternative metaphysical views; according to one, quantum particles are no different from books, tables, and people in this respect; according to the other, they most certainly are. Each view comes with certain costs attached, and after describing their origins in the history of quantum theory, the authors carefully consider whether these costs are worth bearing. Recent contributions to these discussions are analyzed in detail and the authors present their own original perspective on the issues. The final chapter suggests how this perspective can be taken forward in the context of quantum field theory.
This report reviews what quantum physics and information theory have to tell us about the age-old question, How come existence? No escape is evident from four conclusions: (1) The world cannot be a giant machine, ruled by any preestablished continuum physical law. (2) There is no such thing at the microscopic level as space or time or spacetime continuum. (3) The familiar probability function or functional, and wave equation or functional wave equation, of standard quantum theory provide mere continuum idealizations and by reason of this circumstance conceal the information-theoretic source from which they derive. (4) No element in the description of physics shows itself as closer to primordial than the elementary quantum phenomenon, that is, the elementary device-intermediated act of posing a yes-no physical question and eliciting an answer or, in brief, the elementary act of observer-participancy. Otherwise stated, every physical quantity, every it, derives its ultimate significance from bits, binary yes-or-no indications, a conclusion which we epitomize in the phrase, it from bit.
The rising interest, in the late 20th century, in the foundations of quantum physics, a subject in which Franco Selleri has excelled, has suggested the fair question: how did it become so? The current answer says that experiments made it possible to bring into the laboratory some previous gedanken experiments, beginning with those about EPR and related to Bell's inequalities. I want to explore an alternative view, by which there would have been, before the experimental tests of Bell's inequalities, a change in the views shared by physicists concerning the intellectual status of that issue. I will take three cases which will serve as the threads of our story: the connections between Bohm's causal interpretation and Bell's inequalities; Wigner's ideas on the measurement problem; and finally Everett's relative states formulation. In the end, I will discuss how those threads were gathered together in creating the foundations of quantum physics as a field of research.
A conventional wisdom about the progress of physics holds that successive theories wholly encompass the domains of their predecessors through a process that is often called reduction. While certain influential accounts of inter-theory reduction in physics take reduction to require a single "global" derivation of one theory's laws from those of another, I show that global reductions are not available in all cases where the conventional wisdom requires reduction to hold. However, I argue that a weaker "local" form of reduction, which defines reduction between theories in terms of a more fundamental notion of reduction between models of a single fixed system, is available in such cases and moreover suffices to uphold the conventional wisdom. To illustrate the sort of fixed-system, inter-model reduction that grounds inter-theoretic reduction on this picture, I specialize to a particular class of cases in which both models are dynamical systems. I show that reduction in these cases is underwritten by a mathematical relationship that follows the broad prescriptions of Nagel/Schaffner reduction, and support this claim with several examples. Moreover, I show that this broadly Nagelian analysis of inter-model reduction encompasses several cases that are sometimes cited as instances of the "physicist's" limit-based notion of reduction.
The success of particle detection in high energy physics colliders critically depends on the criteria for selecting a small number of interactions from an overwhelming number that occur in the detector. It also depends on the selection of the exact data to be analyzed and the techniques of analysis. The introduction of automation into the detection process has traded the direct involvement of the physicist at each stage of selection and analysis for the efficient handling of vast amounts of data. This tradeoff, in combination with the organizational changes in laboratories of increasing size and complexity, has resulted in automated and semi-automated systems of detection. Various aspects of the semi-automated regime were greatly diminished in more generic automated systems, but turned out to be essential to a number of surprising discoveries of anomalous processes that led to theoretical breakthroughs, notably the establishment of the Standard Model of particle physics. The automated systems are much more efficient in confirming specific hypotheses in narrow energy domains than in performing broad exploratory searches. Thus, in the main, detection processes relying excessively on automation are more likely to miss potential anomalies and impede potential theoretical advances. I suggest that putting substantially more effort into the study of electron–positron colliders and increasing their funding could minimize the likelihood of missing potential anomalies, because detection in such an environment can be handled by the semi-automated regime—unlike detection in hadron colliders. Despite the virtually unavoidable excessive reliance on automated detection in hadron colliders, their development has been deemed a priority because they can operate at the highest currently achievable energy levels.
I suggest, however, that a focus on collisions at the highest achievable energy levels diverts funds from searches for potential anomalies overlooked due to tradeoffs at the previous energy thresholds. I also note that even in the same collision environment, different research strategies will opt for different tradeoffs and thus achieve different experimental outcomes. Finally, I briefly discuss current searches for anomalous processes in the context of the previous analysis.
Physics and chemistry underlie the nature of all the world around us, including human brains. Consequently some suggest that in causal terms, physics is all there is. However, we live in an environment dominated by objects embodying the outcomes of intentional design (buildings, computers, teaspoons). The present-day subject of physics has nothing to say about the intentionality resulting in the existence of such objects, even though this intentionality is clearly causally effective. This paper examines the claim that the underlying physics uniquely causally determines what happens, even though we cannot predict the outcome. It suggests that what occurs is the contextual emergence of complexity: the higher levels in the hierarchy of complexity have autonomous causal powers, functionally independent of lower level processes. This is possible because top-down causation takes place as well as bottom-up action, with higher level contexts determining the outcome of lower level functioning and even modifying the nature of lower level constituents. Stored information plays a key role, resulting in non-linear dynamics that is non-local in space and time. Brain functioning is causally affected by abstractions such as the value of money and the theory of the laser. These are realised as brain states in individuals, but are not equivalent to them. Consequently physics per se cannot causally determine the outcome of human creativity; rather it creates the possibility space allowing human intelligence to function autonomously. The challenge to physics is to develop a realistic description of causality in truly complex hierarchical structures, with top-down causation and memory effects allowing autonomous higher levels of order to emerge with genuine causal powers.
It is shown by means of general principles and specific examples that, contrary to a long-standing misconception, the modern mathematical physics of compressible fluid dynamics provides a generally consistent and efficient language for describing many seemingly fundamental physical phenomena. It is shown to be appropriate for describing electric and gravitational force fields, the quantized structure of charged elementary particles, the speed of light propagation, relativistic phenomena, the inertia of matter, the expansion of the universe, and the physical nature of time. New avenues and opportunities for fundamental theoretical research are thereby illuminated.
As an approach to a Theory of Everything, a framework for developing a coherent theory of mathematics and physics together is described. The main characteristic of such a theory is discussed: the theory must be valid and sufficiently strong, and it must maximally describe its own validity and sufficient strength. The mathematical logical definition of validity is used, and sufficient strength is seen to be a necessary and useful concept. The requirement of maximal description of its own validity and sufficient strength may be useful to reject candidate coherent theories for which the description is less than maximal. Other aspects of a coherent theory discussed include universal applicability, the relation to the anthropic principle, and possible uniqueness. It is suggested that the basic properties of the physical and mathematical universes are entwined with and emerge with a coherent theory. Support for this includes the indirect reality status of properties of very small or very large, faraway systems compared to moderate sized nearby systems. Discussion of the necessary physical nature of language includes physical models of language and a proof that the meaning content of expressions of any axiomatizable theory seems to be independent of the algorithmic complexity of the theory. Gödel maps seem to be less useful for a coherent theory than for purely mathematical theories because all symbols and words of any language must have representations as states of physical systems already in the domain of a coherent theory.
In a period of over 50 years, Peter Mittelstaedt has made substantial and lasting contributions to several fields in theoretical physics as well as the foundations and philosophy of physics. Here we present an overview of his achievements in physics and its foundations which may serve as a guide to the bibliography (printed in this Festschrift) of his publications. An appraisal of Peter Mittelstaedt's work in the philosophy of physics is given in a separate contribution by B. Falkenburg.
The present paper focuses on a particular class of models intended to describe and explain the physical behaviour of systems that consist of a large number of interacting particles. Such many-body models are characterized by a specific Hamiltonian (energy operator) and are frequently employed in condensed matter physics in order to account for such phenomena as magnetism, superconductivity, and other phase transitions. Because of the dual role of many-body models as models of physical systems (with specific physical phenomena as their explananda) as well as mathematical structures, they form an important sub-class of scientific models, from which one can expect to draw general conclusions about the function and functioning of models in science, as well as to gain specific insight into the challenge of modelling complex systems of correlated particles in condensed matter physics. In particular, it is argued that many-body models contribute novel elements to the process of inquiry and open up new avenues of cross-model confirmation and model-based understanding. In contradistinction to phenomenological models, which have received comparatively more philosophical attention, many-body models typically gain their strength not from 'empirical fit' per se, but from their being the result of a constructive application of mature formalisms, which frees them from the grip of both 'fundamental theory' and an overly narrow conception of 'empirical success'.
Constantin Caratheodory offered the first systematic and contradiction-free formulation of thermodynamics on the basis of his mathematical work on Pfaff forms. Moreover, his work on measure theory provided the basis for later improved formulations of thermodynamics and the physics of continua, where extensive variables are measures and intensive variables are densities. Caratheodory was the first to see that measure theory, and not topology, is the natural tool to understand the difficulties (ergodicity, approach to equilibrium, irreversibility) in the foundations of statistical physics. He gave a measure-theoretic proof of Poincaré's recurrence theorem in 1919. This work paved the way for Birkhoff to identify later ergodicity as metric transitivity and for Koopman and von Neumann to introduce spectral analysis of dynamical systems in Hilbert spaces. Mixing provided an explanation of the approach to equilibrium but not of irreversibility. The recent extension of spectral theory of dynamical systems to locally convex spaces, achieved by the Brussels–Austin groups, gives new nontrivial time asymmetric spectral decompositions for unstable and/or non-integrable systems. In this way irreversibility is resolved in a natural way.
A survey is given of the elegant physics of N-particle systems, both classical and quantal, non-relativistic (NR) and relativistic, non-gravitational (SR) and gravitational (GR). Chapter 1 deals exclusively with NR systems; the correspondence between classical and quantal systems is highlighted and summarized in two tables of Sec. 1.3. Chapter 2 generalizes Chapter 1 to the relativistic regime, including Maxwell's theory of electromagnetism. Chapter 3 follows Einstein in allowing gravity to curve the spacetime arena; its Sec. 3.2 is devoted to the yet missing theory of elementary particles, which should determine their properties and interactions. If completed, it would replace QFT; the 'metron' approach is promising here.
The point of departure for this article is Werner Heisenberg's remark, made in 1929: "It is not surprising that our language [or conceptuality] should be incapable of describing processes occurring within atoms, for … it was invented to describe the experiences of daily life, and these consist only of processes involving exceedingly large numbers of atoms. … Fortunately, mathematics is not subject to this limitation, and it has been possible to invent a mathematical scheme—the quantum theory [quantum mechanics]—which seems entirely adequate for the treatment of atomic processes." The cost of this discovery, at least in Heisenberg's and related interpretations of quantum mechanics, is that, in contrast to classical mechanics, the mathematical scheme in question no longer offers a description, even an idealized one, of quantum objects and processes. This scheme only enables predictions, in general probabilistic in character, of the outcomes of quantum experiments. As a result, a new type of relationship between mathematics and physics is established, which, in the language of Eugene Wigner adopted in my title, indeed makes the effectiveness of mathematics unreasonable in quantum but, as I shall explain, not in classical physics. The article discusses these new relationships between mathematics and physics in quantum theory and their implications for theoretical physics—past, present, and future.
Although it has become commonplace to refer to the 'sixth problem' of Hilbert's (1900) Paris lecture as the starting point for modern axiomatized probability theory, his own views on probability have received comparatively little explicit attention. The central aim of this paper is to provide a detailed account of this topic in light of the central observation that the development of Hilbert's project of the axiomatization of physics went hand-in-hand with a redefinition of the status of probability theory and the meaning of probability. Where Hilbert first regarded the theory as a mathematizable physical discipline and later approached it as a 'vague' mathematical application in physics, he eventually understood probability, first, as a feature of human thought and, then, as an implicitly defined concept without a fixed physical interpretation. It thus becomes possible to suggest that Hilbert came to question, from the early 1920s on, the very possibility of achieving the goal of the axiomatization of probability as described in the 'sixth problem' of 1900.
The purpose of this paper is to highlight the importance of constraints in the theory of relativity and, in particular, what philosophical work they do for Einstein's views on the laws of physics. Einstein presents a view of local "structure laws" which he characterizes as the most appropriate form of physical laws. Einstein was committed to a view of science which presents a synthesis between rational and empirical elements as its hallmark. If scientific constructs are free inventions of the human mind, as Einstein held, the question arises how such rational constructs, including the symbolic formulation of the laws of physics, can represent physical reality. Representation in turn raises the question of realism. Einstein uses a number of constraints in the theory of relativity to show that by imposing constraints on the rational elements a certain "fit" between theory and reality can be achieved. Fit is to be understood as satisfaction of constraint. His emphasis on reference frames in the STR and more general coordinate systems in the GTR, as well as his emphasis on the symmetries of the theory of relativity, suggests that Einstein's realism is akin to a certain form of structural realism. His version of structural realism follows from the theory of relativity and is independent of any current philosophical debates about structural realism.
Why is the future so different from the past? Why does the past affect the future and not the other way round? The universe began with the Big Bang - will it end with a `Big Crunch'? Now in paperback, this book presents an innovative and controversial view of time and contemporary physics. Price urges physicists, philosophers, and anyone who has ever pondered the paradoxes of time to look at the world from a fresh perspective, and throws fascinating new light on some of the great mysteries of the universe.
These articles and speeches by the Nobel Prize-winning physicist date from 1934 to 1958. Rather than expositions on quantum physics, the papers are philosophical in nature, exploring the relevance of atomic physics to many areas of human endeavor. Includes an essay in which Bohr and Einstein discuss quantum and wave equation theories. 1961 edition.
Statistical mechanics is one of the crucial fundamental theories of physics, and in his new book Lawrence Sklar, one of the pre-eminent philosophers of physics, offers a comprehensive, non-technical introduction to that theory and to attempts to understand its foundational elements. Among the topics treated in detail are: probability and statistical explanation, the basic issues in both equilibrium and non-equilibrium statistical mechanics, the role of cosmology, the reduction of thermodynamics to statistical mechanics, and the alleged foundation of the very notion of time asymmetry in the entropic asymmetry of systems in time. The book emphasises the interaction of scientific and philosophical modes of reasoning, and in this way will interest all philosophers of science as well as those in physics and chemistry concerned with philosophical questions. The book could also be read by an informed general reader interested in the foundations of modern science.
The idea that there could be spatially extended mereological simples has recently been defended by a number of metaphysicians (Markosian 1998, 2004; Simons 2004; Parsons (2000) also takes the idea seriously). Peter Simons (2004) goes further, arguing not only that spatially extended mereological simples (henceforth just extended simples) are possible, but that it is more plausible that our world is composed of such simples than that it is composed of either point-sized simples or of atomless gunk. The difficulty for these views lies in explaining why it is that the various sub-volumes of space occupied by such simples are not occupied by proper parts of those simples. Intuitively at least, many of us find compelling the idea that spatially extended objects have proper parts at every sub-volume of the region they occupy. It seems that the defender of extended simples must reject a seemingly plausible claim, what Simons calls the geometric correspondence principle (GCP): that any (spatially) extended object has parts that correspond to the parts of the region that it occupies (Simons 2004: 371). We disagree. We think that GCP is a plausible principle. We also think it is plausible that our world is composed of extended simples. We reconcile these two notions by two means. On the one hand we pay closer attention to the physics of our world. On the other hand, we consider what happens when our concept of something—in this case space—contains elements not all of which are realized in anything, but instead key components are realized in different features of the world.
The basic theme of Popper's philosophy--that something can come from nothing--is related to the present situation in physical theory. Popper carries his investigation right to the center of current debate in quantum physics. He proposes an interpretation of physics--and indeed an entire cosmology--which is realist, conjectural, deductivist and objectivist, anti-positivist, and anti-instrumentalist. He stresses understanding, reminding us that our ignorance grows faster than our conjectural knowledge.
Universally recognized as bringing about a revolutionary transformation of the notions of space, time, and motion in physics, Einstein's theory of gravitation, known as "general relativity," was also a defining event for 20th century philosophy of science. During the decisive first ten years of the theory's existence, two main tendencies dominated its philosophical reception. This book is an extended argument that the path actually taken, which became logical empiricist philosophy of science, greatly contributed to the current impasse over realism, whereas new possibilities are opened in revisiting and reviving the spirit of the more sophisticated tendency, a cluster of viewpoints broadly termed transcendental idealism, and furthering its articulation. It also emerges that Einstein, while paying lip service to the emerging philosophy of logical empiricism, ended up siding de facto with the latter tendency. Ryckman's work speaks to several groups, among them philosophers of science and historians of relativity. Equations are displayed as necessary, but Ryckman gives the non-mathematical reader enough background to understand their occurrence in the context of his wider philosophical project.
The paper explicates the stages of the author's philosophical evolution in the light of Kopnin's ideas and heritage. Starting from Kopnin's understanding of dialectical materialism, the author argues that the category transformations of physics have moved from a conceptualization of immutability to mutability, and then to interaction, evolvement and emergence. He has connected the problem of the universals of physical cognition with the elaboration of a specific system of tools and methods for identifying, individuating and distinguishing objects in the domain of a scientific theory. The role of the vacuum conception and of the types of the idea of existence (actual and potential, observable and nonobservable, virtual and hidden) are analyzed. In collaboration with S. Crymski, the heuristic and regulative functions of the categories of substance and of the world as a whole, as well as the postulates of relativity and absoluteness and the anthropic and self-development principles, were singled out. Elaborating Kopnin's view of scientific theories as practically effective and relatively true mappings of their domains, the author, in collaboration with M. Burgin, has originated the unified structure-nominative reconstruction (model) of scientific theory as a knowledge system. According to it, every scientific knowledge system includes hierarchically organized and complex subsystems that have been studied partially and separately by the standard, structuralist, operationalist, problem-solving, axiological and other directions of current philosophy of science. 1) The logico-linguistic subsystem represents and normalizes, by means of different languages (including mathematical ones) and logical calculi, the knowledge available on the objects under study. 2) The model-representing subsystem comprises the ways, peculiar to the knowledge system, of modeling and understanding those objects.
3) The pragmatic-procedural subsystem contains operations, methods, procedures, algorithms and programs, both general and unique to the knowledge system. 4) From the viewpoint of the problem-heuristic subsystem, the knowledge system is a unique way of setting and resolving questions, problems, puzzles and tasks of cognition of the objects in question. It also includes various heuristics and estimations (truth, consistency, beauty, efficacy, adequacy, heuristicity, etc.) of the components and structures of the knowledge system. 5) The subsystem of links fixes the interrelations between the above-mentioned components, structures and subsystems of the knowledge system. The structure-nominative reconstruction has been used in philosophical and comparative case studies of mathematical, physical, economic, legal, political, pedagogical, social, and sociological theories. It has enlarged the collection of knowledge structures, connected, for instance, with a multitude of levels of theoreticity and with the application of numerous mathematical languages. It has deepened the comprehension of the relations between the main directions of current philosophy of science, which are interpreted as dealing mainly with isolated subsystems of scientific theory. This reconstruction has disclosed a variety of previously undetected knowledge structures, associated also, for instance, with principles of symmetry and supersymmetry and with laws of various levels and degrees. In cooperation with the physicist Olexander Gabovich, a modified structure-nominative reconstruction is in the process of development and justification. Ideas and concepts were also at the center of Kopnin's cognitive activity. The author has suggested and elaborated the triplet model of concepts. According to it, any scientific concept is a dynamical, multifunctional state of the scientist's thinking, dependent on the cognitive situation and the available knowledge system. A concept is modeled as consisting of three interrelated structures.
1) The concept base characterizes the objects falling under the concept, as well as their properties and relations. In terms of volume and content, logical modeling reveals only the concept base, and only partially. 2) The concept-representing part includes the structures and means (names, statements, abstract properties, quantitative values of object properties and relations, mathematical equations and their systems, theoretical models, etc.) of object representation in the appropriate knowledge system. 3) The linkage unites the structures and procedures that connect components of the above-mentioned structures. Partial cases of the triplet model are the logical, information, two-tiered, standard, exemplar, prototype, knowledge-dependent and other concept models. A triplet classification comprising several hundred concept types has been introduced. Different kinds of fuzziness are distinguished: even the most precise and exact concepts are fuzzy in some triplet aspect. The notions of relations between real scientific concepts are substantially extended; for example, definitions and strict analyses of such relations between concepts as formalization, quantification, mathematization, generalization, fuzzification, and various kinds of identity are proposed. The concepts «PLANET» and «ELEMENTARY PARTICLE» and some of their metamorphoses were analyzed in triplet terms. Kopnin's methodology and epistemology of cognition has also been used to create a conception of the philosophy of law as the elaboration of the understanding, justification, estimation and criticism of a legal system. Basic information on the major directions in current Western philosophy of law (legal realism, feminism, criticism, postmodernism, economic analysis of law, etc.) is introduced to the Ukrainian audience for the first time. A classification of more than fifty directions in modern legal philosophy is suggested.
Some results of historical, linguistic, scientometric and philosophic-legal studies of the present state of Ukrainian academic science are given.
A magisterial study of the philosophy of physics that both introduces the subject to the non-specialist and contains many original and important contributions for professionals in the area. Modern physics was born as a part of philosophy and has retained to this day a properly philosophical concern for the clarity and coherence of ideas. Any introduction to the philosophy of physics must therefore focus on the conceptual development of physics itself. This book pursues that development from Galileo and Newton through Maxwell and Boltzmann to Einstein and the founders of quantum mechanics. There is also discussion of important philosophers of physics in the eighteenth and nineteenth centuries and of twentieth-century debates. In the interest of appealing to the broadest possible readership the author avoids technicalities and explains both the physics and philosophical terms.
According to an increasing number of authors, the best, if not the only, argument in favour of physicalism is the so-called 'overdetermination argument'. This argument, if sound, establishes that all the entities that enter into causal interactions with the physical world are physical. One key premise in the overdetermination argument is the principle of the causal closure of the physical world, said to be supported by contemporary physics. In this paper, I examine various ways in which physics may support the principle, either as a methodological guide or as depending on some other laws and principles of physics.
Highlighting main issues and controversies, this book brings together current philosophical discussions of symmetry in physics to provide an introduction to the subject for physicists and philosophers. The contributors cover all the fundamental symmetries of modern physics, such as CPT and permutation symmetry, as well as discussing symmetry-breaking and general interpretational issues. Classic texts are followed by new review articles and shorter commentaries for each topic. Suitable for courses on the foundations of physics, philosophy of physics and philosophy of science, the volume is a valuable reference for students and researchers.
Murdoch describes the historical background of the physics from which Bohr's ideas grew; he traces the origins of his idea of complementarity and discusses its meaning and significance. Special emphasis is placed on the contrasting views of Einstein, and the great debate between Bohr and Einstein is thoroughly examined. Bohr's philosophy is revealed as being much more subtle and more interesting than is generally acknowledged.