That space and time should be integrated into a single entity, spacetime, is the great insight of Einstein's special theory of relativity, and it leads us to regard spacetime as a fundamental context in which to make sense of the world around us. But it is not the only one. Causality is equally important, and, at least as far as the special theory goes, it cannot be subsumed under a fundamentally geometrical form of explanation. In fact, the agent of propagation of causal influence is electromagnetic radiation. In this examination, the authors find support for a rationalist approach to physics, never neglecting experimentation, but rejecting a simple empiricist or positivist view of science.
Richard Feynman has claimed that anti-particles are nothing but particles 'propagating backwards in time'; that time reversing a particle state always turns it into the corresponding anti-particle state. According to standard quantum field theory textbooks this is not so: time reversal does not turn particles into anti-particles. Feynman's view is interesting because, in particular, it suggests a nonstandard, and possibly illuminating, interpretation of the CPT theorem. In this paper, we explore a classical analog of Feynman's view, in the context of the recent debate between David Albert and David Malament over time reversal in classical electromagnetism.
Internal global symmetries exist for the free non-relativistic Schrödinger particle, whose associated Noether charges (the space integrals of the wavefunction and of the wavefunction multiplied by the spatial coordinate) are exhibited. Analogous symmetries in classical electromagnetism are also demonstrated.
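The two charges this abstract describes verbally can be written out explicitly. The following sketch is not taken from the paper itself; it assumes the free Schrödinger equation in units with ℏ = 1 and a wavefunction falling off fast enough at spatial infinity for surface terms to vanish:

```latex
% Free Schr\"odinger equation and the two candidate Noether charges:
i\,\partial_t \psi = -\frac{1}{2m}\,\nabla^2 \psi,
\qquad
Q_1 = \int \psi \, d^3x,
\qquad
Q_2 = \int \mathbf{x}\,\psi \, d^3x .
% Both are conserved under the stated fall-off assumption:
% \dot{Q}_1 = \frac{i}{2m}\int \nabla^2\psi \, d^3x = 0
%   (the integrand is a total divergence), and, integrating by parts twice,
% \dot{Q}_2 = \frac{i}{2m}\int \mathbf{x}\,\nabla^2\psi \, d^3x
%           = \frac{i}{2m}\int (\nabla^2\mathbf{x})\,\psi \, d^3x = 0 .
```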
This paper provides a survey of several philosophical issues arising in classical electrodynamics, arguing that there is a philosophically rich set of problems in theories of classical physics that have not yet received the attention from philosophers that they deserve. One issue, which is connected to the philosophy of causation, concerns the temporal asymmetry exhibited by radiation fields in the presence of wave sources. Physicists and philosophers disagree on whether this asymmetry reflects a fundamental causal asymmetry or is due to statistical or thermodynamic considerations. I suggest that an explanation appealing to the asymmetry of causation is more promising. Another issue concerns the conceptual structure of the theory. Despite its empirical success, classical electrodynamics faces serious foundational problems. Models of charged particles involve what by the theory's own lights are idealizations, I maintain, and this is a feature that is not readily captured by traditional philosophical accounts of scientific theories. Other issues I discuss concern (i) the relation between Lorentz's theory of the electron and Einstein's Theory of Special Relativity; (ii) the notion of the domain of a theory, the question of theory reduction, and the relation between classical and more fundamental quantum theories; and (iii) the role of locality constraints, their relation to the concept of causation, and the status of locality conditions in the semi-classical theory of the Aharonov-Bohm effect.
It is often said that the Aharonov-Bohm effect shows that the vector potential enjoys more ontological significance than we previously realized. But how can a quantum-mechanical effect teach us something about the interpretation of Maxwell's theory—let alone about the ontological structure of the world—when both theories are false? I present a rational reconstruction of the interpretative repercussions of the Aharonov-Bohm effect, and suggest some morals for our conception of the interpretative enterprise.
This paper follows up the analysis of relativity theory begun by Margenau and Mould, by including electromagnetic theory, which in their treatment was tacitly accepted. It is shown that the experiments on which Margenau and Mould rely to establish the special theory of relativity actually confirm the mutual consistency of the Maxwell-Lorentz electromagnetic theory and the special relativity theory, but throw no light on the validity of the two theories taken jointly. It is further shown that a modification of the rules of correspondence between the mathematical structure of the theories and immediate experience would bring the theories into agreement with an alternative relativity theory based on the Galilean instead of the Lorentz transformation. An experiment is suggested by which the need for such modification can be tested. A proof is then given that the rules of correspondence between the concepts of the special relativity theory (and therefore of current electromagnetic theory) and experience are not self-consistent, so that some modification of current ideas is essential. It is suggested that a generalisation of Maxwell's theory, in terms of Faraday's "ray vibrations" instead of Lorentz's static ether, might provide a satisfactory basis for a relativistic electromagnetic theory.
Hyder constructs two historical narratives. First, he gives an account of Helmholtz's relation to Kant, from the famous Raumproblem, which preoccupied philosophers, geometers, and scientists in the mid-19th century, to Helmholtz's arguments in his four papers on geometry from 1868 to 1878 that geometry is, in some sense, an empirical science (chapters 5 and 6). Here, Hyder responds to the reading of Moritz Schlick, according to whom the "chief epistemological result" of Helmholtz's work is his argument that "Euclidean space is not an inescapable form of our faculty of intuition, but a product of experience" (Schlick's note in Helmholtz 1977, 35). Schlick's story papers over Helmholtz's deep relationship to Kant, especially in Helmholtz's early work. Hyder's work here puts this relationship at center stage, and contributes a much richer picture of the reasons for Helmholtz's later decision to turn away from the Kantian perspective. The second theme is the argument for the necessity of central forces to a determinate scientific description of physical reality, an abiding concern of Helmholtz's, and one that, as Hyder shows, has Kantian roots. Helmholtz's commitment to the necessity of central forces was key to his responses to rival views on electromagnetism, and is a deep and often under-appreciated element of his epistemology of science.
We give an overview of the main areas in theoretical physics, with emphasis on their relation to the Lagrangian formalism in classical mechanics. This review covers classical mechanics; the road from classical mechanics to Schrödinger's quantum mechanics; electromagnetism; special and general relativity; and (very briefly) gauge field theory and the Higgs mechanism. We shun mathematical rigor in favor of a straightforward presentation.
A recent paper suggested that if Galilean covariance were extended to signals and interactions, the resulting theory would contain such anomalies as would have impelled physicists towards special relativity even without empirical prompts. I analyze this claim. Some so-called anomalies turn out to be errors. Others have classical analogs, which suggests that classical physicists would not have viewed them as anomalous. Still others, finally, remain intact in special relativity, so that they serve as no impetus towards this theory. I conclude that Galilean covariance is insufficient to derive special relativity.
The traditional absolutist-relationist debate is still clearly formulable in the context of General Relativity Theory (GTR), despite the important differences between Einstein's theory and the earlier context of Newtonian physics. This paper answers recent arguments by Robert Rynasiewicz against the significance of the debate in the GTR context. In his 1996 paper ‘Absolute vs. Relational Spacetime: An Outmoded Debate?’, Rynasiewicz argues that already in the late nineteenth century, and even more so in the context of GTR, the terms of the original Descartes–Newton–Leibniz dispute about space are not to be found. Nineteenth-century ether theories of electromagnetism, and the metric field of GTR, he claims, do not lend themselves to being interpreted clearly as either absolute space à la Newton, or relational structures à la either Descartes or Leibniz. I argue that, while in some imaginable theories Rynasiewicz's claim that the classical debate dissolves would be correct, in the most important historical theories he discusses this is not the case. In particular, I argue that in both Lorentz's ether theory and GTR there is a clear and compelling way to establish connections to the classical absolutist-relationist disputes, and that in both these theories it is the absolutist position that is prima facie victorious. To support my arguments and give a clear overview of the whole debate, I end by offering definitional sketches of relationism and absolutism (substantivalism) about spacetime in the context of contemporary physics. The sketches show the clear connections between these views today and their ancestors in Newton and Leibniz. But at the same time, they indicate how both views are not just claims about existing physical theories, but also bets about how future physics will clarify the ontological picture.
Those looking for holism in contemporary physics have focused their attention primarily on quantum entanglement. But some gauge theories arguably also manifest the related phenomenon of nonseparability. While the argument is strong for the classical gauge theory describing electromagnetic interactions with quantum "particles", it fails in the case of general relativity even though that theory may also be formulated in terms of a connection on a principal fiber bundle. Anandan has highlighted the key difference in his analysis of a supposed gravitational analog to the Aharonov-Bohm effect. By contrast with electromagnetism in the original Aharonov-Bohm effect, gravitation is separable and exhibits no novel holism in this case. Whether the nonseparability of classical gauge theories of non-gravitational interactions is associated with holism depends on what counts as the relevant part-whole relation. Loop representations of quantized gauge theories of non-gravitational interactions suggest that these conclusions about holism and nonseparability may extend also to quantum theories of the associated fields.
Throughout the history of the Western world, science has possessed an extraordinary amount of authority and prestige. And while its pedestal has been jostled by numerous evolutions and revolutions, science has always managed to maintain its stronghold as the knowing enterprise that explains how the natural world works: we treat such legendary scientists as Galileo, Newton, Darwin, and Einstein with admiration and reverence because they offer profound and sustaining insight into the meaning of the universe. In The Intelligibility of Nature, Peter Dear considers how science as such has evolved and how it has marshaled itself to make sense of the world. His intellectual journey begins with a crucial observation: that the enterprise of science is, and has been, directed toward two distinct but frequently conflated ends—doing and knowing. The ancient Greeks developed this distinction of value between craft on the one hand and understanding on the other, and according to Dear, that distinction has survived to shape attitudes toward science ever since. Teasing out this tension between doing and knowing during key episodes in the history of science—mechanical philosophy and Newtonian gravitation, elective affinities and the chemical revolution, enlightened natural history and taxonomy, evolutionary biology, the dynamical theory of electromagnetism, and quantum theory—Dear reveals how the two principles became formalized into a single enterprise, science, that would be carried out by a new kind of person, the scientist. Finely nuanced and elegantly conceived, The Intelligibility of Nature will be essential reading for aficionados and historians of science alike.
Mathematically, gauge theories are extraordinarily rich --- so rich, in fact, that it can become all too easy to lose track of the connections between results, and become lost in a mass of beautiful theorems and properties: indeterminism, constraints, Noether identities, local and global symmetries, and so on. One purpose of this short article is to provide some sort of a guide through the mathematics, to the conceptual core of what is actually going on. Its focus is on the Lagrangian, variational-problem description of classical mechanics, from which the link between gauge symmetry and the apparent violation of determinism is easy to understand; only towards the end will the Hamiltonian description be considered. The other purpose is to warn against adopting too unified a perspective on gauge theories. It will be argued that the meaning of the gauge freedom in a theory like general relativity is (at least from the Lagrangian viewpoint) significantly different from its meaning in theories like electromagnetism. The Hamiltonian framework blurs this distinction, and orthodox methods of quantization obliterate it; this may, in fact, be genuine progress, but it is dangerous to be guided by mathematics into conflating two conceptually distinct notions without appreciating the physical consequences.
Why did Einstein tirelessly study unified field theory for more than 30 years? In this book, the author argues that Einstein believed he could find a unified theory of all of nature's forces by repeating the methods he used when he formulated general relativity. The book discusses Einstein's route to the general theory of relativity, focusing on the philosophical lessons that he learnt. It then addresses his quest for a unified theory for electromagnetism and gravity, discussing in detail his efforts with Kaluza-Klein and, surprisingly, the theory of spinors. From these perspectives, Einstein's critical stance towards the quantum theory comes to stand in a new light. This book will be of interest to physicists, historians and philosophers of science.
This book is a stimulating and engaging discussion of philosophical issues in the foundations of classical electromagnetism. In the first half, Frisch argues against the standard conception of the theory as consistent and local. The second half is devoted to the puzzle of the arrow of radiation: the fact that waves behave asymmetrically in time, though the laws governing their evolution are temporally symmetric. The book is worthwhile for anyone interested in understanding the physical theory of electromagnetism, as well as for the views it presents on philosophical issues such as causation, counterfactuals, laws, scientific theories, models, and explanation. While philosophers of physics tend to focus on quantum mechanics and relativity, Frisch's book shows that there are deep foundational issues in classical physics, equally worthy of attention. That said, let me lodge disagreement on some key points. Frisch argues from an alleged inconsistency in classical electromagnetism—that Maxwell's equations, the Lorentz force law, and the conservation of energy cannot be jointly true—to the conclusion that the standard view of scientific theories as a formalism plus an interpretation is incorrect. Consistency is a necessary condition of any view on which scientific theories give us an account of "ways the world could be" (Frisch). Since classical electromagnetism is successfully used by practicing physicists, consistency must be just one criterion of theory choice weighed equally among others. This is an intriguing idea, but I am not sure that consistency can be given up so easily. That road leads dangerously close to accepting orthodox 'Copenhagen' quantum mechanics. Surely the inconsistency of...
Classically, a gauge potential was merely a convenient device for generating a corresponding gauge field. Quantum-mechanically, a gauge potential lays claim to independent status as a further feature of the physical situation. But whether this is a local or a global feature is not made any clearer by the variety of mathematical structures used to represent it. I argue that in the theory of electromagnetism (or a non-Abelian generalization) that describes quantum particles subject to a classical interaction, the gauge potential is best understood as a feature of the physical situation whose global character is most naturally represented by the holonomies of closed curves in space-time.
Einstein considered general covariance to characterize the novelty of his General Theory of Relativity (GTR), but Kretschmann thought it merely a formal feature that any theory could have. The claim that GTR is "already parametrized" suggests analyzing substantive general covariance as formal general covariance achieved without hiding preferred coordinates as scalar "clock fields," much as Einstein construed general covariance as the lack of preferred coordinates. Physicists often install gauge symmetries artificially with additional fields, as in the transition from Proca's to Stueckelberg's electromagnetism. Some post-positivist philosophers, due to realist sympathies, are committed to judging Stueckelberg's electromagnetism distinct from and inferior to Proca's. By contrast, physicists identify them, the differences being gauge-dependent and hence unreal. It is often useful to install gauge freedom in theories with broken gauge symmetries (second-class constraints) using a modified Batalin-Fradkin-Tyutin (BFT) procedure. Massive GTR, for which parametrization and a Lagrangian BFT-like procedure appear to coincide, mimics GTR's general covariance apart from telltale clock fields. A generalized procedure for installing artificial gauge freedom subsumes parametrization and BFT, while being more Lagrangian-friendly than BFT, leaving any primary constraints unchanged and using a non-BFT boundary condition. Artificial gauge freedom licenses a generalized Kretschmann objection. However, features of paradigm cases of artificial gauge freedom might help to demonstrate a principled distinction between substantive and merely formal gauge symmetry.
Henri Poincaré's views on the foundations of mechanics and the nature of mechanical explanation were influenced by the work of two of the most renowned nineteenth-century scientists, James Clerk Maxwell and Heinrich Hertz. In order to unravel Poincaré's views and his own contribution to the subject, it is important to see the connection between Maxwell's and Hertz's researches on the one hand and Poincaré's on the other. Consequently, I start this paper with a brief account of Poincaré's encounter with Maxwell's work in electromagnetism. Then, in section 2, I move on to show how Hertz's work on the foundations of mechanics shaped Poincaré's own views. In sections 3 and 4, I formulate Poincaré's own conventionalist philosophy of mechanics and show how several methodological considerations, especially the search for unity, mitigated his conventionalism. Having thus examined Poincaré's views on the foundations of mechanics, in section 5 I turn my attention to his notion of mechanical explanation and his proof that a mechanical explanation of a set of phenomena is possible if (and only if) the principle of conservation of energy is satisfied. I then go on to show how Poincaré secured the possibility of a mechanical explanation of electromagnetic phenomena, and also how, having done so, he ended up with an unlimited number of configurations of matter in motion that could underpin electromagnetic phenomena. The upshot of this paper will be that Poincaré departed from the traditional conceptions of mechanical explanation and defended a purely structural conception, the strong point of which was that it promoted — as the motto of this paper says — the true and only aim of science, namely unity.
Newton and Einstein each in his way showed us the following: an epistemologically responsible physicist adopts the most measured understanding possible of spacetime structure. The proper way to infer a doctrine of spacetime is by a kind of measuring inference -- a deduction from phenomena. Thus it was (I argue) by an out-and-out deduction from the phenomena of inertiality (as colligated by the three laws of motion) that Newton delineated the conceptual presuppositions concerning spacetime structure that are needed before we can actually think coherently about these phenomena. And Einstein (I argue) very much recapitulated this argument pattern, twice over in fact, recolligating the phenomena first so as to add something from the laws of electromagnetism, and then so as to add everything about gravitation, into what he understood by inertiality. Notably, to deduce one's theoretical conclusions from phenomena is both more cautious and more cogent than to "infer to the best explanation". And in the context of the development of a doctrine of spacetime, deductions from phenomena lay before us formal rather than causal understanding. Deductions from phenomena tell us, in this context, not what things or what causes there are, but rather what our concepts should be like. The more measured the inference is, however, the more definitively it tells us this. For these reasons the most measured understanding of spacetime lies on a line between conventionalism and realism, between relationalism and absolutism, and indeed (as I demonstrate) between empiricism and rationalism. Spacetime is understood as neither merely immanent in material goings-on, nor truly transcendent of them. In order to explain this understanding as adequately as I can, and in order to remark its excellences most fully, I consider some respects in which the tertium quid between metaphysical realism and strict empiricism about spacetime is wise in the sense of practical wisdom.
The wisest understanding of spacetime illustrates, I argue, an original and fundamental connection that epistemology has with ethics.
In 1907, Einstein set out to fully relativize all motion, no matter whether uniform or accelerated. After five failed attempts between 1907 and 1918, he finally threw in the towel around 1920, setting himself a new goal. For the rest of his life he searched for a classical field theory unifying gravity and electromagnetism. As he struggled to relativize motion, Einstein had to readjust both his approach and his objectives at almost every step along the way; he got himself hopelessly confused at times; he fooled himself with fallacious arguments and sloppy calculations; and he committed what he later allegedly called the biggest blunder of his career: he introduced the cosmological constant. There is a very uplifting moral to this somber tale. Although Einstein never reached his original destination, the harvest of his thirteen-year odyssey is quite impressive. First of all, what is left of absolute motion in general relativity is far more palatable than absolute motion in special relativity or Newtonian theory. And general relativity does seem to eliminate absolute space. More importantly, from a modern physics point of view, Einstein produced a spectacular new theory of gravity based on what he called the equivalence principle. This principle says that inertial and gravitational effects are due to one and the same structure, the inertio-gravitational field, which in Einstein's theory is represented by a metric tensor field. In addition to laying the foundations of this theory, Einstein, among other things, launched relativistic cosmology, suggested the possibility of gravitational waves, gave the first sensible definition of a space-time singularity, and caught on to the intimate connection between general covariance and energy-momentum conservation, an example of the general connection between symmetries and conservation laws established by Noether's theorems.
These results more than make up for the—at least by the standards of modern philosophy of science—rather opportunistic way in which they were obtained.
Visual analogy is believed to be important in human problem solving. Yet, there are few computational models of visual analogy. In this paper, we present a preliminary computational model of visual analogy in problem solving. The model is instantiated in a computer program, called Galatea, which uses a language for representing and transferring visual information called Privlan. We describe how the computational model can account for a small slice of a cognitive-historical analysis of Maxwell’s reasoning about electromagnetism.
Einstein's theories of special and general relativity are unanimously praised by scientists for their extraordinary beauty to the extent that some consider the latter to be the most beautiful theory in physics. The grounds for these assertions are assessed here and it is concluded that the beauty of Einstein's theories can be attributed to two of their aspects. The first is that they incorporate all possible ingredients that constitute the beauty of theories: simplicity, symmetry, invariance, unification, etc. The second concerns the perfect logical consistency of Einstein's theories, a crucial factor in the unanimous praise of their beauty. Theories other than Einstein's are also assessed here, namely, electromagnetism and the various quantum theories, and it is concluded that all these suffer from logical flaws. This has resulted in a lack of consensus among scientists with respect to their beauty. Consequently, the unanimous claim that Einstein's theories are of superior beauty as compared to other theories in physics is hereby substantiated.
Philosophical discussions of experiment usually focus exclusively on testing predictions. In this paper I compare G. Morpurgo's experimental test of the Gell-Mann/Zweig quark hypothesis with two neglected uses of experiment: constructing representations of new phenomena and inventing the instruments that produce such phenomena. These roles are illustrated by J. B. Biot's 1821 observations of electromagnetism and by Michael Faraday's invention of the first electromagnetic motor, also in 1821. The comparison identifies similarities between observation and experiment, showing how both observation and experiment actively engage the natural world and how each engagement shapes representation and subsequent empirical work. This challenges the post-empiricist assumption of the sufficiency of knowing only the outcomes of experiments. I conclude that traditional views of observational access have looked in the wrong place for empirical constraints on theorizing. The active character of observation implies that a realist interpretation of experimenters' discourse should be grounded in the fine structure of experimental practice rather than the supposedly decisive, golden events favoured by hypothetico-deductive methodology.
Faraday's field concept presupposes that field stresses should share the axial symmetry of the lines of force. In the present article, the field dynamics is similarly required to depend only on field properties that can be tested through the motion of test particles. Precise expressions of this 'Faradayan' principle in field-theoretical language are shown to severely restrict the form of classical field theories. In particular, static forces must obey the inverse square law in a linear approximation. Within a Minkowskian and Lagrangian framework, the Faradayan principle automatically leads to Maxwell's theory of electromagnetism and to Einstein's theory of gravitation, without appeal to the equivalence principle. A comparison is drawn between this, Feynman's, and Einstein's ways of arriving at general relativity.
It is argued that Weyl's theory of gravitation and electricity came out of 'mathematical justice': out of the equal rights of direction and length. Such mathematical justice was manifestly at work in the context of discovery, and is enough (together with a couple of simple and natural operations) to derive all of source-free electromagnetism. Weyl's repeated references to coordinates and gauge are taken to express the equal treatment of direction and length.
This article explores Michael Faraday's "Historical Sketch of Electro-Magnetism" as a fruitful source for understanding the epistemic significance of experimentation. In this work Faraday provides a catalog of the numerous experimental and theoretical developments in the early history of electromagnetism. He also describes methods that enable experimentalists to dissociate experimental results from the theoretical commitments generating their research. An analysis of the methods articulated in this sketch is instructive for confronting epistemological worries about the theory-dependence of experimentation.
This article examines the implications of the holonomy interpretation of classical electromagnetism. As has been argued by Richard Healey and Gordon Belot, classical electromagnetism on this interpretation evinces a form of nonseparability, something that otherwise might have been thought of as confined to nonclassical physics. Consideration of the differences between this classical nonseparability and quantum nonseparability shows that the nonseparability exhibited by classical electromagnetism on the holonomy interpretation is closer to separability than might at first appear.
In 1912, Henri Poincaré published an argument which apparently shows that the hypothesis of quanta is both necessary and sufficient for the truth of Planck's experimentally corroborated law describing the spectral distribution of radiant energy in a black body. In a recent paper, John Norton has reaffirmed the authority of Poincaré's argument, setting it up as a paradigm case in which empirical data can be used to definitively rule out theoretical competitors to a given theoretical hypothesis. My goal is to dispute Norton's claim that there is no theoretical underdetermination problem arising between classical physics and early quantum theory. The strategy I use in defending my view is to adopt a suggestion made by Jarrett Leplin and Larry Laudan on how to assess the relative merits of competing theoretical alternatives, where each alternative has an equal capacity to save the phenomena. In the course of the paper, I distinguish between two branches of classical physics: classical mechanics and classical electromagnetism. The former is claimed by Norton and Poincaré to be determinately ruled out by the black-body evidence; and it is the former that I argue is compatible with this evidence.
This article focuses on subtle energies (those energies that fall outside the four regularly recognized energy forces of gravity, electromagnetism, and the strong and weak nuclear forces). Research and insights from the social, physical, and healing sciences are discussed. Key concepts from these disciplines are explored, creating a cross-disciplinary analysis of recent research. A case is made for building upon the growing understanding of the influence and importance of the subtle energies in our daily lives as well as the ongoing evolution of our species and planet. The author advocates for increased explorations in the use of these energies for positive social transformation and healing.
If you have taken a college biology class, or just watched Animal Planet, you may have been struck by the startling complexity of living organisms. From the grandest mammal to the lowliest cell, life displays intricacy and structure that would put a high-paid team of engineers to shame. How could such fantastically organized, complex structures arise blindly out of unintelligent matter? Speaking of matter, why is it the way it is? Though unimaginably vast, our universe has precise features, as does the matter in it. A glance at the inside back cover of a college physics textbook shows that there are extremely precise numbers describing the fundamental properties of matter. These include numbers for the speed of light in a vacuum, for the masses of fundamental particles like the electron, proton, and neutron, and for the strengths of forces like gravity and electromagnetism that act on those particles. These numbers seem utterly arbitrary. For all we know, they could have been completely different. Yet they turn out to be exactly what a universe needs in order for complex life to emerge in it. Likewise, the cosmology section of an astronomy course will teach you that there are very precise values for the temperature of the universe, for how much matter there is per cubic centimeter in the universe, for the rate at which the universe is expanding, and so on. How did those numbers get to be what they are? Were they just magically pulled out of a cosmic hat at the Big Bang?
Debate among scientists is frequently hampered by intense difficulties in communicating and translating their viewpoints. This well-known fact illustrates the role of unarticulated core knowledge in the activities of scientific communities. But it has been little noticed that the issue afflicts not just written science, but especially traditions of experimental activity and their products, including instruments and techniques. The question is addressed on the basis of examples from the history of optics and electromagnetism - Fresnel and Brewster, Maxwell and Hertz - and texts from Kuhn's Structure. Particular attention is paid to the interrelations between succeeding theories, and to the notorious problem of theory choice.
1. Introduction : humanity's urge to understand -- 2. Elements of scientific thinking : skepticism, careful reasoning, and exhaustive evaluation are all vital. Science is universal -- Maintaining a critical attitude. Reasonable skepticism -- Respect for the truth -- Reasoning. Deduction -- Induction -- Paradigm shifts -- Evaluating scientific hypotheses. Ockham's razor -- Quantitative evaluation -- Verification by others -- Statistics : correlation and causation -- Statistics : the indeterminacy of the small -- Careful definition -- Science at the frontier. When good theories become ugly -- Stuff that just does not fit -- 3. Christopher Columbus and the discovery of the "Indies" : it can be disastrous to stubbornly refuse to recognize that you have falsified your own hypothesis -- 4. Antoine Lavoisier and Joseph Priestley both test the befuddling phlogiston theory : junking a confusing hypothesis may be necessary to clear the way for new and productive science -- 5. Michael Faraday discovers electromagnetic induction but fails to unify electromagnetism and gravitation : it is usually productive to simplify and consolidate your hypotheses -- 6. Wilhelm Röntgen intended to study cathode rays but ended up discovering X-rays : listen carefully when Mother Nature whispers in your ear : she may be leading you to a Nobel Prize -- 7. Max Planck, the first superhero of quantum theory, saves the universe from the ultraviolet catastrophe : assemble two flawed hypotheses about a key phenomenon into a model that fits experiment exactly and people will listen to you even if you must revolutionize physics -- 8. Albert Einstein attacks the problem "Are atoms real?" from every angle : solving a centuries-old riddle in seven different ways can finally resolve it -- 9. 
Niels Bohr models the hydrogen atom as a quantized system with compelling exactness, but his later career proves that collaboration and developing new talent can become more significant than the groundbreaking research of any individual -- 10. Conclusions, status of science, and lessons for our time. Conclusions from our biographies -- What thought processes lead to innovation? -- Is the scientist an outsider? -- The status of the modern scientific enterprise -- Lessons for our time -- Can the scientific method be applied to public policy? -- Why so little interest in science? -- Knowledge is never complete.
This volume is about the search for fundamental theory in physics, which has become somewhat elusive in recent decades. Like a group of blind men investigating an elephant, one physicist postulates the trunk as a hose, another a leg as a tree, the body a wall or barrier, the tail a rope, and the ears as a fan. The organizers of the Vigier series symposia strongly believe that cross-pollination, exploring many avenues of seemingly disparate research, is key to breakthrough discovery, and they solicited papers on all areas of physics deemed pertinent in astrophysics, cosmology, nuclear physics, quantum theory, electromagnetism, thermodynamics, vacuum field theory, and topology.
This bibliographical review of the modelling of the mitotic apparatus covers a period of one hundred and twenty years, from the discovery of the bipolar mitotic spindle up to the present day. Without attempting to be fully comprehensive, it will describe the evolution of the main ideas that have left their mark on a century of experimental and theoretical research. Fol and Bütschli's first writings date back to 1873, at a time when Schleiden and Schwann's cell theory was rapidly gaining ground throughout Germany. Both mitosis and chromosomes were to be discovered within the space of thirty years, along with the two key events in the animal and plant reproductive cycle, namely fertilization and meiosis. The mitotic pole, a term still in use to this day, was employed to describe a morphological fact which was noted as early as 1876, namely that the lines and the dots of the karyokinetic figure, with its spindle and asters, look remarkably like the lines of force around a bar magnet. This was to lead to models designed to explain the movements of chromosomes which take place when the cell nucleus appears to cease to exist as an organelle during mitosis. The nature of those mechanisms and the origin of the forces behind the chromosomes' ordered movements were central to the debate. Auguste Prenant, in a remarkable bibliographical synthesis published in 1910, summed up the opposing viewpoints of the vitalists, on the one hand, who favoured the theory of contractility or extensility in spindle fibres, and of those who believed in models based on physical phenomena, on the other. The latter subdivided into two groups: some, like Bütschli, Rhumbler or Leduc, referred to diffusion, osmosis and superficial tension, whilst the others, led by Gallardo and Hartog, focussed on the laws of electromagnetism. Lillie, Kuwada and Darlington followed up this line of research. The mid-20th century was a major turning point. 
Most of the modelling mentioned above was criticized and fell into disuse after disappearing from research publications and textbooks. This marked the onset of a new era, as electron microscopes made possible the materialization and detailed study of the macromolecular elements of the fibres, filaments and microtubules of the cytoskeleton. The successive phases of (a) de Harven and Bernhard's 1956 discovery of the centriole's ultrastructure, (b) its identification with the basal body of the cilia and flagella, confirming the theory set out by Henneguy and von Lenhossek (1898–99), (c) the universal presence of microtubules in animal, vegetal and eukaryotic protist cells, (d) the polymerization-depolymerization-induced reversible transformations of the tubulin pool in mitosing cells (Inoue, 1960), (e) ultrastructural comparative studies of the mitotic apparatus of eukaryotes illustrating the Pickett-Heaps integrating concept of the MTOC (microtubule-organizing centre), and (f) the possibility of in vitro experiments on MTOCs or on microtubules, bring us to the present day, which has seen the focus placed on the concept of motor proteins (kinesin, dynein) and on cell cycle models. The latter are based on a close coincidence between the observable modifications of the mitotic apparatus and the periodic variations in intracellular concentrations of calcium or of certain enzymes (cyclins, Cdc2) during the main transitions of the cell cycle.
Alternatives in the History of Science. The paper deals with the function of the scientist's subjective activity in the research process. This is discussed against the background of the debate between action-at-a-distance and contiguous-action theories of electromagnetism in nineteenth-century physics. The analysis shows to what a high degree the protagonists of these theories (Weber, Maxwell) consciously regarded this situation as a bifurcation (an alternative) in the development of their science. The article then describes how the history of science has evaluated the case. The result of this evaluation depends on differing philosophical points of view. Finally, we point out some desiderata for further discussion on the methodology of science that would follow from acknowledging real bifurcations in scientific thought.
Professor Sir Roger Penrose's work, spanning fifty years of science, with over five thousand pages and more than three hundred papers, has been collected together for the first time and arranged chronologically over six volumes, each with an introduction from the author. Where relevant, individual papers also come with specific introductions or notes. Many important realizations concerning twistor theory occurred during the short period of this third volume, providing a new perspective on the way that mathematical features of the complex geometry of twistor theory relate to actual physical fields. Following on from the nonlinear graviton construction, a twistor construction was found for (anti-)self-dual electromagnetism allowing the general (anti-)self-dual Yang-Mills field to be obtained. It became clear that some features of twistor contour integrals could be understood in terms of holomorphic sheaf cohomology. During this period, the Oxford research group founded the informal publication, Twistor Newsletter. This volume also contains the influential Weyl curvature hypothesis and new forms of Penrose tiles.