The success of particle detection in high energy physics colliders critically depends on the criteria for selecting a small number of interactions from the overwhelming number that occur in the detector. It also depends on the selection of the exact data to be analyzed and on the techniques of analysis. The introduction of automation into the detection process has traded the direct involvement of the physicist at each stage of selection and analysis for the efficient handling of vast amounts of data. This tradeoff, in combination with the organizational changes in laboratories of increasing size and complexity, has resulted in automated and semi-automated systems of detection. Various aspects of the semi-automated regime were greatly diminished in more generic automated systems, but turned out to be essential to a number of surprising discoveries of anomalous processes that led to theoretical breakthroughs, notably the establishment of the Standard Model of particle physics. Automated systems are much more efficient at confirming specific hypotheses in narrow energy domains than at performing broad exploratory searches. Thus, in the main, detection processes relying excessively on automation are more likely to miss potential anomalies and impede potential theoretical advances. I suggest that putting substantially more effort into the study of electron–positron colliders, and increasing their funding, could minimize the likelihood of missing potential anomalies, because detection in such an environment, unlike detection in hadron colliders, can be handled by the semi-automated regime. Despite the virtually unavoidable excessive reliance on automated detection in hadron colliders, their development has been deemed a priority because they can operate at the highest currently achievable energy levels.
I suggest, however, that a focus on collisions at the highest achievable energy levels diverts funds from searches for potential anomalies overlooked due to tradeoffs at the previous energy thresholds. I also note that even in the same collision environment, different research strategies will opt for different tradeoffs and thus achieve different experimental outcomes. Finally, I briefly discuss current searches for anomalous processes in the context of the preceding analysis.
According to structural realism, in mature science there is structural continuity across theoretical change. A major counterexample to this thesis is the transition from the Eightfold Way to the Standard Model in particle physics. Nevertheless, the notion of structure is important for comprehending the theoretical picture of particle physics, where particles change and undergo transmutations, while the only thing that remains unchanged is the basic structure, i.e. the symmetry group which governs the transmutations. This kind of view agrees with the paradigmatic case where the structure is an internal symmetry and the instantiations are the elementary particles. The metaphysical view which reflects this situation is a version of ontic structuralism.
Does a world that contains chemistry entail the validity of both the standard model of elementary particle physics and general relativity, at least as effective theories? This article shows that the answer may very well be affirmative. It further suggests that the very existence of stable, spatially extended material objects, if not the very existence of the physical world, may require the validity of these theories.
The standard model of the quantum theory of measurement is based on an interaction Hamiltonian in which the observable to be measured is multiplied by some observable of a probe system. This simple Ansatz has proved extremely fruitful in the development of the foundations of quantum mechanics. While the ensuing type of models has often been argued to be rather artificial, recent advances in quantum optics have demonstrated their in-principle and practical feasibility. A brief historical review of the standard model, together with an outline of its virtues and limitations, is presented as an illustration of the mutual inspiration that has always taken place between foundational and experimental research in quantum physics.
The dominant view in the cognitive science of religion (the ‘Standard Model’) is that religious belief and behaviour are not adaptive traits but rather incidental byproducts of the cognitive architecture of mind. Because evidence for the Standard Model is inconclusive, the case for it depends crucially on its alleged methodological superiority to selectionist alternatives. However, we show that the Standard Model has both methodological and evidential disadvantages when compared with selectionist alternatives. We also consider a pluralistic approach, which holds that religion or various aspects of religion originated as byproducts of evolved cognitive structures but were subsequently co-opted for adaptive purposes. We argue that when properly formulated, the pluralistic approach also has certain advantages over the Standard Model.
This paper develops a means–end analysis of an inductive problem that arises in particle physics: how to infer from observed reactions conservation principles that govern all reactions among elementary particles. I show that there is a reliable inference procedure that is guaranteed to arrive at an empirically adequate set of conservation principles as more and more evidence is obtained. An interesting feature of reliable procedures for finding conservation principles is that in certain precisely defined circumstances they must introduce hidden particles. Among the reliable inductive methods there is a unique procedure that minimizes convergence time as well as the number of times that the method revises its conservation principles. Thus the aims of reliable, fast and steady convergence to an empirically adequate theory single out a unique optimal inference procedure for a given set of observed reactions, including prescriptions for when exactly to introduce hidden particles.
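The linear-algebraic core of this inductive problem can be made concrete. Encode each observed reaction as an integer vector over a fixed particle list (reactant counts minus product counts); a candidate conserved quantity is then any assignment of quantum numbers orthogonal to every observed reaction, i.e. a vector in the null space of the reaction matrix. The sketch below illustrates only this core idea, not the paper's actual procedure (which also covers hidden-particle introduction and convergence properties); the function name and the example reaction are illustrative.

```python
from fractions import Fraction

def nullspace(rows):
    """Return a basis of the rational null space of an integer matrix,
    given as a list of rows. Uses exact Gauss-Jordan elimination."""
    if not rows:
        return []
    m = [[Fraction(x) for x in r] for r in rows]
    ncols = len(m[0])
    pivots = []
    r = 0
    for c in range(ncols):
        pr = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if pr is None:
            continue  # no pivot in this column: it stays a free column
        m[r], m[pr] = m[pr], m[r]
        piv = m[r][c]
        m[r] = [x / piv for x in m[r]]
        for i in range(len(m)):
            if i != r and m[i][c] != 0:
                f = m[i][c]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        pivots.append(c)
        r += 1
        if r == len(m):
            break
    # one basis vector per free column
    free = [c for c in range(ncols) if c not in pivots]
    basis = []
    for fc in free:
        v = [Fraction(0)] * ncols
        v[fc] = Fraction(1)
        for ri, pc in enumerate(pivots):
            v[pc] = -m[ri][fc]
        basis.append(v)
    return basis

# Particle list [n, p, e-, antineutrino]; neutron beta decay
# n -> p + e- + antineutrino encoded as reactants minus products.
reactions = [[1, -1, -1, -1]]
basis = nullspace(reactions)
```

On this single observed reaction the null space is three-dimensional, and the computed basis contains the baryon-number assignment [1, 1, 0, 0]; every basis vector is orthogonal to the reaction, so each is an (empirically adequate, so far) conservation principle in the abstract's sense.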
This book is the first to offer a systematic account of the role of language in the development and interpretation of physics. An historical-conceptual analysis of the co-evolution of physics and mathematics leads to the classical/quantum interface. Bohr's interpretation is analyzed and extended to the interpretation of the standard model of particle physics.
I make an attempt at the description of the delicate role of the standard model of arithmetic for the syntax of formal systems. I try to assess whether the possible instability in the notion of finiteness deriving from the nonstandard interpretability of arithmetic affects the very notions of syntactic metatheory and of formal system. I maintain that the crucial point of the whole question lies in the evaluation of the phenomenon of formalization. The ideas of Skolem, Zermelo, Beth and Carnap (among others) on the problem are discussed. ‘A tries to explain to B the meaning of negation. Finally A gives up, saying: “You don’t understand what I mean, and I am not going to explain any longer,” to which B replies: “Yes, I see what you mean, and I am glad you are willing to continue your explanations”’. G. Mannoury, reported by E. W. Beth (Beth, 1963, 489).
This paper responds to a recent claim by Shrader-Frechette that current particle physics, with its essentially atomist paradigm, is in a state of Kuhnian crisis. We respond to Shrader-Frechette's claim in two ways: first, we argue directly against much of the evidence used by Shrader-Frechette as indicators of Kuhnian crisis; second, we question Shrader-Frechette's application of Kuhnian categories to current research in general, pointing out the dangers inherent in such an analysis.
But the question raised repeatedly in the news media was: What difference does it make? Who cares if the Top mass is 180 GeV or 120 GeV? What possible effect could it have on the "real world" of Medicare and rock stars and ethnic cleansing and Superbowls and insider trading? In this column we will present some ideas from a colloquium given recently at the University of Washington by Dr. Robert N. Cahn of Lawrence Berkeley National Laboratory which address these questions. We'll start by considering the Standard Model of particle physics.
Aquinas’s eudaimonism is normally interpreted as twofold in the sense that it divides into the imperfect, natural happiness of Aristotle and the perfect, supernatural happiness of Augustine. I argue in this work that Aquinas is logically committed to a third type of happiness that, in light of the standard view, renders his eudaimonism threefold. The paper begins with an overview of the standard twofold model of Aquinas’s eudaimonism; it then turns to the model’s logical problem, whose solution requires the postulation of a third type of happiness. In the second part of the paper, two clarificatory issues are addressed, several objections are considered, and in closing, I explain why Aquinas’s commitment to a third type of happiness offers the Christian wayfarer grounds for a new optimism.
The paper studies the nature of understanding in condensed matter physics (CMP), mediated by the successful employment of its models. I first consider two obvious candidates for the criteria of model-based understanding, Van Fraassen's sense of empirical adequacy and Hacking's instrumental utility, and conclude that both are unsatisfactory. Inspired by Hasok Chang's recent proposal to reformulate realism as the pursuit of ontological plausibility in our system of knowledge, we may require the model under consideration to be understood (or intelligible) before claiming model-based understanding. Here the understanding of a model typically consists of the following: figuring out at least one plausible (preferably realistic) physical mechanism for the model, determining the theoretical consequences of the model by mathematically probing it, and developing our physical intuitions about the model. I consider the q-state Potts model to illustrate. After having understood a model, we may employ the model to understand its target phenomena in the world. This is done by matching one of the interpretative models of the model with the central features of the phenomena. The matching should be motivated (ideally both theoretically and empirically) in the sense that we have good reason to believe that the central features of the phenomena can be thought of as having more or less the same structure as postulated by the interpretative model. In conclusion, I propose a two-stage account of model-based understanding in CMP: (1) understanding of a model and (2) matching a target phenomenon with a well-motivated interpretative model of the model. Empirical success and instrumental utility both play their roles in the evaluation of how successful the model is, but are not the essential part of model-based understanding.
We briefly describe in this paper the passage from Mendeleev’s chemistry (1869) to atomic physics (in the 1900s), nuclear physics (in 1932) and particle physics (from 1953 to 2006). We show how the consideration of symmetries, widely used in physics since the end of the 1920s, gave rise to a new format of the periodic table in the 1970s. More specifically, this paper is concerned with the application of the group SO(4,2)⊗SU(2) to the periodic table of chemical elements. It is shown how the Madelung rule of the atomic shell model can be used to set up a periodic table that can be further rationalized via the group SO(4,2)⊗SU(2) and some of its sub-groups. Qualitative results are obtained from this nonstandard table.
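The Madelung rule invoked here admits a compact statement: atomic subshells fill in order of increasing n + l, with ties broken by smaller n. A minimal sketch of that ordering follows (the function name and cutoff are illustrative, not from the paper; note that truncating at a maximum n omits higher shells that would interleave, e.g. 5s before 4d, so only the low-lying prefix of the output reflects the physical filling order):

```python
# Madelung (n + l) rule: order subshells by increasing n + l,
# breaking ties by increasing n.
SUBSHELL = "spdfghik"  # spectroscopic letters for l = 0, 1, 2, ...

def madelung_order(max_n):
    # all subshells (n, l) with 1 <= n <= max_n and 0 <= l <= n - 1
    orbitals = [(n, l) for n in range(1, max_n + 1) for l in range(n)]
    orbitals.sort(key=lambda nl: (nl[0] + nl[1], nl[0]))
    return [f"{n}{SUBSHELL[l]}" for n, l in orbitals]
```

For example, `madelung_order(4)` begins 1s, 2s, 2p, 3s, 3p, 4s, 3d, 4p, reproducing the familiar anomaly that 4s fills before 3d.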
The electron is conceived here as a complex structure composed of a subparticle that is bound to a nearly circular motion. Although in quantum mechanics the spin is not representable, in classical stochastic physics this corresponds to the angular momentum of the subparticle. In fact, assuming Schrödinger-type hydrodynamic equations of motion for the subparticle, the spin-1/2 representation in configuration space and the corresponding Pauli matrices for the electron are obtained. The Hamiltonian of Pauli's theory as the nonrelativistic limit of Dirac's equation is also derived.
Conventional particle theories such as the Standard Model have a number of freely adjustable coupling constants and mass parameters, depending on the symmetry algebra of the local gauge group and the representations chosen for the spinor and scalar fields. There seems to be no physical principle to determine these parameters as long as they stay within certain domains dictated by the renormalization group. Here however, reasons are given to demand that, when gravity is coupled to the system, local conformal invariance should be a spontaneously broken exact symmetry. The argument has to do with the requirement that black holes obey a complementarity principle relating ingoing observers to outside observers, or equivalently, initial states to final states. This condition fixes all parameters, including masses and the cosmological constant. We suspect that only examples can be found where these are all of order one in Planck units, but the values depend on the algebra chosen. This paper combines findings reported in two previous preprints (G. ’t Hooft in arXiv:1009.0669 [gr-qc], 2010; arXiv:1011.0061 [gr-qc], 2010) and puts these in a clearer perspective by shifting the emphasis towards the implications for particle models.
Experimental high-energy and nuclear physics was created in Spain thanks to Joaquín Catalá de Alemany, who founded the Institute of Corpuscular Physics (IFIC) at the University of Valencia in 1950. The physics of photographic emulsions, cheap and easy to manipulate, was well adapted to the depressed situation in Spain following the Civil War. This essay describes how, using these techniques, Catalá de Alemany created a group, established links with international laboratories, and fostered a tradition that continues today.
James Elkins has shaped the discussion about how we—as artists, as art historians, or as outsiders—view art. He has not only revolutionized our thinking about the purpose of teaching art, but has also blazed trails in creating a means of communication between scientists, artists, and humanities scholars. In Six Stories from the End of Representation, Elkins weaves stories about recent images from painting, photography, physics, astrophysics, and microscopy. These images, regardless of origin, all fail as representations: they are blurry, dark, pixellated, or otherwise unclear. In these opaque images, Elkins finds an opportunity to create stories that speak simultaneously to artists and to scientists, and to open both those fields to those of us who have little purchase in either. Regarding each image through the lens of the discipline that produced it, Elkins simultaneously affirms the unique structure of each way of viewing the world and brings those views together into a vibrant conversation.
If one starts from de Broglie's basic relativistic assumptions, i.e., that all particles have an intrinsic real internal vibration in their rest frame, i.e., hν₀ = m₀c², and that when they are at any one point in space-time the phase of this vibration cannot depend on the choice of the reference frame, then one can show (following Mackinnon (1)) that there exists a nondispersive wave packet of de Broglie's waves which can be assimilated to the nonlinear soliton wave U₀ introduced by him in his double solution model of wave mechanics. (2) Since de Broglie's linear pilot waves can be considered to be real waves propagating as collective motions on a covariant subquantum chaotic “aether,” (3) these new soliton waves can be considered as describing the particle's immediate neighborhood, i.e., the aether's reaction to the particle's motion in the stochastic interpretation of quantum mechanics. The existence of such a physical aether (which provides a perfectly causal interpretation of the action-at-a-distance implied by the Einstein-Podolsky-Rosen experiments) can now be proved by establishing the reality of de Broglie's waves in realizable experiments.
So far this has been a lonely and unrewarding quest. New experiments occasionally come along which point to a breakdown of the Standard Model, but up to now they have invariably been proved wrong by more careful analysis or subsequent experiments with better data. A case in point is the energetic jet data from the CDF experiment at Fermilab which suggested possible substructure of the quark. (See my AV column "Inside the Quark" in the September 1996 issue of Analog.) The CDF group found an unexpected excess of "jets" (clumps of energetic particles moving in the same direction) with energies above 200 GeV in their data. They found that they could not explain this excess of high energy jets using the Standard Model, as interpreted by standard theoretical procedures, and they pointed out that the data might represent new physics, possibly an indication that the quark is a composite object made of even more fundamental particles.
INTERNATIONAL STUDIES IN THE PHILOSOPHY OF SCIENCE Vol. 5, number 1, Autumn 1991, pp. 79-87. R.M. Nugayev. The fundamental laws of physics can tell the truth. Abstract. Nancy Cartwright’s arguments in favour of phenomenological laws and against fundamental ones are discussed. Her criticisms of the standard covering-law account are extended using Vyacheslav Stepin’s analysis of the structure of fundamental theories. It is argued that Cartwright’s thesis (that the laws of physics lie) is too radical to accept. A model of theory change is proposed which demonstrates how the fundamental laws of physics can, in fact, be confronted with experience.
This paper argues that a successful philosophical analysis of models and simulations must accommodate an account of mathematically rigorous results. Such rigorous results may be thought of as genuinely model-specific contributions, which can neither be deduced from fundamental theory nor inferred from empirical data. Rigorous results provide new indirect ways of assessing the success of models and simulations and are crucial to understanding the connections between different models. This is most obvious in cases where rigorous results map different models onto one another. Not only does this put constraints on the extent to which performance in specific empirical contexts may be regarded as the main touchstone of success in scientific modelling, it also allows for the transfer of warrant across different models. Mathematically rigorous results can thus come to be seen as not only strengthening the cohesion between scientific strategies of modelling and simulation, but also as offering new ways of indirect confirmation.
This paper discusses and provides a tentative model of a firm for purposes of accounting. The paper first presents the neo-classical capital circulation model of the firm, a model that has been an integral part of Finnish business economics and accounting education for at least half a century. During the same period the stakeholder model has become an alternative model of the firm in Scandinavia. These models have represented two alternative ways of defining the firm in education. In this paper we try to combine these two models to provide a more comprehensive picture of the firm, especially from the accounting education perspective. An earlier version of this model was presented in 1995 in a book edited by Juha Näsi. Stakeholders with business transactions were seen to be crucial to firms and were categorized as primary in the 1995 model. In addition to stakeholders, a number of stakeholder issues were illustrated in the stakeholder model serving accounting as an educational framework. In this paper the earlier model has been updated, taking into account the current globalization and other developments in the business world.
The author designed the Reasoning Analysis Test to provide empirical support for the CRM analysis of informal fallacies. While the results are informal, they provide presumptive evidence that those committing informal fallacies may tacitly reason as predicted by CRM. Davis has argued persuasively that Gricean theory has not lived up to expectations. In light of his critique, the CRM analyses of Begging the Question and Equivocation are amended. Johnson has provided standards for judging any theory of informal fallacies. It is argued that CRM survives the Standard Critique and sufficiently meets Johnson's other criteria.
There are two opposing traditions in contemporary quantum field theory (QFT). Mainstream Lagrangian QFT led to and supports the standard model of particle interactions. Algebraic QFT seeks to provide a rigorous, consistent mathematical foundation for field theory, but cannot accommodate the local gauge interactions of the standard model. Interested philosophers face a choice. They can accept algebraic QFT on the grounds of mathematical consistency and general accord with the semantic conception of theory interpretation. This suggests a rejection of particle ontology. Or they can accept the standard model on the grounds of its established success. This alternative, which I defend, suggests revising philosophical accounts of scientific theory and finding some way of accommodating particles. *Received December 2005; accepted April 2008. †To contact the author, please write to: 2045 Manzanita Drive, Oakland, CA 94611; e‐mail: email@example.com.
The rules of scientific discovery as formulated by K. Popper are briefly reviewed. Historical examples such as the prediction of planets and outstanding events in elementary particle physics are used to show how these rules are applied by the working physicist. Thus these rules are shown to be actual tools rather than abstract norms in the development of physics.
This paper is an amalgam of physics and mathematical logic. It contains an elementary axiomatization of spacetime in terms of the primitive concepts of particle, signal, and transmission and reception. In the elementary language formed with these predicates we state Axioms E, C, and U, which are naturally interpretable as basic physical properties of particles and signals. We then determine all mathematical models of this axiom system; these represent certain generalizations of the standard model. Also, the automorphism groups of the models are determined. Finally we give another physical model and discuss the philosophical implications.
Sea ice contains flaws including frictional contacts. We aim to describe quantitatively the mechanics of those contacts, providing local physics for geophysical models. With a focus on the internal friction of ice, we review standard micro-mechanical models of friction. The solid's deformation under normal load may be ductile or elastic. The shear failure of the contact may be by ductile flow, brittle fracture, or melting and hydrodynamic lubrication. Combinations of these give a total of six rheological models. When the material under study is ice, several of the rheological parameters in the standard models are not constant, but depend on the temperature of the bulk, on the normal stress under which samples are pressed together, or on the sliding velocity and acceleration. This has the effect of making the shear stress required for sliding dependent on sliding velocity, acceleration, and temperature. In some cases, it also perturbs the exponent in the normal-stress dependence of that shear stress away from the value that applies to most materials. We unify the models by a principle of maximum displacement for normal deformation, and of minimum stress for shear failure, reducing the controversy over the mechanism of internal friction in ice to the choice of values of four parameters in a single model. The four parameters represent, for a typical asperity contact, the sliding distance required to expel melt-water, the sliding distance required to break contact, the normal strain in the asperity, and the thickness of any ductile shear zone.
In recent years, the ontological similarities between the foundations of quantum mechanics and the emptiness teachings in Madhyamika–Prasangika Buddhism of the Tibetan lineage have attracted some attention. After briefly reviewing this unlikely connection, I examine ideas encountered in condensed-matter physics that resonate with this view on emptiness. Focusing on the particle concept and emergence in condensed-matter physics, I highlight a qualitative correspondence to the major analytical approaches to emptiness.
Weak Quantum Theory (WQT) and the Model of Pragmatic Information (MPI) are two psychophysical concepts developed on the basis of quantum physics. The present study contributes to their empirical examination. The issue of the study is whether WQT and MPI can not only explain ‘psi’-phenomena theoretically but also prove to be consistent with the empirical phenomenology of extrasensory perception (ESP). From the main statements of both models, 33 deductions for psychic readings are derived. Psychic readings are defined as settings in which psychics support or counsel clients by using information not mediated through the five senses. A qualitative approach is chosen to explore how the psychics experience extrasensory perceptions. Eight psychics are interviewed with a half-structured method. The reports are examined regarding deductive and inductive aspects, using a multi-level structured content analysis. The vast majority of deductions are clearly confirmed by the reports. Even though the study has to be seen as an explorative attempt with many aspects to be specified, WQT and MPI prove to be coherent and helpful concepts to explain ESP in psychic readings.
The Cartan equations defining simple spinors (renamed “pure” by C. Chevalley) are interpreted as equations of motion in compact momentum spaces, in a constructive approach in which at each step the dimensions of spinor space are doubled while those of momentum space are increased by two. The construction is possible only in the frame of the geometry of simple or pure spinors, which imposes constraint equations on spinors with more than four components; the momentum spaces then turn out to be compact, isomorphic to invariant-mass spheres imbedded in each other, since the signatures remain Lorentzian, starting from dimension four (Minkowski) up to dimension ten with Clifford algebra ℂℓ(1, 9), where the construction naturally ends. The equations of motion met in the construction are most of those traditionally postulated ad hoc: from Weyl equations for neutrinos (and Maxwell's) to Majorana ones, to those for the electroweak model and for the nucleons interacting with the pseudoscalar pion, up to those for the 3 baryon-lepton families, steadily progressing from the description of lower energy phenomena to that of higher ones. The 3 division algebras (complex numbers, quaternions and octonions) appear to be strictly correlated with Clifford algebras and then with this spinor-geometrical approach, from which they appear to gradually emerge in the construction, where they play a basic role for the physical interpretation: at the third step complex numbers generate U(1), a possible origin of the electric charge and of the existence of charged–neutral fermion pairs, which also easily explains the opposite charges of proton and electron. Another U(1) appears to generate the strong charge at the fourth step.
Quaternions generate the signature of space-time at the first step, the SU(2) internal symmetry of isospin and, in the gauge term, the SU(2)_L one of the electroweak model at the third step; they are also at the origin of the 3 families, equal in number to the quaternion imaginary units. At the fifth and last step octonions generate the SU(3) internal symmetry of flavour, with SU(2) isospin subgroup and, in the gauge term, the one of color, correlated with the SU(2)_L of the electroweak model. These 3 division algebras seem then to be at the origin of charges, families and the groups of the Standard Model. In this approach there seems to be no need of higher dimensional (>4) space-time, here generated by the four Poincaré translations, and dimensional reduction from ℂℓ(1,9) to ℂℓ(1,3) is equivalent to decoupling of the equations of motion. This spinor-geometrical approach is compatible with that based on strings, since these may be expressed bilinearly (as integrals) in terms of Majorana–Weyl simple or pure spinors which are admitted by ℂℓ(1, 9) = R(32).
Hodgkin and Huxley’s model of the action potential is an apparent dream case of covering‐law explanation in biology. The model includes laws of physics and chemistry that, coupled with details about antecedent and background conditions, can be used to derive features of the action potential. Hodgkin and Huxley insist that their model is not an explanation. This suggests either that subsuming a phenomenon under physical laws is insufficient to explain it or that Hodgkin and Huxley were wrong. I defend Hodgkin and Huxley against Weber’s heteronomy thesis and argue that explanations are descriptions of mechanisms. †To contact the author, please write to: Department of Philosophy, Philosophy‐Neuroscience‐Psychology Program, Washington University in St. Louis, One Brookings Drive, Wilson Hall, St. Louis, MO 63130; e‐mail: firstname.lastname@example.org.
In this paper, I examine the claim that any physical theory will have an extremely limited domain of application because 1) we have to use distinct theories to model different situations in the world and 2) no theory has enough textbook models to handle anything beyond a highly simplified situation. Against the first claim, I show that many examples used to bolster it are actually instances of application of the very same classical theory rather than disjoint theories. Thus, there is a hidden unity to the world of classical physics that is usually overlooked (by, for example, Nancy Cartwright, who argues for the claims above). Against the second claim, I show that the practice of classical physics involves an enormous (infinite) number of models the use of which cannot be written off as merely ad hoc. Thus, although classical physics cannot, of course, model every situation in nature, it has a much larger domain than some would have us believe.
We extend Barut’s classical model of zitterbewegung from 3+1 dimensional spacetime into 2+1 and 1+1 dimensional spacetimes and discuss the symmetry and integrability properties of the model in 2+1, 1+1 and 3+1 dimensions. In these cases, the free particle current, or the velocity of the particle, can be decomposed into a constant convection current and polarization currents. In 2+1 dimensional spacetime, the velocity of the particle and the spin tensor depend on each other, and chirality cannot be introduced. The free particle has 7 constants of motion: the momentum three-vector, the charge, the energy in proper time, the scalar constant spin or magnetic polarization, and the two components of total angular momentum. The two components of electric polarization oscillate with the zitterbewegung frequency. In 1+1 dimensional spacetime we have an independent velocity vector and a scalar spin tensor. The free particle has 5 integrals of motion: the momentum two-vector, the charge, the energy in proper time, and the scalar total angular momentum. The normal component of the velocity, or the scalar electric polarization, oscillates with the zitterbewegung frequency. In 3+1 dimensional spacetime, the particle has an independent velocity vector, spin tensor and chirality. The free particle has 12 integrals of motion: the momentum four-vector, the charge, the energy in proper time or mass, the three-vector spin or magnetic polarizations, and the three components of total angular momentum. The component of velocity parallel to the momentum and the normal components of the spin tensor, or the spin three-vector, are constants of motion for the free particle. The chirality and electric polarizations oscillate with the zitterbewegung frequency. The system is superintegrable in all dimensions.
In this paper it is proved exactly that the standard transformations of the three-dimensional (3D) vectors of the electric and magnetic fields E and B are not relativistically correct transformations. Hence the 3D vectors E and B are not well-defined quantities in 4D spacetime and, contrary to the general belief, the usual Maxwell equations with the 3D E and B are not in agreement with special relativity. The 4-vectors E^a and B^a, as well-defined 4D quantities, are introduced in place of the ill-defined 3D E and B. The proof is given in both the tensor and the Clifford-algebra formalisms.
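For reference, the "standard transformations" at issue are the textbook boost rules for the 3D fields, written here for a boost with velocity v and components parallel and perpendicular to v:

```latex
\begin{aligned}
\mathbf{E}'_{\parallel} &= \mathbf{E}_{\parallel}, &
\mathbf{E}'_{\perp} &= \gamma\,(\mathbf{E} + \mathbf{v}\times\mathbf{B})_{\perp},\\
\mathbf{B}'_{\parallel} &= \mathbf{B}_{\parallel}, &
\mathbf{B}'_{\perp} &= \gamma\,\Bigl(\mathbf{B} - \tfrac{1}{c^{2}}\,\mathbf{v}\times\mathbf{E}\Bigr)_{\perp},
\end{aligned}
\qquad
\gamma = \frac{1}{\sqrt{1 - v^{2}/c^{2}}}.
```

It is these rules that the paper argues fail to be relativistically correct.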
A generalization of the familiar de Broglie-Bohm interpretation of quantum mechanics is formulated, based on relinquishing the momentum relationship p=∇S and allowing a spread of momentum values at each position. The development of this framework also provides a new perspective on the well-known question of joint distributions for quantum mechanics. It is shown that, for an extension of the original model to be physically acceptable and consistent with experiment, it is necessary to impose certain restrictions on the associated joint distribution for particle positions and momenta. These requirements thereby define a new class of possible models. In pursuing this line of reasoning, the main contributions of this paper are (i) to identify the restrictions that must be imposed, (ii) to demonstrate that joint distribution expressions satisfying them do exist, and (iii) to construct a sample model based on one such joint distribution.
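For orientation, the momentum relationship p=∇S that this generalization relinquishes arises from the standard polar (Madelung) decomposition of the wave function:

```latex
% Polar decomposition of the wave function (R, S real-valued)
\psi(\mathbf{x},t) = R(\mathbf{x},t)\, e^{iS(\mathbf{x},t)/\hbar},
```

in which the de Broglie-Bohm guidance condition assigns each particle the momentum \(\mathbf{p} = \nabla S(\mathbf{x},t)\), with positions distributed according to \(\rho = R^{2} = |\psi|^{2}\).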
The rising interest, in the late 20th century, in the foundations of quantum physics, a subject in which Franco Selleri has excelled, suggests a fair question: how did this come about? The current answer is that experimental advances made it possible to bring into the laboratory some earlier gedanken experiments, beginning with those about EPR and related to Bell’s inequalities. I want to explore an alternative view, according to which there was, before the experimental tests of Bell’s inequalities, a change in the views shared by physicists concerning the intellectual status of that issue. I take three cases which serve as the threads of our story: the connections between Bohm’s causal interpretation and Bell’s inequalities; Wigner’s ideas on the measurement problem; and finally Everett’s relative-states formulation. In the end, I discuss how those threads were gathered together in the creation of the foundations of quantum physics as a field of research.
A model for the structure of point-like fermions as tightly bound composite states is described. The model is based upon the premise that electromagnetism is the only fundamental interaction. The fundamental entity of the model is an object called the vorton. Vortons are semiclassical monopole configurations of electromagnetic charge and field, constructed to satisfy Maxwell's equations. Vortons carry topological charge and one unit each of two different kinds of angular momenta, and are placed in magnetically bound pair states having angular momentum l=1/2. The topological charge prevents the mutual annihilation of the vorton pair. The helicity eigenstates of the vortons' intrinsic angular momenta form the basis for a set of internal quantum numbers for the pair which distinguish the different (point-like) pair states. Sixteen four-component spinor states, eight leptonic and eight hadronic, are obtained. Eleven of these are identified with the quantum numbers of the experimentally known particles: e, ve, μ, vμ, τ, vτ; p, n, Λ, Λc, and b. Thus one new heavy lepton with its neutrino and three new quark states are predicted. Some possibilities for the extension of this model are discussed.
Gauge theories have provided our most successful representations of the fundamental forces of nature. How, though, do such representations work? Interpretations of gauge theory aim to answer this question. Through understanding how a gauge theory's representations work, we are able to say what kind of world our gauge theories reveal to us. A gauge theory's representations are mathematical structures. These may be transformed among themselves while certain features remain the same. Do the representations related by such a gauge transformation merely offer alternative ways of representing the very same situation? If so, then gauge symmetry is a purely formal property, since it reflects no corresponding symmetry in nature. Gauging What's Real describes the representations provided by gauge theories in both classical and quantum physics. Richard Healey defends the thesis that gauge transformations are purely formal symmetries of almost all the classes of representations provided by each of our theories of fundamental forces. He argues that evidence for classical gauge theories of forces (other than gravity) gives us reason to believe that loops rather than points are the locations of fundamental properties. In addition to exploring the prospects of extending this conclusion to the quantum gauge theories of the Standard Model of elementary particle physics, Healey assesses the difficulties faced by attempts to base such ontological conclusions on the success of these theories.