Analogies between classical statistical mechanics and quantum field theory played a pivotal role in the development of renormalization group methods for application in the two theories. This paper focuses on the analogies that informed the application of RG methods in QFT by Kenneth Wilson and collaborators in the early 1970s. The central task accomplished is the identification and analysis of the analogical mappings employed. The conclusion is that the analogies in this case study are formal analogies, and not physical analogies. That is, the analogical mappings relate elements of the models that play formally analogous roles and that have substantially different physical interpretations. Unlike other cases of the use of analogies in physics, the analogical mappings do not preserve causal structure. The conclusion that the analogies in this case are purely formal carries important implications for the interpretation of QFT, and poses challenges for philosophical accounts of analogical reasoning and arguments in defence of scientific realism. Analysis of the interpretation of the cutoffs is presented as an illustrative example of how physical disanalogies block the exportation of physical interpretations from statistical mechanics to QFT. A final implication is that the application of RG methods in QFT supports non-causal explanations, but in a different manner than in statistical mechanics.
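To make the kind of mapping at issue concrete, the standard dictionary between a classical statistical-mechanical system and a Euclidean quantum field theory pairs quantities that play the same formal role while carrying quite different physical interpretations (a schematic illustration in textbook notation, not the paper's own table):

$$e^{-\beta H[\phi]} \;\leftrightarrow\; e^{-S_E[\phi]}, \qquad \xi \;\leftrightarrow\; \frac{1}{m}, \qquad T - T_c \;\leftrightarrow\; m_0^2 - m_{0,c}^2,$$

where the correlation length $\xi$ (a distance over which fluctuations are correlated) corresponds to the inverse renormalized mass $m$ (a particle property). The formal roles match exactly; the physical interpretations do not, which is the sense in which such analogies fail to preserve causal structure.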
Renormalization group explanations account for the astonishing phenomenon that microscopically very different physical systems display the same macro-behavior when undergoing phase transitions. Among philosophers, this explanandum phenomenon is often described as the occurrence of a particular kind of multiply realized macro-behavior. In several recent publications, Robert Batterman denies that RG explanations account for this explanandum phenomenon by following the commonality strategy, i.e. by identifying properties that microscopically very different physical systems have in common. Arguing against Batterman’s claim, I defend the view that RG explanations are in accord with the commonality strategy.
This is an introduction to renormalization group methods in quantum field theory aimed at philosophers of science. We review path integral methods, the relationship between early renormalization theory and renormalization group methods, and conceptual shifts in thinking about quantum field theory spurred by the development of renormalization group methods.
We have two aims. The main one is to expound the idea of renormalization in quantum field theory, with no technical prerequisites. Our motivation is that renormalization is undoubtedly one of the great ideas, and great successes, of twentieth-century physics. It has also strongly influenced, in diverse ways, how physicists conceive of physical theories. So it is of considerable philosophical interest. Second, we will briefly relate renormalization to Ernest Nagel's account of inter-theoretic relations, especially reduction. One theme will be a contrast between two approaches to renormalization. The old approach, which prevailed from ca. 1945 to 1970, treated renormalizability as a necessary condition for being an acceptable quantum field theory. On this approach, it is a piece of great good fortune that high energy physicists can formulate renormalizable quantum field theories that are so empirically successful. But the new approach to renormalization explains why the phenomena we see, at the energies we can access in our particle accelerators, are described by a renormalizable quantum field theory. For whatever non-renormalizable interactions may occur at yet higher energies, they are insignificant at accessible energies. Thus the new approach explains why our best fundamental theories have a feature, viz. renormalizability, which the old approach treated as a selection principle for theories. That is worth saying since philosophers tend to think of scientific explanation as only explaining an individual event, or perhaps a single law, or at most deducing one theory as a special case of another. Here we see a framework in which there is a space of theories. And this framework is powerful enough to deduce that what seemed “manna from heaven” is to be expected: the good fortune is generic. We also maintain that universality, a concept stressed in renormalization theory, is essentially the familiar philosophical idea of multiple realizability; and that it causes no problems for reductions of a Nagelian kind.
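The quantitative core of the “new approach” explanation can be put in one line of standard effective-field-theory power counting (a hedged sketch in common notation, not the authors' own):

$$\mathcal{L}_{\text{eff}} \;=\; \mathcal{L}_{\text{renormalizable}} \;+\; \sum_i \frac{c_i}{\Lambda^{\,d_i-4}}\,\mathcal{O}_i, \qquad d_i > 4,$$

so a non-renormalizable operator of mass dimension $d_i$ contributes to amplitudes at energy $E$ only at relative order $(E/\Lambda)^{d_i-4}$. With $\Lambda$ near the Planck scale ($\sim 10^{19}$ GeV) and accelerator energies $E \sim 10^{3}$ GeV, a dimension-6 operator is suppressed by roughly $(E/\Lambda)^2 \approx 10^{-32}$; accessible physics therefore looks renormalizable whatever happens at higher energies.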
Providing a precise statement of their position has long been a central challenge facing the scientific realist. This paper draws some morals about how realism ought to be formulated from the renormalization group framework in high energy physics.
One realist response to the pessimistic meta-induction distinguishes idle theoretical wheels from aspects of successful theories we can expect to persist and espouses realism about the latter. Implementing the response requires a strategy for identifying the distinguished aspects. The strategy I will call renormalization group realism has the virtue of directly engaging the gears of our best current physics—perturbative quantum field theories. I argue that the strategy, rather than disarming the skeptical possibilities evinced by the pessimistic meta-induction, forces them to retreat a level. I also suggest that those skeptical possibilities continue to carry force.
It is commonly claimed that the universality of critical phenomena is explained through particular applications of the renormalization group. This article has three aims: to clarify the structure of the explanation of universality, to discuss the physics of such RG explanations, and to examine the extent to which universality is thus explained. The derivation of critical exponents proceeds via a real-space or a field-theoretic approach to the RG. Building on work by Mainwood, this article argues that these approaches ought to be distinguished: while the field-theoretic approach explains universality, the real-space approach fails to provide an adequate explanation.
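For readers who want the shape of such a derivation, the standard schema runs as follows (a generic sketch, neutral between the real-space and field-theoretic variants discussed in the article). Near a fixed point $K^*$ of the RG map $K' = R_b(K)$ with rescaling factor $b$, linearize the flow:

$$K'_i - K^*_i \;\approx\; \sum_j T_{ij}\,\bigl(K_j - K^*_j\bigr), \qquad T_{ij} \;=\; \left.\frac{\partial R_i}{\partial K_j}\right|_{K^*},$$

with eigenvalues $\lambda_\alpha = b^{\,y_\alpha}$. The relevant thermal eigenvalue $y_t$ fixes the correlation-length exponent $\nu = 1/y_t$, and all systems flowing to the same fixed point share these exponents; how much of universality this schema genuinely explains is exactly what the article assesses.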
Renormalization Scrutinized. Sébastien Rivat - 2019 - Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 68:23-39.
In this paper, I propose a general framework for understanding renormalization by drawing on the distinction between effective and continuum Quantum Field Theories (QFTs), and offer a comprehensive account of perturbative renormalization on this basis. My central claim is that the effective approach to renormalization provides a more physically perspicuous, conceptually coherent and widely applicable framework to construct perturbative QFTs than the continuum approach. I also show how a careful comparison between the two approaches: (i) helps to dispel the mystery surrounding the success of the renormalization procedure; (ii) clarifies the various notions of renormalizability; and (iii) gives reasons to temper Butterfield and Bouatta's claim that some continuum QFTs are ripe for metaphysical inquiry (Butterfield & Bouatta, 2014).
The renormalization group has been characterized as merely a coarse-graining procedure that does not illuminate the microscopic content of quantum field theory but merely gets us from that content, as given by axiomatic QFT, to macroscopic predictions. I argue that in the constructive field theory tradition, RG techniques do illuminate the microscopic dynamics of a QFT, which are not automatically given by axiomatic QFT. RG techniques in constructive field theory are also rigorous, so one cannot object to their foundational import on grounds of lack of rigor.
This paper discusses the alleged reduction of Thermodynamics to Statistical Mechanics. It includes an historical discussion of J. Willard Gibbs' famous caution concerning the connections between thermodynamic properties and statistical mechanical properties, his so-called "Thermodynamic Analogies." The reasons for Gibbs' caution are reconsidered in light of relatively recent work in statistical physics on the existence of the thermodynamic limit and the explanation of critical behavior using the renormalization group apparatus. A probabilistic understanding of the renormalization group arguments allows for a kind of unification of Gibbs' approach with contemporary understanding of the reduction problem.
We revisit the construction of the gravitational functional renormalization group equation tailored to the Arnowitt–Deser–Misner formulation, emphasizing its connection to the covariant formulation. The results obtained from projecting the renormalization group flow onto the Einstein–Hilbert action are reviewed in detail and we provide a novel example illustrating how the formalism may be connected to the causal dynamical triangulations approach to quantum gravity.
Discussions of the foundations of Classical Equilibrium Statistical Mechanics (SM) typically focus on the problem of justifying the use of a certain probability measure (the microcanonical measure) to compute average values of certain functions. One would like to be able to explain why the equilibrium behavior of a wide variety of distinct systems (different sorts of molecules interacting with different potentials) can be described by the same averaging procedure. A standard approach is to appeal to ergodic theory to justify this choice of measure. A different approach, eschewing ergodicity, was initiated by A. I. Khinchin. Both explanatory programs have been subjected to severe criticisms. This paper argues that the Khinchin type program deserves further attention in light of relatively recent results in understanding the physics of universal behavior.
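For concreteness, the measure at issue is the microcanonical one; in standard notation (added here for orientation, not drawn from the paper) the equilibrium value of a phase function $f$ is

$$\langle f \rangle_{\text{mc}} \;=\; \frac{\int d\Gamma\; f(\Gamma)\,\delta\bigl(H(\Gamma)-E\bigr)}{\int d\Gamma\; \delta\bigl(H(\Gamma)-E\bigr)},$$

a uniform average over the energy surface $H = E$. The explanatory question is why this single averaging rule succeeds for systems with very different Hamiltonians $H$, which is where both the ergodic and the Khinchin programs enter.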
This paper investigates the consequences for our understanding of physical theories as a result of the development of the renormalization group. Kadanoff's assessment of these consequences is discussed. What he called the "extended singularity theorem" poses serious difficulties for philosophical interpretation of theories. Several responses are discussed. The resolution demands a philosophical rethinking of the role of mathematics in physical theorizing.
I argue against the currently prevalent view that algebraic quantum field theory (AQFT) is the correct framework for philosophy of quantum field theory and that “conventional” quantum field theory (CQFT), of the sort used in mainstream particle physics, is not suitable for foundational study. In doing so, I defend the position that AQFT and CQFT should be understood as rival programs to resolve the mathematical and physical pathologies of renormalization theory, and that CQFT has succeeded in this task and AQFT has failed. I also defend CQFT from recent criticisms made by Doreen Fraser.
Fearful Symmetry brings the incredible discoveries of contemporary physics within everyone's grasp. A. Zee, a distinguished physicist and skillful expositor, tells the exciting story of how today's theoretical physicists are following Einstein in their search for the beauty and simplicity of Nature. Animated by a sense of reverence and whimsy, the book describes the majestic sweep and accomplishments of twentieth-century physics. In the end, we stand in awe before the grand vision of modern physics: one of the greatest chapters in the intellectual history of humankind.
We explicate recent results that shed light on the obscure and troubling problem of renormalization in Quantum Field Theory (QFT). We review how divergent predictions arise in perturbative QFT, and how they are renormalized into finite quantities. Commentators have worried that there is no foundation for renormalization, and hence that QFTs are not logically coherent. We dispute this by describing the physics behind liquid diffusion, in which exactly analogous divergences are found and renormalized. But now we are looking at a problem that is physically and formally well-defined, proving that the problems of renormalization, by themselves, cannot refute QFT.
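To illustrate the pattern of divergence and removal (a textbook $\varphi^4$ example added for concreteness; the paper's own worked case is liquid diffusion): the one-loop four-point amplitude grows logarithmically with the cutoff $\Lambda$, and the divergence is absorbed into the coupling measured at a scale $\mu$,

$$\lambda_{\text{phys}}(\mu) \;=\; \lambda_0 \;-\; \frac{3\lambda_0^2}{16\pi^2}\,\ln\frac{\Lambda}{\mu} \;+\; O(\lambda_0^3).$$

Holding $\lambda_{\text{phys}}$ fixed while $\Lambda \to \infty$ pushes the divergence into the unobservable bare coupling $\lambda_0$; relations among observables at different scales remain finite.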
In this paper we pay attention to the inconsistency in the derivation of the symmetric electromagnetic energy–momentum tensor for a system of charged particles from its canonical form, when the homogeneous Maxwell's equations are applied to the symmetrizing gauge transformation, while the non-homogeneous Maxwell's equations are used to obtain the motional equation. Applying the appropriate non-homogeneous Maxwell's equations to both operations, we obtained an additional symmetric term in the tensor, which we named the “compensating term”. Analyzing the structure of this “compensating term”, we suggested a method of “gauge renormalization”, which allows the divergent terms of classical electrodynamics (the infinite self-force, self-energy and self-momentum) to be transformed into convergent integrals. The motional equation obtained for a non-radiating charged particle does not contain its self-force, and the mass parameter includes the sum of mechanical and electromagnetic masses. The motional equation for a radiating particle also contains the sum of mechanical and electromagnetic masses, and does not yield any “runaway solutions”. It has been shown that the energy flux in a free electromagnetic field is guided by the Poynting vector, whereas the energy flux in a bound electromagnetic field is described by the generalized Umov's vector, defined in the paper. The problem of electromagnetic momentum is also examined.
Kenneth Wilson won the Nobel Prize in Physics in 1982 for applying the renormalization group, which he learnt from quantum field theory (QFT), to problems in statistical physics—the induced magnetization of materials (ferromagnetism) and the evaporation and condensation of fluids (phase transitions). See Wilson (1983). The renormalization group got its name from its early applications in QFT. There, it appeared to be a rather ad hoc method of subtracting away unwanted infinities. The further allegation was that the procedure is so horrendously complicated that one cannot see the forest for the trees. The second allegation is justified in the applications that made it famous. But it is not true of the following example, which appears in Chowdhury and Stauffer (2000, 486-488).
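The example from Chowdhury and Stauffer itself is not reproduced in this excerpt. As a stand-in of the same flavor, here is perhaps the simplest real-space RG computation there is, decimation of the one-dimensional Ising chain (a hedged illustration; the cited textbook example may differ): summing out every other spin maps the dimensionless coupling K = J/k_BT to K' = ½ ln cosh 2K, and iterating the map exhibits the flow.

```python
import math

def decimate(K: float) -> float:
    """One real-space RG step for the 1D Ising chain: tracing out
    every other spin renormalizes the coupling K = J/(k_B T) to
    K' = 0.5 * ln(cosh(2K))."""
    return 0.5 * math.log(math.cosh(2.0 * K))

# Iterate the map from an illustrative starting coupling.
K = 1.0
for step in range(8):
    print(f"step {step}: K = {K:.6f}")
    K = decimate(K)
# K flows monotonically to the trivial fixed point K = 0, the RG
# signature of the absence of a finite-temperature phase transition
# in one dimension.
```

The point echoed in the abstract: in simple cases the RG recursion is entirely transparent, not “horrendously complicated”.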
Further arguments are offered in defence of the position that the variant of quantum field theory (QFT) that should be subject to interpretation and foundational analysis is axiomatic quantum field theory. I argue that the successful application of renormalization group (RG) methods within alternative formulations of QFT illuminates the empirical content of QFT, but not the theoretical content. RG methods corroborate the point of view that QFT is a case of the underdetermination of theory by empirical evidence. I also urge caution in extrapolating interpretive conclusions about QFT from the application of RG methods in other contexts (e.g., condensed matter physics). This paper replies to criticisms advanced by David Wallace, but aims to be self-contained.
A book on the notion of fundamental length, covering issues in the philosophy of mathematics, metaphysics, and the history and philosophy of modern physics, from classical electrodynamics to current theories of quantum gravity. Published in 2014 by Cambridge University Press.
An effective theory in physics is one that is supposed to apply only at a given length scale; the framework of effective field theory (EFT) describes a ‘tower’ of theories each applying at different length scales, where each ‘level’ up is a shorter-scale theory. Owing to subtleties regarding the use and necessity of EFTs, a conception of emergence defined in terms of reduction is irrelevant. I present a case for decoupling emergence and reduction in the philosophy of physics. This paper develops a positive conception of emergence, based on the novelty and autonomy of the ‘levels’, by considering physical examples, involving critical phenomena, the renormalisation group, and symmetry breaking. This positive conception of emergence is related to underdetermination and universality, but, I argue, is preferable to other accounts of emergence in physics that rely on universality.
The paper is concerned with explaining some of the principal theoretical developments in elementary particle physics and discussing the associated methodological problems both in respect of heuristics and appraisal. Particular reference is made to relativistic quantum field theory, renormalization, Feynman diagram techniques, the analytic S-matrix and the Chew–Frautschi bootstrap.
The debate between Fraser and Wallace over the foundations of quantum field theory has spawned increased focus on both the axiomatic and conventional formalisms. The debate has set the tone for future foundational analysis, and has forced philosophers to “pick a side”. The two are seen as competing research programs, and the major divide between the two manifests in how each handles renormalization. In this paper I argue that the terms set by the Fraser-Wallace debate are misleading. AQFT and CQFT should be viewed as complementary formalisms that start from the same physical basis. Further, the focus on cutoffs as demarcating the two approaches is also highly misleading. Though their methods differ, both axiomatic and conventional QFT seek to use the same physical principles to explain the same domain of phenomena.
The author, who was Schrödinger's assistant during his last years in Vienna, gives an account of Schrödinger's views and activities during that time which led him to a different approach to research on the relations between gravitation and quantum phenomena. Various features of past research are outlined in nontechnical terms. A heuristic argument is presented for the role of the zero-point energy of massive particles in counteracting gravitational collapse and the formation of horizons. Arguments are presented for the view that progress in describing extreme gravitational phenomena can be achieved by the new outlook obtained from the introduction of the analog of Maxwell's vacuum displacement term with a quasiconstant parameter, rather than from renormalization of special processes, even if this is successful. The results can be expected to be in accord with Schrödinger's conjectures. A physical interpretation for the change of sign of the differential invariant of Karlhede, Lindström, and Åman at the horizon is suggested. Some important historical details about Schrödinger are touched upon.
One striking feature of the contemporary modelling practice is its interdisciplinary nature. The same equation forms, and mathematical and computational methods, are used across different disciplines, as well as within the same discipline. Are there, then, differences between intra- and interdisciplinary transfer, and can the comparison between the two provide more insight on the challenges of interdisciplinary theoretical work? We will study the development and various uses of the Ising model within physics, contrasting them to its applications to socio-economic systems. While renormalization group methods justify the transfer of the Ising model within physics – by ascribing the systems modelled to the same universality class – its application to socio-economic phenomena has no such theoretical grounding. As a result, the insights gained by modelling socio-economic phenomena by the Ising model may remain limited.
A venerable tradition in the metaphysics of science commends ontological reduction: the practice of analysis of theoretical entities into further and further proper parts, with the understanding that the original entity is nothing but the sum of these. This tradition implicitly subscribes to the principle that all the real action of the universe (also referred to as its "causation") happens at the smallest scales, at the scale of microphysics. A vast majority of metaphysicians and philosophers of science, covering a wide swath of the spectrum from reductionists to emergentists, defend this principle. It provides one pillar of the most prominent theory of science, to the effect that the sciences are organized in a hierarchy, according to the scales of measurement occupied by the phenomena they study. On this view, the fundamentality of a science is reckoned inversely to its position on that scale. This venerable tradition has been justly and vigorously countered, most notably in physics: it is countered in quantum theory, in theories of radiation and superconduction, and most spectacularly in renormalization theories of the structure of matter. But these counters, and the profound revisions they prompt, lie just below the philosophical radar. This book illuminates these counters to the tradition's principle, in order to assemble them in support of a vaster (and at its core Aristotelian) philosophical vision of sciences that are not organized within a hierarchy. In so doing, the book articulates the principle that the universe is active at absolutely all scales of measurement. This vision, as the book shows, is warranted by philosophical treatment of cardinal issues in the philosophy of science: fundamentality, causation, scientific innovation, dependence and independence, and the proprieties of explanation.
This paper looks at emergence in physical theories and argues that an appropriate way to understand so-called “emergent protectorates” is via the explanatory apparatus of the renormalization group. It is argued that mathematical singularities play a crucial role in our understanding of at least some well-defined emergent features of the world.
Conventional particle theories such as the Standard Model have a number of freely adjustable coupling constants and mass parameters, depending on the symmetry algebra of the local gauge group and the representations chosen for the spinor and scalar fields. There seems to be no physical principle to determine these parameters as long as they stay within certain domains dictated by the renormalization group. Here however, reasons are given to demand that, when gravity is coupled to the system, local conformal invariance should be a spontaneously broken exact symmetry. The argument has to do with the requirement that black holes obey a complementarity principle relating ingoing observers to outside observers, or equivalently, initial states to final states. This condition fixes all parameters, including masses and the cosmological constant. We suspect that only examples can be found where these are all of order one in Planck units, but the values depend on the algebra chosen. This paper combines findings reported in two previous preprints (G. ’t Hooft in arXiv:1009.0669 [gr-qc], 2010; arXiv:1011.0061 [gr-qc], 2010) and puts these in a clearer perspective by shifting the emphasis towards the implications for particle models.
AUTHOR: Stan Gudder (John Evans Professor of Mathematical Physics, University of Denver, USA). We consider a discrete scalar quantum field theory based on a cubic 4-dimensional lattice. We mainly investigate a discrete scattering operator S(x0,r) where x0 and r are positive integers representing time and maximal total energy, respectively. The operator S(x0,r) is used to define transition amplitudes which are then employed to compute transition probabilities. These probabilities are conditioned on the time-energy (x0,r). In order to maintain total unit probability, the transition probabilities need to be reconditioned at each (x0,r). This is roughly analogous to renormalization in standard quantum field theory, except no infinities or singularities are involved. We illustrate this theory with a simple scattering experiment involving a common interaction Hamiltonian. We briefly mention how discreteness of space-time might be tested astronomically. Moreover, these tests may explain the existence of dark energy and dark matter.
A physically consistent semi-classical treatment of black holes requires universality arguments to deal with the ‘trans-Planckian’ problem, where quantum spacetime effects appear to be amplified such that they undermine the entire semi-classical modelling framework. We evaluate three families of such arguments in comparison with Wilsonian renormalization group universality arguments found in the context of condensed matter physics. Our analysis is framed by the crucial distinction between robustness and universality. Particular emphasis is placed on the extent to which the various arguments are underpinned by ‘integrated’ notions of robustness and universality. Whereas the principal strength of Wilsonian universality arguments can be understood in terms of the presence of such integration, the principal weakness of all three universality arguments for Hawking radiation is its absence.
We argue that the true nature of the renormalizability of Horava-Lifshitz gravity lies in the presence of higher order spatial derivatives and not in the anisotropic Lifshitz scaling of space and time. We discuss the possibility of constructing a model with higher order spatial derivatives that has the same renormalization properties as Horava-Lifshitz gravity but does not make use of the Lifshitz scaling. In addition, the state of the art of Lorentz symmetry restoration in Horava-Lifshitz-type theories of gravitation is reviewed.
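For orientation (a schematic sketch in commonly used notation, not the authors'): the anisotropic Lifshitz scaling and the propagator improvement usually credited to it are

$$t \to b^{z}\,t, \qquad x^i \to b\,x^i, \qquad G(\omega,\mathbf{k}) \sim \frac{1}{\omega^2 - c^2\mathbf{k}^2 - \mathbf{k}^{6}/M^{4}} \quad (z = 3),$$

where the $\mathbf{k}^6$ term descends from sixth-order spatial derivatives and tames loop integrals at large $\mathbf{k}$. The abstract's claim is that it is this higher-derivative term, not the anisotropic scaling as such, that does the renormalizability work.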
We present integral equations for the scattering amplitudes of three scalar particles, using the Faddeev channel decomposition, which can be readily extended to any finite number of particles of any helicity. The solution of these equations, which have been demonstrated to be calculable, provides a nonperturbative way of obtaining relativistic scattering amplitudes for any finite number of particles that are Lorentz invariant, unitary, cluster decomposable and reduce unambiguously in the nonrelativistic limit to the nonrelativistic Faddeev equations. The aim of this program is to develop equations which explicitly depend upon physically observable input variables, and do not require “renormalization” or “dressing” of these parameters to connect them to the boundary states. As a unitary, cluster decomposable, multichannel theory, it can readily describe physical systems whose constituents are confined.
Phase transitions are an important instance of putatively emergent behavior. Unlike many things claimed emergent by philosophers, the alleged emergence of phase transitions stems from both philosophical and scientific arguments. Here we focus on the case for emergence built from physics, in particular, arguments based upon the infinite idealization invoked in the statistical mechanical treatment of phase transitions. After teasing apart several challenges, we defend the idea that phase transitions are best thought of as conceptually novel, but not ontologically or explanatorily irreducible to finite physics; indeed, by looking at ongoing work on “smooth phase transitions” we suggest that they are not even conceptually novel. In the case of renormalization group theory, consideration of infinite systems and their singular behavior provides a central theoretical tool, but this is compatible with an explanatory reduction. Phase transitions may be “emergent” in some sense of this protean term, but not in a sense that is incompatible with the reductionist project broadly construed.
This book discusses the notion that quantum gravity may represent the "breakdown" of spacetime at extremely high energy scales. If spacetime does not exist at the fundamental level, then it has to be considered "emergent", in other words, an effective structure, valid at low energy scales. The author develops a conception of emergence appropriate to effective theories in physics, and shows how it applies (or could apply) in various approaches to quantum gravity, including condensed matter approaches, discrete approaches, and loop quantum gravity.
We classify the phase transition thresholds from provability to unprovability for certain Friedman-style miniaturizations of Kruskal's Theorem and Higman's Lemma. In addition, we prove a new and unexpected phase transition result for ε0. Motivated by renormalization and universality issues from statistical physics, we finally state a universality hypothesis.
Reply to Cartwright. Philip W. Anderson - 2001 - Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 32 (3):499-500.
I am afraid that Nancy Cartwright and I will have to agree to disagree, on the whole. If my review comes through as harsh, it is perhaps the natural response of a quantum theorist who has worked in economics to a book in which physics and economics are treated as epistemically identical.
In the asymptotic safety paradigm, a quantum field theory reaches a regime with quantum scale invariance in the ultraviolet, which is described by an interacting fixed point of the Renormalization Group. Compelling hints for the viability of asymptotic safety in quantum gravity exist, mainly obtained from applications of the functional Renormalization Group. The impact of asymptotically safe quantum fluctuations of gravity at and beyond the Planck scale could at the same time induce an ultraviolet completion for the Standard Model of particle physics with high predictive power.
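In outline (a standard schematic from the asymptotic-safety literature, added for concreteness): writing the dimensionless Newton coupling as $g(k) = G(k)\,k^2$ in four dimensions, the flow takes the form

$$\beta_g \;=\; k\,\frac{dg}{dk} \;=\; \bigl(2 + \eta_N(g)\bigr)\,g,$$

so besides the free fixed point $g = 0$ there can be an interacting fixed point $g_* > 0$ at which the anomalous dimension satisfies $\eta_N(g_*) = -2$. On a trajectory attracted to $g_*$ in the ultraviolet, $G(k) \simeq g_*/k^2$ falls off at trans-Planckian scales, which is the quantum scale invariance the abstract describes.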
Tian Yu Cao has written a serious and scholarly book covering a great deal of physics. He ranges from classical relativity theory, both special and general, to relativistic quantum field theory, including non-Abelian gauge theory, renormalization theory, and symmetry-breaking, presenting a detailed and very rich picture of the mainstream developments in quantum physics; a remarkable feat. It has, moreover, a philosophical message: according to Cao, the development of these theories is inconsistent with a Kuhnian view of theory change, and better supports a qualified realism.
In this paper I detail three major mathematical developments that led to the emergence of Yang–Mills theories as the foundation for the standard model of particle physics. In less than 10 years, work on renormalizability, the renormalization group, and lattice quantum field theory highlighted the utility of Yang–Mills type models of quantum field theory by connecting poorly understood candidate dynamical models to emerging experimental results. I use this historical case study to provide lessons for theory construction in physics, and touch on issues related to renormalization group realism from a more historical perspective. In particular, I highlight the fact that much of the hard work in theory construction comes when trying to understand the consequences and representational capacities of a theoretical framework.
The fundamental open questions of general relativity theory are the unification of the gravitational field with other fields, aiming at a unified geometrization of physics, as well as the renormalization of relativistic gravitational theory in order to obtain their self-consistent solutions. These solutions are to furnish field-theoretic particle models—a problem first discussed by Einstein. In addition, we are confronted with the issue of a coupling between gravitational and matter fields determined (not only) by Einstein's principle of equivalence, and also with the question of the geometric meaning of a gravitational quantum theory. In our view, all these problems are so closely related that they warrant a general solution. We treat mainly the concepts suggested by Einstein and Weyl.
Cao makes two claims of particular philosophical interest in his book "The Conceptual Development of 20th Century Field Theories": (i) the history of these developments refutes Kuhn's relativistic epistemology, and (tacitly) (ii) the question of realism in quantum field theory can be addressed independently of one's views on the problem of measurement. I argue that Cao is right on the first score, although for reasons different from the ones he cites, but wrong on the second. In support of the first of these claims, I review in detail the correspondence between the treatment of critical phenomena in condensed matter physics, and of scaling in the renormalization group of RQFT.
The renormalization group is used to analyze the behavior of certain gravitationally significant renormalized coupling constants under a scaling of the spacetime curvature. After discussing a simple example, the results are summarized for a class of grand unified theories.
The asymptotic safety program strives for a consistent description of gravity as a non-perturbatively renormalizable quantum field theory. In this framework the gravitational interactions are encoded in a renormalization group flow connecting the quantum gravity regime at trans-Planckian scales to observable low-energy physics. This proceedings contribution reviews the key elements underlying the predictive power of the construction and summarizes the state of the art in determining its free parameters. The explicit construction of a realistic renormalization group trajectory describing our world shows that the flow possesses two characteristic scales. The Planck scale, where Newton's coupling G becomes constant, is generated dynamically. The freeze-out of the cosmological constant Λ occurs at a terrestrial scale fixed by the observed value of the dimensionless product GΛ. We also review the prospects for determining the free parameters of the theory through cosmological observations.
The physical, mathematical, and philosophical foundations of the quantum theory of free Bose fields in fixed general relativistic spacetimes are examined. It is argued that the theory is logically and mathematically consistent, whereas semiclassical prescriptions for incorporating the back-reaction of the quantum field on the geometry lead to inconsistencies. Still, the relations and heuristic value of the semiclassical approach to canonical and covariant schemes of quantum gravity-plus-matter are assessed. Both conventional and rigorous formulations of the theory and of its principal predictions, cosmological particle creation and horizon radiation, are expounded and compared. Special attention is devoted to spacetime properties needed for the existence or uniqueness of the relevant theoretical elements (e.g., renormalization of the stress tensor). The emergence of unitarily inequivalent representations in a single dynamical context is used as motivation for the introduction of the abstract C*-algebraic axiomatic formalism. The operationalist and conventionalist claims of the original abstract algebraic program are criticized in favor of its tempered outgrowth, local quantum physics. The interpretation of the theory as a wave mechanics of classical field configurations, deriving from the Schrodinger representations of the abstract algebra, is discussed and is found superior, at least on the level of analogy, to particle or harmonic oscillator interpretations. Further, it is argued that the various detector results and the Fulling nonuniqueness problem do not undermine the particle concept in the ways commonly claimed. In particular, arguments are offered against the attribution of particle status to the Rindler quanta, against the physical realizability of the Rindler vacuum, and against the more general notion of observer-dependence as to the definition of 'particle' or 'vacuum'. However, the question of the ontological status of particles is raised in terms of the consistency of quantum field theory with non-reductive realism about particles, the latter being conceived as entities exhibiting attributes of discreteness and localizability. Two arguments against non-reductive realism about particles, one from axiomatic algebraic local quantum theory in Minkowski spacetime and one from quantum field theory in curved spacetime, are developed.
I will give a broad overview of what has become the standard paradigm in cosmology. I will describe the relational notion of time that is often used in cosmological calculations and discuss how the local nature of Einstein's equations allows us to translate this notion into statements about ‘initial’ data. Classically this relates our local definition of time to a quasi-local region of a particular spatial slice; however, incorporating quantum theory comes at the expense of losing this locality entirely. This occurs due to the presence of two, apparently distinct, issues: seemingly classical issues to do with the infinite spatial volume of the universe, and quantum field theory issues, which revolve around trying to apply renormalization in cosmology. Following the ‘cosmological principle’ - an extension of the ‘Copernicus principle’ - that physics at every point in our universe should look the same, we are led to the modern view of cosmology. This procedure is reasonably well understood for an exactly homogeneous universe; however, the inclusion of small perturbations over this homogeneity leads to many interpretational/conceptual difficulties. For example, in an infinite universe perturbations can be arbitrarily close to homogeneous. To any observer, such a perturbation would appear to be a simple rescaling of the homogeneous background and hence, physically, would not be considered an inhomogeneous perturbation at all. However, any attempt to choose the physically relevant scale at which perturbations should be considered homogeneous will break the cosmological principle, i.e. it will make the resulting physics observer dependent. It amounts to ‘putting the perturbations in a box’, and a delicate practical issue is that the universe is not static, hence the scale of the box will be time dependent. Thus what appears ‘physically homogeneous’ to an observer at one time will not appear so at another. This issue is brought to the forefront by considering the canonical version of the theory. The phase space formulation of General Relativity, just as for any other theory, contains the shadow of the underlying quantum theory. This means that, although the formulation is still classical, many of the subtleties that are present in the quantum theory are already apparent. In the cosmological context the infinite spatial volume renders almost all expressions formal or ill-defined. In order to proceed, we must restrict our attention to a cosmology that has some finite spatial extent, on which our relational notion of time is everywhere definable. In particular, this would constrain the permissible data outside our ‘observable universe’. This difficulty is an IR, or large-scale, issue in cosmology; in addition, however, there are UV, or short-scale, problems that need to be tackled. There are the usual problems of renormalization, which are further complicated by the fact that the universe is not static. In the cosmological setting this leads to new IR problems which again prevent one from taking the spatial extent of the universe to infinity. The physical relevance of this problem, the consequence for defining a time variable, and the distinction of homogeneous and inhomogeneous IR issues will be discussed.
The nonlinear integro-differential equation, obtained from the coupled Maxwell-Dirac equations by eliminating the potential Aμ, is solved by iteration rather than perturbation. The energy shift is complex, the imaginary part giving the spontaneous emission. Both self-energy and vacuum polarization terms are obtained. All results, including renormalization terms, are finite.