On one popular view, the general covariance of gravity implies that change is relational in a strong sense, such that all it is for a physical degree of freedom to change is for it to vary with regard to a second physical degree of freedom. At the quantum level, this view of change as relative variation leads to a fundamentally timeless formalism for quantum gravity. Here, we will show how one may avoid this acute ‘problem of time’. Under our view, duration is still regarded as relative, but temporal succession is taken to be absolute. Following our approach, which is presented in more formal terms in, it is possible to conceive of a genuinely dynamical theory of quantum gravity within which time, in a substantive sense, remains.
1 Introduction
1.1 The problem of time
1.2 Our solution
2 Understanding Symmetry
2.1 Mechanics and representation
2.2 Freedom by degrees
2.3 Voluntary redundancy
3 Understanding Time
3.1 Change and order
3.2 Quantization and succession
4 Time and Gravitation
4.1 The two faces of classical gravity
4.2 Retaining succession in quantum gravity
5 Discussion
5.1 Related arguments
5.2 Concluding remarks
We present a Bayesian analysis of the epistemology of analogue experiments with particular reference to Hawking radiation. Provided such experiments can be externally validated via universality arguments, we prove that they are confirmatory in Bayesian terms. We then provide a formal model for the scaling behaviour of the confirmation measure for multiple distinct realisations of the analogue system and isolate a generic saturation feature. Finally, we demonstrate that different potential analogue realisations could provide different levels of confirmation. Our results thus provide a basis both to formalise the epistemic value of analogue experiments that have been conducted and to advise scientists as to the respective epistemic value of future analogue experiments.
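The saturation feature described above can be sketched in a toy hierarchical Bayesian model. All numbers (priors, likelihoods) and the network structure below are illustrative assumptions, not values from the paper: evidence from each analogue realisation bears on the target hypothesis H (e.g. gravitational Hawking radiation) only via a shared universality hypothesis X, so the posterior on H saturates at P(H | X) rather than approaching 1, however many analogue results accumulate.

```python
# Toy model: each positive analogue result E_i depends only on the
# shared universality node X, which screens off H from the evidence.
# Illustrative parameters (assumptions, not the paper's values):
#   x0 = prior P(X);  e1, e0 = P(E_i | X), P(E_i | not-X);
#   h1, h0 = P(H | X), P(H | not-X).

def posterior_H(n, x0=0.5, e1=0.8, e0=0.5, h1=0.9, h0=0.3):
    """P(H | n positive analogue results) under the toy model."""
    # Bayesian update of the universality node X on n positive results.
    num = x0 * e1 ** n
    p_X = num / (num + (1 - x0) * e0 ** n)
    # H is screened off from the evidence by X, so its posterior is a
    # mixture weighted by the posterior on X.
    return h1 * p_X + h0 * (1 - p_X)

posteriors = [posterior_H(n) for n in range(51)]
```

With these parameters the posterior on H rises monotonically from 0.6 at n = 0 but saturates just below h1 = 0.9: repeated analogue realisations eventually confirm the universality claim fully, and no further confirmation of H is available through that channel.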
Econophysics is a new and exciting cross-disciplinary research field that applies models and modelling techniques from statistical physics to economic systems. It is not, however, without its critics: prominent figures in more mainstream economic theory have criticized some elements of the methodology of econophysics. One of the main lines of criticism concerns the nature of the modelling assumptions and idealizations involved, and a particular target is the family of ‘kinetic exchange’ approaches used to model the emergence of inequality within the distribution of individual monetary income. This article will consider such models in detail, and assess the warrant of the criticisms drawing upon the philosophical literature on modelling and idealization. Our aim is to provide the first steps towards informed mediation of this important and interesting interdisciplinary debate, and our hope is to offer guidance with regard to both the practice of modelling inequality, and the inequality of modelling practice.
1 Introduction
1.1 Econophysics and its discontents
1.2 Against burglar economics
2 Modelling Inequality
2.1 Mainstream economic models for income distribution
2.2 Econophysics models for income distribution
3 Idealizations in Kinetic Exchange Models
3.1 Binary interactions
3.2 Conservation principles
3.3 Exchange dynamics
4 Fat Tails and Savings
5 Evaluation
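A minimal kinetic exchange simulation illustrates the kind of model at issue. The details below (agent count, pure random splitting with no savings propensity) are an illustrative sketch, not the specific models assessed in the article: money is conserved in each binary interaction, and starting from perfect equality the stationary distribution is approximately exponential (Boltzmann–Gibbs), so inequality emerges spontaneously from the exchange dynamics alone.

```python
import random

def simulate(n_agents=1000, n_exchanges=200_000, seed=0):
    """Random binary money exchanges with pairwise conservation."""
    rng = random.Random(seed)
    money = [1.0] * n_agents          # perfectly equal initial condition
    for _ in range(n_exchanges):
        i, j = rng.randrange(n_agents), rng.randrange(n_agents)
        if i == j:
            continue
        pool = money[i] + money[j]    # total money conserved in the interaction
        eps = rng.random()            # uniformly random split of the pool
        money[i], money[j] = eps * pool, (1 - eps) * pool
    return money

def gini(xs):
    """Gini coefficient via the sorted-values formula."""
    xs = sorted(xs)
    n, total = len(xs), sum(xs)
    cum = sum((k + 1) * x for k, x in enumerate(xs))
    return 2 * cum / (n * total) - (n + 1) / n

money = simulate()
```

After equilibration the Gini coefficient sits near 0.5, the value for an exponential distribution, despite every agent starting with identical wealth; adding a savings parameter (Section 4's topic) would shift the stationary distribution towards a gamma form.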
To demarcate the limits of experimental knowledge, we probe the limits of what might be called an experiment. By appeal to examples of scientific practice from astrophysics and analogue gravity, we demonstrate that the reliability of knowledge regarding certain phenomena gained from an experiment is not circumscribed by the manipulability or accessibility of the target phenomena. Rather, the limits of experimental knowledge are set by the extent to which strategies for what we call ‘inductive triangulation’ are available: that is, the validation of the mode of inductive reasoning involved in the source-target inference via appeal to one or more distinct and independent modes of inductive reasoning. When such strategies are able to partially mitigate reasonable doubt, we can take a theory regarding the phenomena to be well supported by experiment. When such strategies are able to fully mitigate reasonable doubt, we can take a theory regarding the phenomena to be established by experiment. There are good reasons to expect the next generation of analogue experiments to provide genuine knowledge of unmanipulable and inaccessible phenomena such that the relevant theories can be understood as well supported. This article is part of a discussion meeting issue ‘The next generation of analogue gravity experiments’.
The idea that democracy is under threat, after being largely dormant for at least 40 years, is looming increasingly large in public discourse. Complex systems theory offers a range of powerful new tools to analyse the stability of social institutions in general, and democracy in particular. What makes a democracy stable? And which processes potentially lead to instability of a democratic system? This paper offers a complex systems perspective on these questions, informed by areas of the mathematical, natural, and social sciences. We explain the meaning of the term 'stability' in different disciplines and discuss how laws, rules, and regulations, but also norms, conventions, and expectations are decisive for the stability of a social institution such as democracy.
We claim that, as it stands, the Deutsch–Wallace–Everett approach to quantum theory is conceptually incoherent. This charge is based upon the approach’s reliance upon decoherence arguments that conflict with its own fundamental precepts regarding probabilistic reasoning in two respects. This conceptual conflict obtains even if the decoherence arguments deployed are aimed merely towards the establishment of certain ‘emergent’ or ‘robust’ structures within the wave function: to be relevant to physical science, notions such as robustness must be empirically grounded, and, on our analysis, this grounding can only plausibly be done in precisely the probabilistic terms that lead to conceptual conflict. Thus, the incoherence problems presented necessitate either the provision of a new, non-probabilistic empirical grounding for the notions of robustness and emergence in the context of decoherence, or the abandonment of the Deutsch–Wallace–Everett programme for quantum theory.
Symplectic reduction is a formal process through which degeneracy within the mathematical representations of physical systems displaying gauge symmetry can be controlled via the construction of a reduced phase space. Typically such reduced spaces provide us with a formalism for representing both instantaneous states and evolution uniquely and for this reason can be justifiably afforded the status of fundamental dynamical arena - the otiose structure having been eliminated from the original phase space. Essential to the application of symplectic reduction is the precept that the first class constraints are the relevant gauge generators. This prescription becomes highly problematic for reparameterization invariant theories within which the Hamiltonian itself is a constraint; not least because it would seem to render prima facie distinct stages of a history physically identical and observable functions changeless. Here we will consider this problem of time within non-relativistic mechanical theory with a view to both more fully understanding the temporal structure of these timeless theories and better appreciating the corresponding issues in relativistic mechanics. For the case of non-relativistic reparameterization invariant theory, application of symplectic reduction will be demonstrated to be both unnecessary, since the degeneracy involved is benign, and inappropriate, since it leads to a trivial theory. With this anti-reductive position established we will then examine two rival methodologies for consistently representing change and observable functions within the original phase space before evaluating the relevant philosophical implications. We will conclude with a preview of the case against symplectic reduction being applied to canonical general relativity.
We propose a solution to the problem of time for systems with a single global Hamiltonian constraint. Our solution stems from the observation that, for these theories, conventional gauge theory methods fail to capture the full classical dynamics of the system and must therefore be deemed inappropriate. We propose a new strategy for consistently quantizing systems with a relational notion of time that does capture the full classical dynamics of the system and allows for evolution parametrized by an equitable internal clock. This proposal contains the minimal temporal structure necessary to retain the ordering of events required to describe classical evolution. In the context of shape dynamics (an equivalent formulation of general relativity that is locally scale invariant and free of the local problem of time) our proposal can be shown to constitute a natural methodology for describing dynamical evolution in quantum gravity and to lead to a quantum theory analogous to the Dirac quantization of unimodular gravity.
Some time ago, Joel Katzav and Brian Ellis debated the compatibility of dispositional essentialism with the principle of least action. Surprisingly, very little has been said on the matter since, even by the most naturalistically inclined metaphysicians. Here, we revisit the Katzav–Ellis arguments of 2004–05. We outline the two problems for the dispositionalist identified by Katzav in his 2004 paper, and claim they are not as problematic for the dispositional essentialist as it first seems – but not for the reasons espoused by Ellis.
Starting from a generalized Hamilton-Jacobi formalism, we develop a new framework for constructing observables and their evolution in theories invariant under global time reparametrizations. Our proposal relaxes the usual Dirac prescription for the observables of a totally constrained system and allows one to recover the influential partial and complete observables approach in a particular limit. Difficulties such as the non-unitary evolution of the complete observables in terms of certain partial observables are explained as a breakdown of this limit. Identification of our observables relies upon a physical distinction between gauge symmetries that exist at the level of histories and states, and those that exist at the level of histories and not states. This distinction resolves a tension in the literature concerning the physical interpretation of the partial observables and allows for a richer class of observables in the quantum theory. There is the potential for the application of our proposal to the quantization of gravity when understood in terms of the Shape Dynamics formalism.
A physically consistent semi-classical treatment of black holes requires universality arguments to deal with the `trans-Planckian' problem where quantum spacetime effects appear to be amplified such that they undermine the entire semi-classical modelling framework. We evaluate three families of such arguments in comparison with Wilsonian renormalization group universality arguments found in the context of condensed matter physics. Our analysis is framed by the crucial distinction between robustness and universality. Particular emphasis is placed on the extent to which the various arguments are underpinned by `integrated' notions of robustness and universality. Whereas the principal strength of Wilsonian universality arguments can be understood in terms of the presence of such integration, the principal weakness of all three universality arguments for Hawking radiation is its absence.
We introduce ‘model migration’ as a species of cross-disciplinary knowledge transfer whereby the representational function of a model is radically changed to allow application to a new disciplinary context. Controversies and confusions that often derive from this phenomenon will be illustrated in the context of econophysics and phylogeographic linguistics. Migration can be usefully contrasted with the concept of ‘imperialism’, which has been influentially discussed in the context of geographical economics. In particular, imperialism, unlike migration, relies upon extension of the original model via an expansion of the domain of phenomena it is taken to adequately describe. The success of imperialism thus requires expansion of the justificatory sanctioning of the original idealising assumptions to a new disciplinary context. Contrastingly, successful migration involves the radical representational re-interpretation of the original model, rather than its extension. Migration thus requires ‘re-sanctioning’ of new ‘counterpart idealisations’ to allow application to an entirely different class of phenomena. Whereas legitimate scientific imperialism should be based on the pursuit of some form of ontological unification, no such requirement is needed to legitimate the practice of model migration. The distinction between migration and imperialism will thus be shown to have significant normative as well as descriptive value.
In 1981 Unruh proposed that fluid mechanical experiments could be used to probe key aspects of the quantum phenomenology of black holes. In particular, he claimed that an analogue to Hawking radiation could be created within a fluid mechanical `dumb hole', with the event horizon replaced by a sonic horizon. Since then an entire sub-field of `analogue gravity' has been created. In 2016 Steinhauer reported the experimental observation of quantum Hawking radiation and its entanglement in a Bose-Einstein condensate analogue black hole. What can we learn from such analogue experiments? In particular, in what sense can they provide evidence of novel phenomena such as black hole Hawking radiation?
The analysis of the temporal structure of canonical general relativity and the connected interpretational questions with regard to the role of time within the theory both rest upon the need to respect the fundamentally dual role of the Hamiltonian constraints found within the formalism. Any consistent philosophical approach towards the theory must pay dues to the role of these constraints in both generating dynamics, in the context of phase space, and generating unphysical symmetry transformations, in the context of a hypersurface embedded within a solution. A first denial of time in the terms of a position of reductive temporal relationalism can be shown to be troubled by failure on the first count, and a second denial in the terms of Machian temporal relationalism can be found to be hampered by failure on the second. A third denial of time, consistent with both of the Hamiltonian constraints' roles, is constituted by the implementation of a scheme for constructing observables in terms of correlations and leads to a radical Parmenidean timelessness. The motivation for and implications of each of these three denials are investigated.
The subjective Everettian approach to quantum mechanics presented by Deutsch and Wallace fails to constitute an empirically viable theory of quantum phenomena. The decision theoretic implementation of the Born rule realized in this approach provides no basis for rejecting Everettian quantum mechanics in the face of empirical data that contradicts the Born rule. The approach of Greaves and Myrvold, which provides a subjective implementation of the Born rule as well but derives it from empirical data rather than decision theoretic arguments, avoids the problem faced by Deutsch and Wallace and is empirically viable. However, there is good reason to cast doubt on its scientific value.
We offer a new proposal for cosmic singularity resolution based upon a quantum cosmology with a unitary bounce. This proposal is illustrated via a novel quantization of a mini-superspace model in which there can be superpositions of the cosmological constant. This possibility leads to a finite, bouncing unitary cosmology. Whereas the usual Wheeler–DeWitt cosmology generically displays pathological behaviour in terms of non-finite expectation values and non-unitary dynamics, the finiteness and unitarity of our model are formally guaranteed. For classically singular models with a massless scalar field and cosmological constant, we show that well-behaved quantum observables can be constructed and generic solutions to the universal Schrödinger equation are singularity-free. Key features of the solutions of our model include a cosmic bounce due to quantum gravitational effects, a well-defined FLRW limit far from the bounce, and a super-inflationary epoch in the intermediate region. Furthermore, our model displays novel features including: i) superpositions of values of the cosmological constant; ii) a non-zero scattering length around the big bounce; and iii) bound ‘Efimov universe’ states for negative cosmological constant. The last feature provides a new platform for quantum simulation of the early universe. A companion paper provides detailed interpretation and analysis of particular cosmological solutions.
Hamiltonian constraints feature in the canonical formulation of general relativity. Unlike typical constraints, they cannot be associated with a reduction procedure leading to a non-trivial reduced phase space, and this means the physical interpretation of their quantum analogues is ambiguous. In particular, can we assume that “quantisation commutes with reduction” and treat the promotion of these constraints to operators annihilating the wave function, according to a Dirac type procedure, as leading to a Hilbert space equivalent to that reached by quantisation of the problematic reduced space? If not, how should we interpret Hamiltonian constraints quantum mechanically? And on what basis do we assert that quantisation and reduction commute anyway? These questions will be refined and explored in the context of modern approaches to the quantisation of canonical general relativity.
In an illuminating article, Claus Beisbart argues that the recently-popular thesis that the probabilities of statistical mechanics (SM) are Best System chances runs into a serious obstacle: there is no one axiomatization of SM that is robustly best, as judged by the theoretical virtues of simplicity, strength, and fit. Beisbart takes this 'no clear winner' result to imply that the probabilities yielded by the competing axiomatizations simply fail to count as Best System chances. In this reply, we express sympathy for the 'no clear winner' thesis. However, we argue that an importantly different moral should be drawn from this. We contend that the implication for Humean chances is not that there are no SM chances, but rather that SM chances fail to be sharp.
A novel approach to quantization is shown to allow for superpositions of the cosmological constant in isotropic and homogeneous mini-superspace models. Generic solutions featuring such superpositions display: i) a unitary evolution equation; ii) singularity resolution; iii) a cosmic bounce. Explicit cosmological solutions are constructed. These exhibit characteristic bounce features including a ‘super-inflation’ regime with universal phenomenology that can naturally be made to be insensitive to Planck-scale physics.
This paper provides the first systematic philosophical analysis of an increasingly important part of modern scientific practice: analogue quantum simulation. We introduce the distinction between `simulation' and `emulation' as applied in the context of two case studies. Based upon this distinction, and building upon ideas from the recent philosophical literature on scientific understanding, we provide a normative framework to isolate and support the goals of scientists undertaking analogue quantum simulation and emulation. We expect our framework to be useful to both working scientists and philosophers of science interested in cutting-edge scientific practice.
A companion paper provides a proposal for cosmic singularity resolution based upon general features of a bouncing unitary cosmological model in the mini-superspace approximation. This paper analyses novel phenomenology that can be identified within particular solutions of that model. First, we justify our choice of particular solutions based upon a clearly articulated and observationally motivated principle. Second, we demonstrate that the chosen solutions follow a classical mini-superspace cosmology before smoothly bouncing off the classically singular region. Third, and most significantly, we identify a ‘Rayleigh-scattering’ limit for physically reasonable choices of parameters within which the solutions display effective behaviour that is insensitive to the details of rapidly oscillating Planck-scale physics. This effective physics is found to be compatible with an effective period of cosmic inflation well below the Planck scale. The detailed effective physics of this Rayleigh-scattering limit is provided via: i) an exact analytical treatment of the model in the de Sitter limit; and ii) numerical solutions of the full model.
We provide an analysis of the empirical consequences of the AdS/CFT duality with reference to the application of the duality in a fundamental theory, effective theory and instrumental context. Analysis of the first two contexts is intended to serve as a guide to the potential empirical and ontological status of gauge/gravity dualities as descriptions of actual physics at the Planck scale. The third context is directly connected to the use of AdS/CFT to describe real quark-gluon plasmas. In the latter context, we find that neither of the two duals is confirmed by the empirical data.
Political scientists have conventionally assumed that achieving democracy is a one-way ratchet. Only very recently has the question of “democratic backsliding” attracted any research attention. We argue that democratic instability is best understood with tools from complexity science. The explanatory power of complexity science arises from several features of complex systems. Their relevance in the context of democracy is discussed. Several policy recommendations are offered to help stabilize current systems of representative democracy.
This paper provides a prospectus for a new way of thinking about the wavefunction of the universe: a Ψ-epistemic quantum cosmology. We present a proposal that, if successfully implemented, would resolve the cosmological measurement problem and simultaneously allow us to think sensibly about probability and evolution in quantum cosmology. Our analysis draws upon recent work on the problem of time in quantum gravity and causally symmetric local hidden variable theories. Our conclusion weighs the strengths and weaknesses of the approach and points towards paths for future development.
Many philosophers have claimed that Bayesianism can provide a simple justification for hypothetico-deductive inference, long regarded as a cornerstone of the scientific method. Following up a remark of van Fraassen, we analyze a problem for the putative Bayesian justification of H-D inference in the case where what we learn from observation is logically stronger than what our theory implies. Firstly, we demonstrate that in such cases the simple Bayesian justification does not necessarily apply. Secondly, we identify a set of sufficient conditions for the mismatch in logical strength to be justifiably ignored as a "harmless idealization". Thirdly, we argue, based upon scientific examples, that the pattern of H-D inference of which there is a ready Bayesian justification is only rarely the pattern that one actually finds at work in science. Whatever the other virtues of Bayesianism, the idea that it yields a simple justification of a pervasive pattern of scientific inference appears to have been oversold.
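The gap between confirming a prediction and updating on logically stronger evidence can be exhibited in a small toy probability model. All numbers below are illustrative assumptions: theory T entails prediction P, and the observation E is logically stronger than P (E entails P, so P(E | ·) ≤ P(P | ·)). Conditioning on P confirms T, as the simple Bayesian justification of H-D inference promises, yet conditioning on the stronger evidence E lowers the probability of T.

```python
# Illustrative priors and likelihoods (assumptions, not from the paper).
p_T = 0.3
p_P_given_T, p_P_given_notT = 1.0, 0.5   # T entails its prediction P
p_E_given_T, p_E_given_notT = 0.1, 0.4   # E entails P: P(E|.) <= P(P|.)

def posterior(prior, like_h, like_not_h):
    """Bayes' theorem for a binary hypothesis."""
    return prior * like_h / (prior * like_h + (1 - prior) * like_not_h)

p_T_given_P = posterior(p_T, p_P_given_T, p_P_given_notT)  # ~0.462 > 0.3
p_T_given_E = posterior(p_T, p_E_given_T, p_E_given_notT)  # ~0.097 < 0.3
```

The point of the construction is that although T guarantees its prediction P, the specific way P is realised (E) may be far more probable if T is false, so the H-D confirmation delivered by P does not automatically transfer to what was actually observed.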
The canonical formalism of general relativity affords a particularly interesting characterisation of the infamous hole argument. It also provides a natural formalism in which to relate the hole argument to the problem of time in classical and quantum gravity. In this paper we examine the connection between these two much discussed problems in the foundations of spacetime theory along two interrelated lines. First, from a formal perspective, we consider the extent to which the two problems can and cannot be precisely and distinctly characterised. Second, from a philosophical perspective, we consider the implications of various responses to the problems, with a particular focus upon the viability of a `deflationary' attitude to the relationalist/substantivalist debate regarding the ontology of spacetime. Conceptual and formal inadequacies within the representative language of canonical gravity will be shown to be at the heart of both the canonical hole argument and the problem of time. Interesting and fruitful work at the interface of physics and philosophy relates to the challenge of resolving such inadequacies.
The `problem of time' is a cluster of interpretational and formal issues in the foundations of general relativity relating to both the representation of time in the classical canonical formalism, and to the quantization of the theory. The purpose of this short chapter is to provide an accessible introduction to the problem.