Philosophical thinking about the scientific method and the nature of good scientific reasoning has, over the last few centuries, been consistently and heavily influenced by the example of physics. The astounding achievements of 19th- and 20th-century physics demonstrated that physicists had successfully identified methodologies and reasoning patterns uniquely well suited to discovering fundamental truths about the natural world. Inspired by this success, generations of philosophers set themselves the goal of taxonomising, codifying, formalising and evaluating these reasoning patterns. Many would agree that this has been a tremendously fruitful exercise, serving both to illuminate characteristic methodological and epistemological features of the physical sciences and to inform the way philosophers think about the epistemic ideals served by science more generally.

However, as has been widely noted, the great challenges confronting contemporary physics have led to a number of fundamental shifts in the way that physicists formulate, assess and apply their theories of the physical world. The most prominent examples of this trend occur in theoretical high energy physics, where many of the most influential theories advocated by physicists lie beyond the reach of extant experimental methods and are therefore extremely difficult to test empirically. The fact that whole communities of physicists have devoted so much time and effort to evaluating theories that are largely disconnected from experiment and empirical testing suggests that existing philosophical accounts of the epistemology of physics, based as they are on a broadly empiricist conception of the discipline, are no longer completely apt, or are at least somewhat out of date.

This in turn suggests that it is time for philosophers to redirect their attention towards the characteristic reasoning strategies at play in contemporary physics. As well as clarifying the epistemic structure of 21st-century physics and providing new stimulus for general debates in the epistemology of science, this will allow more philosophers to confront and engage with the existential methodological debates currently raging within physics: there is intense and widespread disagreement about which kinds of reasoning strategies can legitimately be employed in the assessment of competing physical theories and research programs. This special issue aims to facilitate the further engagement of philosophers in this crucial debate by collecting eight contributions that both highlight the deep philosophical issues at stake in debates over the proper methodology for theory assessment in physics and present novel philosophical perspectives on specific reasoning strategies in the physical sciences.

Of the eight contributions included in the issue, two engage directly with current debates over how theories of quantum gravity should be assessed in the absence of relevant experimental data. Firstly, Cabrera’s ‘String Theory, Non-Empirical Theory Assessment, and the Context of Pursuit’ addresses the currently influential view, articulated by e.g. Dawid (2013), that string theorists (and theoretical quantum gravity researchers in general) have largely abandoned the empirical methodology previously characteristic of physics, in which a theory is evaluated primarily on the basis of how well it predicts or accommodates the results of relevant experiments. On Dawid’s view, current high energy physics has moved away from this methodological template and towards a new kind of ‘non-empirical’ methodology, in which theories are assessed primarily on the basis of argumentation that does not concern their ability to make accurate empirical predictions. Dawid (2013) argues that such a methodological shift may be legitimate and justified in circumstances where direct experimental testing is not possible. In contrast, Smolin (2006) argues that string theorists’ willingness to believe their theory in the absence of experimental success is a symptom of a problematic sociological ‘groupthink’ phenomenon in current high energy physics. Cabrera argues against both Smolin and Dawid, contending that string theory neither betrays the scientific method (as Smolin argues) nor requires a radical shift in how we think about scientific reasoning and theory assessment (as Dawid argues). In order to make this point, Cabrera utilises the distinction between the ‘context of pursuit’ and the ‘context of justification’, a descendant of Reichenbach’s (1938) influential distinction between the contexts of discovery and justification.
Very roughly, the context of pursuit is the stage of science at which scientists are concerned with identifying promising and fruitful hypotheses and research programs, while the context of justification is the stage at which scientists are actually concerned with assessing the truth of those theories that have been deemed worthy of serious investigation. Cabrera argues that, when conceived of as occurring in the context of pursuit (rather than the context of justification), the reasoning strategies employed by string theorists become significantly less puzzling.

Crowther, Linnemann and Wüthrich’s (CLW) contribution, ‘What we cannot learn from analogue experiments’, addresses a specific reasoning strategy that physicists have employed to overcome the impossibility of testing some vitally important theoretical predictions in quantum gravity research. The relevant strategy is ‘analogue simulation’, which is best illustrated by the example of Hawking radiation and dumb holes. Hawking (1975) famously derived the existence of a form of thermal black hole radiation (Hawking radiation) from a mixture of quantum and relativistic considerations. If empirically confirmed, the existence of Hawking radiation would have fundamental and wide-ranging implications for the search for a theory of quantum gravity. However, it is currently impossible to directly test whether real black holes actually emit Hawking radiation. In order to overcome this obstacle, Unruh (1981) proposed the investigation of ‘analogue systems’, i.e. systems that share important properties with black holes but which, unlike black holes, are readily susceptible to current experimental techniques. The most salient examples of such analogue systems are ‘dumb holes’, i.e. systems in which sound waves encounter a horizon in a fluid. Unruh’s idea is that, since dumb holes are similar to black holes in various important respects, it is reasonable to expect them to be similar in further relevant respects: in particular, either both kinds of system should emit Hawking-type radiation, or neither should. Observing Hawking radiation–like phenomena in analogue systems like dumb holes should therefore significantly confirm the existence of Hawking radiation in black holes. Dardashti et al. (2017) have recently argued that, under certain conditions, this kind of analogue simulation argument can in fact serve its purpose successfully, thereby allowing us to confirm (or disconfirm) theoretical predictions that are currently not empirically testable.
CLW provide an extended critique of this claim, and eventually conclude that analogue simulation arguments are not able to provide significant confirmation of the existence of Hawking radiation in black holes.
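The empirical obstacle that motivates analogue simulation can be made concrete with a standard back-of-the-envelope figure (a textbook estimate, not drawn from CLW’s paper). The predicted temperature of the Hawking radiation emitted by a black hole of mass $M$ is

```latex
% Hawking temperature for a black hole of mass M;
% the numerical value is for a solar-mass black hole.
T_H = \frac{\hbar c^{3}}{8 \pi G k_B M}
    \approx 6 \times 10^{-8}\,\mathrm{K} \times \frac{M_{\odot}}{M}
```

so even a solar-mass black hole would radiate at tens of nanokelvin, many orders of magnitude colder than the 2.7 K cosmic microwave background against which any astrophysical signal would have to be detected. Hence the appeal of laboratory analogues.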

Sabine Hossenfelder’s contribution also focuses on the context of pursuit, i.e. on how promising research programs are initially identified in the foundations of physics. Specifically, Hossenfelder addresses so-called ‘naturalness arguments’. These arguments assess how promising a given hypothesis is in terms of whether the hypothesis entails very small or very large values for relevant physical parameters. Generally, hypotheses that entail such extreme values are considered less promising than hypotheses that do not, and the rationale given for this inference is that very small or very large values are ‘unlikely’, or at least ‘in need of explanation’. Hossenfelder contends that, while there are some restricted conditions under which this kind of inference is legitimate (namely, circumstances where we actually have good reason for regarding small/large values as being quantifiably unlikely), these conditions are rarely satisfied in actual applications of naturalness arguments in physics.
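A standard illustration of the reasoning pattern at issue (a textbook example, not taken from Hossenfelder’s paper) is the ‘hierarchy problem’: the squared ratio of the observed Higgs mass to the Planck scale is

```latex
% Dimensionless ratio of Higgs mass to Planck scale
\frac{m_H^{2}}{M_{\mathrm{Pl}}^{2}}
  \approx \frac{(125\,\mathrm{GeV})^{2}}{(1.2 \times 10^{19}\,\mathrm{GeV})^{2}}
  \approx 10^{-34}
```

Naturalness arguments treat this tiny dimensionless number as ‘in need of explanation’, a judgement that has motivated research programs such as supersymmetry. On Hossenfelder’s analysis, calling such a value ‘unlikely’ presupposes a probability distribution over possible parameter values that we are rarely in a position to justify.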

Continuing on the theme of identifying and formulating promising theories in fundamental physics, Adam Koberinski’s contribution provides a historically informed perspective on theory formation in particle physics. Specifically, by looking closely at the crucial mathematical developments that led to the emergence of Yang-Mills theory as the foundation of the standard model, Koberinski draws some general morals for the methodology of theory construction in physics.

Moving away from the context of pursuit and towards the context of justification, Chall et al.’s contribution provides an in-depth look at how the discovery of the Higgs boson at the Large Hadron Collider was taken to confirm the standard model of particle physics. Chall et al.’s approach is partially empirical, insofar as the analysis is based on concrete sociological evidence regarding the physics community’s confidence in the standard model and how it was affected by the discovery of the Higgs boson. The authors then go on to provide a philosophical interpretation of physicists’ faith in the standard model in terms of the concept of a Lakatosian ‘scientific research programme’.

Erik Curiel’s contribution also focuses on the phenomenon of scientific confirmation, but diverges from extant discussions on the topic by focusing not on how individual physical theories are confirmed, but rather on the question of how entire physical frameworks are confirmed. The consensus, dating back to at least Carnap (1956), was that, since frameworks provide the settings in which concrete scientific theories can be rigorously articulated, they do not themselves have determinate semantic content and are therefore not the kinds of things that can be confirmed or falsified. This led to a broadly conventionalist view of frameworks that was largely disconnected from the influential debates surrounding scientific confirmation. Curiel’s contribution aims to bridge this gap by identifying a previously unrecognised mode of scientific reasoning, ‘Newtonian abduction’, whose goal is to confer confirmation upon entire frameworks.

John Norton’s ‘Eternal Inflation: When Probabilities Fail’ also attempts to undermine a current orthodoxy from the literature on scientific confirmation. Specifically, Norton uses the case study of eternally inflating cosmology to support the view that legitimate inductive inference need not be based on anything like a fully probabilistic inductive logic. Norton constructs an alternative, non-probabilistic inductive logic that allows for meaningful predictions to be made in the absence of definite probability distributions. This logic is justified by the background conditions of an eternally inflating universe, and is therefore an example of Norton’s ‘material theory of induction’—a view of inductive inference which contends that there is no universally correct inductive logic, but rather that the details of one’s inductive logic should always be sensitive to the details of the scientific enquiries in which one is engaged.

Finally, Samuel Fletcher’s contribution focuses on the role of counterfactual reasoning in physics. Specifically, Fletcher develops a novel semantics for evaluating the truth of counterfactual claims in the context of specific physical theories. To construct this semantics, Fletcher provides a new way of determining which models of a theory count as more or less similar to one another for the purposes of a given counterfactual query. This in turn leads to a new perspective on the logical properties of counterfactual reasoning in physics.

Overall, the eight contributions offer a wealth of new perspectives on central issues in the epistemology of physics, and promise to stimulate productive debate regarding the fundamental and characteristic reasoning norms of the physical sciences.