Before the opening of the LHC more than a decade ago, it was widely anticipated on the basis of the so-called naturalness principle that the LHC would reveal new, as-yet unknown particles lying outside the framework of the Standard Model (SM). Today, the continued absence of any signatures of physics beyond the SM at the LHC raises important questions about the value and status of the naturalness principle moving forward. Is the naturalness principle as it has been applied in the context of the SM fundamentally flawed? Or does the violation of naturalness provide a vital clue about where physicists should be focusing their efforts in the search for more encompassing theories? Or is the principle too vaguely defined to draw a firm conclusion either way?

The absence of new physics at the LHC leaves the physics of fundamental particles and interactions in an ambiguous state. On the one side, there is general relativity (GR), which describes gravitational interactions through symmetries of space-time; on the other is the SM, which not only incorporates the remaining three fundamental interactions (electromagnetic, weak, and strong), but also accommodates all known elementary particles (and no others) in the form of multiplets of internal symmetries. Although both these theories arise from symmetries, their differences have been too severe to put them on common ground. Most notably, GR in four space-time dimensions has so far defied a consistent approach to quantization. Although both theories are bound to fail under certain conditions, it is unclear whether it will ever be possible to establish these conditions experimentally, and thus to obtain empirical input for a unified description of both theories.

One crucial difference between GR and the SM, which lies at the heart of the naturalness principle, is that their “typical mass scales” differ by about 17 orders of magnitude. The scale of gravity is given by the strength of the gravitational interaction, governed by the Planck mass \(M_\text {Planck}\sim 10^{19}\) GeV. The scale of the SM is given by the vacuum expectation value of the Higgs field, \(v\approx 246\) GeV, which is closely related to the masses of the elementary particles. From the perspective of classical physics, this large separation of scales may be regarded as a mere curiosity. In quantum field theory, however, descriptions of physics associated with different energy scales are related by the renormalization group; many have thought that these special features of quantum field theory, combined with particular characteristics of the quantum field associated with the Higgs boson, serve to generate a naturalness problem. For this reason, questions of naturalness are also often tied to questions about the mathematical definition and physical interpretation of quantum field theory.
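For orientation, the hierarchy quoted here is simply the ratio of the two scales:
\[
\frac{M_\text{Planck}}{v} \;\sim\; \frac{10^{19}\ \text{GeV}}{246\ \text{GeV}} \;\approx\; 4\times 10^{16},
\]
i.e. about 17 orders of magnitude; the precise number depends on conventions, for instance on whether one uses the Planck mass or the reduced Planck mass.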

There are several ways of formulating the principle, the most important of which are discussed in the contributions to this issue. At this point, let us just stress that naturalness is not a criterion of consistency, unlike unitarity, vacuum stability, or absence of anomalies. Nor is it a requirement of agreement with experimental data. Rather, it is often understood as the requirement that the numerical values of a theory’s parameters not rely on fine-tuned cancellations. Thus, a theory can be internally mathematically consistent and agree with all experimental observations within its intended domain of applicability and still fail to be natural; in this sense, naturalness is a purely extra-empirical constraint on theory choice. Nevertheless, particle physicists have found the naturalness principle sufficiently compelling to advance the principle as one of the primary motivations for research into theories beyond the SM.
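To make the notion of a fine-tuned cancellation slightly more concrete, consider the schematic textbook formulation of the Higgs naturalness problem (which, as discussed later in this issue, is itself contested): the observed Higgs mass arises as the difference of two individually enormous contributions,
\[
m_H^2 \;=\; m_0^2 \;-\; c\,\Lambda^2 ,
\]
where \(\Lambda\) is a large cutoff scale, say of order \(M_\text{Planck}\), \(m_0\) is a bare mass parameter, and \(c\) is a dimensionless coefficient. For \(m_H\approx 125\) GeV to come out with \(\Lambda\sim 10^{19}\) GeV, the two terms on the right-hand side must cancel to better than 30 decimal places. Naturalness, in this reading, is the demand that no such delicate cancellation be required.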

Many possible “solutions” to the naturalness problem (assuming it is a problem) have been proposed over the last few decades: Supersymmetry, Extra Dimensions, Technicolor, Little Higgs, to name just some of the most popular ones. Most of them require physics beyond the SM roughly at the TeV scale. Absence of any such signals at the LHC is ruling out more and more of these models, or at least large parts of their parameter spaces. This has begun to put the naturalness principle, and maybe even physicists’ criteria for theory choice in general, into question.

Because of its extra-empirical character, discussions around the naturalness principle have always had a philosophical component. Philosophers and historians of physics recognized the philosophical importance of this principle years ago, and the current situation is making it ever more attractive to the philosophical community. The fact that the expectations set by the naturalness principle have been consistently confounded by the experimental findings of the LHC prompts us to carefully examine, or re-examine, the following questions, which are of considerable interest to physicists, philosophers of physics, and historians of physics alike: How is a quantum field theory (QFT) mathematically defined? Which parts of the QFT formalism represent features of reality, and which are mere artifacts of mathematical convention? Do the historical case studies advanced in support of the naturalness principle, such as the discovery of the charm quark, actually illustrate the same principle that was invoked to predict the appearance of new physics at the LHC, or do they bear merely a surface similarity to it? These questions transcend traditional disciplinary boundaries and often require a balanced mixture of conceptual, mathematical, and historical analysis. At a workshop on “Naturalness, Hierarchy, and Fine Tuning” in Spring 2018, organized at RWTH Aachen University, Germany, by members of the research collaboration “The Epistemology of the Large Hadron Collider”, scholars from various disciplines, including physics, philosophy, and the history of science, gathered to exchange their views on these issues.

Most of the contributions in this volume are from participants of this workshop. They cover the topic from various perspectives. Borrelli and Castellani take a historical view. Along with an outline of the development of the naturalness principle and of the term “natural” in the context of physics, they examine the role of Wilson’s work on renormalization in inspiring the naturalness principle and compare the notions of naturalness advanced in three seminal papers by Susskind, ’t Hooft, and Veltman.

A critical discussion of a specific recent formulation of the Higgs naturalness problem, which concerns renormalized parameters rather than bare parameters as in the original formulation of the problem, is given by Harlander and Rosaler. They argue that the alleged problem rests on attributing too much physical significance to the numerical values of renormalized parameters in an effective field theory (EFT). While the delicate cancellation purported to lie at the heart of the naturalness problem is commonly presented as a conspicuous cancellation between ostensibly independent physical quantities, they argue that this cancellation is in fact merely an eliminable artifact of mathematical convention. This dependence on arbitrary mathematical convention is taken to the extreme in (perturbatively) renormalizable quantum field theories, where the numerical values of the parameters cannot even be sensibly defined without first arbitrarily specifying a renormalization scheme. In this sense, the authors claim, the naturalness problem of the SM is ill-posed and untenable.
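As a schematic way of stating the scheme dependence the authors point to (this is an illustration of the general point, not a reproduction of their argument): in an EFT, the renormalized mass parameter of the Higgs field is defined only relative to a renormalization scheme, and any two admissible schemes are related by a finite, convention-dependent shift,
\[
\bar m^2(\mu) \;=\; m^2(\mu) \;+\; \delta_\text{finite}(\mu) ,
\]
so a statement about a delicate cancellation in the numerical value of \(m^2(\mu)\) can be made to appear or disappear by an allowed change of convention.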

In his contribution, Bain considers four arguments for adhering to naturalness, but argues that only one of them is tenable. He claims that, on the one hand, only in a single historical case can one argue that naturalness made a successful prediction (the case of the charm quark mass), while other cases of natural EFTs must be considered post-dictions. On the other hand, there are several counterexamples where naturalness appears to fail. He goes on to argue against considering natural theories more likely than others, for example on the grounds that they do not need to be fine-tuned: after all, the notion of fine-tuned parameters requires a probability distribution for these parameters, which is unknown in fundamental theories. Williams’ “central dogma” of a decoupling of widely separated scales is dismissed as being different from the naturalness requirement. Finally, Bain turns to the remaining argument for natural EFTs, namely their compatibility with the concept of emergence, where emergent phenomena are understood as being both dependent upon, and novel with respect to, more fundamental theories.

The article by Jegerlehner claims that the SM does not suffer from a naturalness problem to begin with. It describes a comprehensive renormalization group analysis, finding that the quadratic term of the Higgs potential changes sign at energies below the Planck scale. This implies a transition to a phase in which the Higgs vacuum expectation value vanishes. Interpreting the vacuum energy induced by the Higgs potential as a cosmological constant, Jegerlehner claims that this phase transition solves both the fine-tuning problem of the Higgs mass and that of the cosmological constant; the latter problem is also the subject of the article by Gubitosi, Ripken, and Saueressig.
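The statement about the sign change can be made concrete with the standard parametrization of the Higgs potential; the following is a textbook illustration rather than a summary of Jegerlehner’s specific analysis:
\[
V(\phi) \;=\; \frac{m^2(\mu)}{2}\,\phi^2 \;+\; \frac{\lambda(\mu)}{4}\,\phi^4, \qquad \lambda>0 .
\]
For \(m^2<0\) the minimum of \(V\) lies at a nonzero field value, \(v=\sqrt{-m^2/\lambda}\) (the broken phase), whereas for \(m^2>0\) it lies at \(\phi=0\), so the vacuum expectation value vanishes. A running \(m^2(\mu)\) that changes sign below the Planck scale therefore separates the two phases.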

While the ratio between the observed and the “natural” value of the Higgs mass is about \(10^{-32}\), for the cosmological constant this ratio is of the order of \(10^{-120}\). In this context, however, one could argue that the proper theory to explain this number does not even exist, since, as of today, there is no viable quantum theory of gravity. Irrespective of that, the concept of asymptotic safety might provide an answer to this fine-tuning problem. The necessary prerequisite is the existence of a non-Gaussian ultraviolet fixed point of gravity. The contribution by Gubitosi, Ripken, and Saueressig reviews the corresponding formalism, possible approaches for a non-perturbative solution of the renormalization group equations in a gravitational theory, and a number of promising results that have been obtained in this field so far.
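Schematically, and independently of the specific truncations reviewed in their contribution, a non-Gaussian ultraviolet fixed point means that the dimensionless couplings \(g_i(k)\) of the gravitational effective action approach finite values as the renormalization group scale \(k\) is sent to infinity, with at least some of them nonzero:
\[
k\,\frac{\mathrm{d}g_i}{\mathrm{d}k} \;=\; \beta_i(g_1,g_2,\dots), \qquad \beta_i(g_1^*,g_2^*,\dots)=0 \quad \text{with } g_i^*\neq 0 \text{ for some } i .
\]
A simple example is the dimensionless Newton coupling \(g(k)=G_N(k)\,k^2\). If the fixed point comes with only finitely many relevant directions, the ultraviolet behavior of the theory is fixed up to finitely many free parameters, which is the sense in which asymptotic safety could render gravity predictive at arbitrarily high energies.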

Judging whether a parameter assumes a “natural” value clearly presupposes some knowledge of a (possibly hypothetical) probability distribution for that parameter. This is certainly problematic, because it requires some “meta-theory” to determine that distribution. A careful analysis of this aspect is given in the contribution of Wells. The discussion takes into account the renormalization group flow of the parameters of a quantum theory, which induces a probability flow that may strongly alter the distribution. Wells’ conclusion is that naturalness might still be a good guiding principle after all, but not a strict one: in analogy with statistical fluctuations, a few theories may be fine-tuned, but they should be considered exceptional cases.
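The notion of a probability flow can be illustrated schematically; this is a generic change-of-variables statement, not a reproduction of Wells’ analysis. If a parameter \(g\) is assigned a distribution \(p_\Lambda\) at a high scale \(\Lambda\), and the renormalization group maps \(g(\Lambda)\) to \(g(\mu)\) at a lower scale \(\mu\), then the induced distribution at \(\mu\) is
\[
p_\mu\big(g(\mu)\big) \;=\; p_\Lambda\big(g(\Lambda)\big)\,\left|\frac{\partial g(\Lambda)}{\partial g(\mu)}\right| ,
\]
so a distribution that looks unremarkable at \(\Lambda\) can be strongly compressed or stretched at \(\mu\), and vice versa.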

The contribution by Friederich provides a more general discussion of fine-tuned (or unnatural) theories. The fact that relatively small variations in the numerical values of the parameters of our universe (for example the difference between the masses of the neutron and the proton) would prevent the formation of stable complex matter, and thus the existence of life, has been used by many to argue that our universe is just one of many. The more universes there are, the larger the probability of finding at least one that allows for the emergence of life. This argument has also been used to “explain” the Higgs naturalness problem. However, it has been criticized by many as an instance of the “inverse gambler’s fallacy”: the fallacious inference from the observation of a relatively unlikely event (e.g. a double six in a throw of two fair dice, or a small Higgs mass) to the conclusion that this event is part of a multitude of similar events (that the dice have been thrown many times before, or that many universes have been created). Friederich suggests a new argument in favor of the multiverse which avoids the inverse gambler’s fallacy by separating the “background knowledge” of the existence of life from actual fine-tuning arguments.

The multiverse conjecture is particularly popular in the context of string theory, which is currently believed to imply a vast number of possible universes. In this context, an alternative notion of naturalness has been proposed. While the common idea of the multiverse solving the fine-tuning problem is a purely statistical one (the more universes, the more likely it is to find one with a small electroweak scale), the so-called string-naturalness proposal is to call a theory “natural” if it arises from a “typical” string vacuum. In his contribution, Williams points out how vastly different such a notion is from the usual definition of naturalness, which prohibits a strong dependence of long-distance phenomena on short-distance physics, and discusses the consequences for particle phenomenology.
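The purely statistical point can be stated in one line, under the simplifying (and by no means established) assumption that each universe independently has some small probability \(p\) of exhibiting a small electroweak scale: the probability that at least one out of \(N\) universes does so is
\[
P(\text{at least one}) \;=\; 1-(1-p)^N \;\longrightarrow\; 1 \quad \text{as } N\to\infty .
\]
String-naturalness, by contrast, asks how typical such a vacuum is within the landscape, which is a different question.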

Finally, the article by Päs goes even beyond the multiverse, suggesting that the multiverse is a necessary consequence of regarding the universe as a single entangled quantum state. He argues that an understanding of naturalness is tightly related to the question of what characterizes a truly fundamental theory. He also points out a possible relationship between these views and recent developments in gravitational theory, such as the “ER = EPR” conjecture.

Even though it is impossible to do full justice to the concepts of naturalness, fine tuning, and hierarchy within a single volume, we believe that this Special Issue represents the diversity of views and formulations of these issues and their current status within high-energy physics. As editors, we would like to thank all authors for their time and effort in preparing their contributions.