1 Introduction

Over the last century, the concept of particle has emerged as fundamental in the field of physics. It encompasses a broad spectrum of entities, elementary or composite, referring to small stable or unstable (decaying) units that exist independently or as constituents of larger systems. Mathematically modeled according to the principles of relativistic quantum mechanics, they are distinguished by the properties of mass, charges, and spin. From today's perspective, elementary particles not only act as constituents of matter but also mediate interactions. This is why their study has proven essential to understanding nature at the microscopic level. It was notably at the turn of the nineteenth and twentieth centuries, around questions related to the existence of the electron and the composition of atoms, that the first techniques were developed to examine the properties and interactions of elementary particles. In turn, a deeper understanding of the underlying physics resulted in the development of new techniques and the discovery of new particles, and so on. This progression led to the theoretical development and experimental verification of the Standard Model, currently our best theory for describing elementary particles and their interactions. In fact, nowadays, experimental tests of the Standard Model tightly restrict the options for new particles conjectured to describe as-yet-unexplained phenomena, such as dark matter. In order not to interfere with current precision measurements, such new particles must either be very weakly coupled to the known particles, or be very heavy, or both.

The way to search for new particles depends on whether they are stable or not. If they are stable, they should occur with a certain abundance in our environment. If the abundance is large, the particles must interact very weakly with the known particles, because otherwise we would have seen traces of them already. Many dark matter candidates are of this kind, for example [15]. For the same reason, the stronger the interaction of any stable particle, the rarer it must be. A candidate of this kind is the magnetic monopole [103]. In both cases, one possible search strategy is to build detectors with a large fiducial volume through which the new particles in our environment can pass, and to look for events that indicate their interaction with the detector material. Many such experiments are currently in operation, while others are planned or under construction (see, e.g., [128]).

Unstable particles, on the other hand, first need to be produced before they can be observed. More precisely, from today's point of view, the observation of an unstable particle requires its on-shell production in particle collisions, which means that its energy \(E\) and its 3-momentum \(\overrightarrow{p}\) must combine with its rest mass \(m\) as \(\sqrt{{E}^{2}-{\overrightarrow{p}}^{2}{c}^{2}}=m{c}^{2}\), where \(c\) is the speed of light. The minimal energy required to produce the particle is thus \(E=m{c}^{2}\). Particle collisions happen all the time in nature. For example, muons were discovered by detecting them, within their average lifetime of 2.2 microseconds, after they had been produced by collisions of cosmic protons with the Earth's atmosphere. An interesting application of naturally occurring particle collisions would be the observation of the pair-annihilation of dark matter particles into known Standard Model particles, as currently searched for by the AMS detector located on the International Space Station [1]. Unfortunately, the flux of highly energetic cosmic particles decreases strongly with increasing energy, so their use for producing and studying particles is quite limited. This is why physicists resolved early on to create particle collisions in the laboratory, which makes it possible to confine the particle reactions to a small region in space. The progressive development of particle colliders has significantly shaped the field of particle physics over the past seven decades.
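As a simple numerical illustration of the on-shell condition (our own sketch, with values chosen by hand rather than taken from any experiment), the following Python snippet reconstructs the invariant mass from a hypothetical energy and momentum pair and compares it with the muon mass; energies and momenta are given in GeV with \(c=1\).

```python
# Minimal sketch of the on-shell condition, using natural-style units
# (energies and momenta in GeV, c = 1). The (E, p) pair below is a
# made-up example, not measured data.
import math

def invariant_mass(E, p):
    """Reconstruct sqrt(E^2 - p^2), i.e., the rest mass implied by (E, p)."""
    return math.sqrt(E**2 - p**2)

m_muon = 0.1057            # muon mass in GeV/c^2
E, p = 10.0, 9.99944       # hypothetical energy and momentum in GeV

print(invariant_mass(E, p))   # ~0.106 GeV, consistent with an on-shell muon
print(m_muon)                 # minimal production energy E = m c^2
```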

The two main metrics of particle colliders are precision and energy. High-precision colliders are characterized by a large event rate and a small background rate. Examples are the B-factories, like Belle II in Tsukuba (or the earlier experiments BaBar and Belle), which are focused on the properties of B-mesons and thus operate at energies of around 5–10 GeV [20]. Similarly, the energy of DAΦNE in Frascati was tuned to produce bound states of the strange quark and its anti-quark with a mass around 1 GeV/\({c}^{2}\) (the so-called Φ-meson), and the various generations of VEPP colliders in Novosibirsk studied the production of hadrons in the energy region around 1–2 GeV [3, 88]. Such experiments are not expected to discover any new fundamental particles but to test our models and to refine our knowledge of their parameters. Nevertheless, similar to other particle physics precision experiments, like the g−2 (pronounced "g minus 2") experiment at Fermilab, they can give important hints for the existence of new particles, because the latter could impact the precision observables through virtual effects. In fact, both the B-factories and the g−2 experiment have produced measurements which are or were in disagreement with the Standard Model expectations (see more about this in Sect. 3.4). But even if these anomalies were further confirmed, they would not count as the discovery of a particle according to current practice.

High-energy colliders, on the other hand, are mostly aimed at exploring new territory in the particle spectrum. Over the past half century or so, the increased understanding of the fundamental constituents of nature has motivated the construction of ever more powerful particle colliders. For the first few decades of the particle collider age, their energy grew exponentially, doubling about every six years, as illustrated by the so-called Livingston plot (Fig. 1). Every step in this process increased the potential to discover new, heavier particles. For example, the construction of the Super Proton–Antiproton Synchrotron (Sp\(\overline{\mathrm{p} }\)S) at CERN allowed for the discovery in 1983 of the weak gauge bosons W and Z, while the discovery of the top quark in 1995 required the construction of the Tevatron at Fermilab, and the Higgs boson could not be discovered before CERN’s Large Hadron Collider (LHC) was available. This most recent discovery of an elementary particle in 2012 provided the final element of the Standard Model. Without the Higgs field, the theory would make self-contradictory predictions for some processes that can be observed at the LHC. In this sense, the LHC was a “no-lose” experiment [34]. It was clear that it would either discover the Higgs boson as predicted by the Standard Model, or it would disprove the Standard Model and provide clear indications for new physics.

Fig. 1 Time evolution of the energy reach of particle colliders. Updated version by Jordan Nash [107] of a plot produced by the NLC ZDR Design Group and NLC Physics Working Group at SLAC [91]

However, the particle content of the Standard Model is now experimentally confirmed. The overall high-precision agreement with experimental measurements indicates that it constitutes a self-consistent theory up to energies way beyond those reached at the LHC. This means that it does not provide any clues as to when to expect the next particle discovery, if any. From a theoretical point of view, it is entirely possible that the mass scale for physics beyond the Standard Model is many orders of magnitude larger than the electro-weak scale, characterized by the Higgs mass of 125 GeV/\({c}^{2}\). The possibility of such a scenario is well known, of course; already in the 1980s, the perspective of emptiness in the energy interval between the electro-weak scale and the GUT scale—the energy level above which the electromagnetic, weak and strong force would be unified—then estimated to be around \({10}^{14}\) or \({10}^{15}\) GeV,Footnote 1 was popularized by the metaphor of a physics "desert" (see, e.g., [74]). This desert could even extend to the Planck scale, \({M}_{P}= \sqrt{\hslash c/{G}_{N}}\approx {10}^{19}\) GeV/\({c}^{2}\), where the clear manifestation of new physics induced by quantum effects of gravity is expected (\(\hslash \) is the reduced Planck constant and \({G}_{N}\) is Newton's constant of gravity). And with the completion of the Standard Model, we could be standing right at the edge of such a desert, as we will discuss in more detail below. It is, however, impossible to say how large it actually is. A short walk, if it does not lead to some fruitful oasis, may even take us across this desert into vast unexplored territories. But it may equally well be impossible to traverse it with current methods.
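As a quick numerical check of the Planck scale quoted above (a sketch of standard textbook arithmetic, not part of the original argument), one can evaluate \({M}_{P}= \sqrt{\hslash c/{G}_{N}}\) from the measured constants:

```python
# Evaluate the Planck mass M_P = sqrt(hbar * c / G_N) from CODATA-style values.
import math

hbar = 1.054571817e-34    # reduced Planck constant, J s
c    = 2.99792458e8       # speed of light, m / s
G_N  = 6.67430e-11        # Newton's constant, m^3 / (kg s^2)

M_P_kg  = math.sqrt(hbar * c / G_N)           # Planck mass in kg
M_P_GeV = M_P_kg * c**2 / 1.602176634e-10     # rest energy converted to GeV

print(M_P_kg)    # ~2.18e-8 kg
print(M_P_GeV)   # ~1.22e19, i.e. of order 10^19 GeV/c^2 as quoted above
```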

The situation is aggravated by the observation that, with the LHC, the last step in the Livingston plot of Fig. 1 already lags behind the exponential growth. And the trend toward saturation would continue even with the intended designs of future projects such as the Future Circular Colliders at CERN, whose proton–proton collider (FCC-hh), according to the most optimistic scenarios, could begin operation by 2065 and reach a center-of-mass energy of 100 TeV [18, 155]. Partly, this break is due to financial reasons, of course, considering that the construction costs of colliders have also increased roughly linearly with their energy [130]. More serious in this respect, though, are technical challenges and even physical constraints. Some of them indicate a strict maximal energy accessible to particle colliders, unless significant technological breakthroughs are made (see, e.g., [17, 131]). For example, in the case of a circular collider, the dominant problem is synchrotron radiation, which increases with the fourth power of the particles' energy. To compensate for the associated power loss when doubling the collider energy, one would need to increase the radius of the collider by a factor of sixteen. Given the Earth's radius, this implies an estimate of the maximal reach of a circular proton collider of about 1000 TeV. For an \({e}^{+}{e}^{-}\) collider, this limit is much lower because synchrotron radiation also scales with the fourth power of the inverse mass. In parallel, the innovative prospect of plasma wakefield technology lets us expect compact linear crystal muon colliders of higher energy. For a length of 10 km, however, they would also remain limited to a maximum energy of the order of 10 PeV [131]. It therefore seems legitimate to take this energy as an order-of-magnitude upper bound on the reach of any future particle collider.
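The radius estimate follows from simple scaling: if the energy lost to synchrotron radiation per turn grows like the fourth power of the beam energy and falls off with the bending radius, keeping the loss fixed while doubling the energy requires a sixteenfold larger ring. A back-of-the-envelope sketch (our own illustration of this scaling, assuming a loss \(\propto {E}^{4}/R\)):

```python
# Back-of-the-envelope scaling sketch: assume the synchrotron energy loss per
# turn scales as E^4 / R for a given particle type, and ask how much the
# radius R must grow to keep that loss constant when the energy is raised.
def radius_scale_factor(energy_factor):
    """Factor by which R must grow so that E^4 / R stays constant."""
    return energy_factor ** 4

print(radius_scale_factor(2))    # 16, the factor quoted in the text
print(radius_scale_factor(10))   # 10^4 for a tenfold energy increase
```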

Of course, this does not mean that future particle discoveries are excluded. It is entirely possible that new physics is just around the corner. It is not unreasonable to assume that the successful history of particle discoveries at colliders continues, and that the next collider will open the avenue toward a new sector of particles. Similarly, the current or the next generation of dark matter experiments may be sufficiently sensitive to provide conclusive signals for a new kind of very weakly interacting particles. But, as will be discussed specifically in this article, we still have to face the possibility that the age of particle discoveries as we know them today is over. In fact, it is obvious that it will be over at some point, but the present situation is special in that there are no compelling experimental or theoretical indications for new physics significantly lighter than the Planck mass. This is in contrast to the past 100 years or so, during which the theoretical consistency of the then-best underlying theory often implied the prospect of new particle discoveries. Examples of this are the positron, which was expected from the Dirac equation, the neutrino from energy conservation, a number of hadrons and eventually the quarks and gluons from internal symmetries, all the way to the W and Z bosons and finally the Higgs boson, which were needed for reasons of unitarity. With the discovery of the Higgs boson, however, no such trusted theoretical arguments point toward new particles in the accessible energy range any longer, as our developments will show.

Nevertheless, the end of discoveries of new particles in today's sense would not entail the end of progress in fundamental physics altogether. Nor would it invalidate the construction of a next-generation collider. After all, quantum physics implies that measurements at one particular energy \(E\) are affected by the physics at all mass scales \(M\) (or, equivalently, distance scales \(l=\hslash c/M\)) through virtual effects, albeit suppressed by some power of \(E/M\). Rather, we may soon experience a change in the scientists' perception of discoveries in fundamental physics, which may no longer be tied to the current notion of particle. This is all the more plausible if we also take into consideration that this notion is based on quantum field theory, a theory which implies many phenomena that cannot be described by particles as we define them today. In other words, we could be witnessing the end of an era in which particles were central to our understanding of nature. To justify the relevance of this questioning, as well as to tackle some of its potential far-reaching implications, we wish to adopt a historical standpoint in this paper. Indeed, this view is apt to reveal important specificities of the development of particle physics. In particular, it underlines the close relationship between the evolution of observational methods and the understanding of the very idea of particle. In this sense, rather than an approach limited to a description of the different steps in the development of the field of particle physics, the focus of our analysis will be on the concept of particle itself, on its perception and use by physicists.

The expression "particle" itself has a history that stretches from antiquity to the present day. During this period, its most elementary definition, which dates back to atomism in ancient Greek philosophy, that of a spatially localizable discrete entity, has faced different challenges in natural philosophy and science. Such challenges—wave–particle duality being probably the best example—in fact led the physical concept of particle to undergo a significant evolution across time. Nowadays, it seems difficult to univocally associate a modern concept like the virtual particle with the antique idea of the atom. Nevertheless, from our historical approach, we will see that each step in this evolution naturally generalized the previous notion of a particle in such a way that the term "particle" has persisted and even prevailed to denote the fundamental entities of nature. We will also examine how these generalizations opened up new ways for particle observation, and thus for the discovery of new particles. As such, this approach will provide avenues of reflection as to how the current situation could be addressed, and how some of its consequences might be perceived. It will lead us to argue that, in the current situation, particle physics will (have to) undergo yet another evolutionary step in particle observation and maybe even in the concept of particle itself.

No one can say what the future will be made of, and toward the end, this article will inevitably take on a speculative dimension that we fully acknowledge. But our goal is not to solve the current problems of high-energy physics. We first and foremost wish to draw attention to the fact that the dynamics of their resolution must, in our opinion, be part of a conceptual reconfiguration, of the kind particle physics has already experienced several times in the past. For this purpose, this paper is divided into three main sections, related to the past, the present, and the future, respectively. Section 2 presents a history of the development of the concept of particle. It puts forward a concept whose illustrative, operational, and heuristic values stand out in practice and have shaped the particle era, especially since the end of the nineteenth century, when it imposed itself as fundamental for the development of physics. It also reveals a very diverse and progressive path of development concerning the particle properties relevant to their discovery and observation. Section 3 discusses the current status of particle physics, paying particular attention to potential sources of hints of new physics. The respective analyses of observational and theoretical sources, as well as experimental anomalies, then reveal that explaining the shortcomings of our current model does not necessarily imply discoveries based on our current notion of particle observation. Section 4 builds on the previous developments to discuss the exploration of the desert ahead. It raises the question of the adequacy of current practices to account for the gap that separates us from the Planck mass and asks about the prospects for future discoveries, highlighting in particular the possibility of an increased interest in virtual effects. Finally, in conclusion, the concept of particle is examined again to show that, if it is not deeply reconfigured, it will experience a weakening of its heuristic and operational values.

2 History of the development of the concept of particle

2.1 From the Greeks to Thomson and Rutherford: particles as building blocks of matter

The concept of particle found its original expression in antiquity in the work of the first known atomists, Leucippus and Democritus. According to the fragments of texts by these two philosophers, the atoms form the most elementary constituents of visible bodies; they are indestructible, homogeneous—i.e., they have no internal structure—able to attach to each other and to move in the void [136]. Nevertheless, until the turn of the eighteenth and nineteenth centuries, there was no empirical evidence for the existence of atoms or other elementary particles. This makes the discussion of their existence so early all the more remarkable. One can attribute the longevity of the atomic conjecture to its high heuristic power. The hypothesis of atoms was useful in finding explanations for what is described in philosophy as the world of appearances. The different compositions of the invisible atoms account for the variability of the phenomena, and the invariant elements in the processes of change are reduced to the persistence of atoms. Despite its plausibility, however, the particle idea did not play an important role in natural science until modern times. This can notably be explained by the influence of an important Aristotelian postulate that could only be refuted with the proof of the existence of the vacuum: atomism was faced with horror vacui, or plenism, which assumed the continuity of matter, since the void between the particles was a non-being that could not exist [120].

The first empirical evidence for the existence of atomic particles came from chemistry. From the investigations of Joseph Proust and John Dalton on compounds, it became apparent that different substances combine in constant proportions representable as ratios of whole numbers [49, 119]. The atomic masses of different chemical elements began to be determined in the first half of the nineteenth century, and today, the periodic table can be seen as a paradigm of the application of the particle concept. However, since atoms themselves could not be detected—at least until Max von Laue demonstrated in 1912 the arrangement of atoms in a crystal using X-ray diffraction—the atomic hypothesis long remained controversial among scientists.Footnote 2 A very prominent example of the tension between continuous and discrete approaches in science is the bitter battle at the turn of the nineteenth and twentieth centuries between Wilhelm Ostwald, a fervent defender of an "energeticist" approach to matter, and Ludwig Boltzmann, an ardent partisan of the atom [54]. The progressive path toward the confirmation of the atomist viewpoint then took a paradoxical turn: the study of the properties of matter led to the destruction of the notion of atomic indivisibility, which dated back to antiquity.

After decades of speculation about electrically charged particles, the electron was discovered in 1897 as a consequence of experiments on cathode rays in electric and magnetic fields.Footnote 3 The detection by fluorescence of their deflected trajectories led Joseph John Thomson to the "inescapable" conclusion that cathode rays are made up of particles of matter, whose ratio of mass to charge was found to be much lower than that of the hydrogen ion [142].Footnote 4 Note that Thomson did not observe individual electrons. His reasoning was above all sustained by the a priori theoretical conception that matter is composed of particles. The strength of his argument was to interpret the results of his experiments in such a way as to provide a first sufficiently convincing notion of particle observation. Rather than localizability, the crucial property for Thomson was the possibility of putting the behavior of cathode rays in electric and magnetic fields in correspondence with the kinematics of massive charged bodies. This example thus shows that the particle conception also exerted its heuristic power in the context of discoveries.

Later, in further interpreting his work, Thomson did not completely abandon the old conception of the atom as an indestructible and homogeneous particle. In 1904, his so-called plum pudding model assumed electrons (plums) enclosed in a sphere (pudding) of uniform positive charge [143]. Ernest Rutherford, whose early work on the penetration and magnetic deflection of radioactive emissions had also established the particle nature of \(\alpha \)-rays, dismissed this position in 1911. In an experiment designed to measure how an \(\alpha \)-particle beam is scattered when it strikes a thin gold foil, the observation of large deflection angles, unexpected in Thomson's model, confronted Rutherford with evidence of the complex and relatively empty structure of the atom. His new model postulated a nucleus, concentrating most of the mass and all the positive charge, orbited by electrons [126]. Notably, Rutherford's scattering method, in which the kinematics of a beam of charged particles aimed at a target is studied, subsequently became paradigmatic in particle physics and forms the basis of today's accelerator experiments.

In the end, the discovery of the nucleus was the culmination of a movement that, by revealing the structure of the atom, established the role of particles as the building blocks of matter and turned the physicists' attention toward the subatomic world. A major guiding principle for this movement was its orientation toward Newtonian mechanics, as illustrated by Rutherford's atomic model, in which the nucleus is orbited by electrons like the sun is surrounded by planets. One can therefore speak of a classical particle concept, characterized by the specification of conservation laws (mass, energy, charge) and by strictly deterministic spatiotemporal motions of spatially localizable discrete entities [65, pp. 210–213]. We must then underline an important mechanism at play here, which, as our historical developments will show, has been repeated at different stages of the development of particle physics. The process that led to the establishment of the concept of the classical particle was based on the theoretical attribution to the notion of particle of different properties that account for phenomena related to matter. In chemistry, it is the introduction of the notion of atomic mass that initially gave empirical credit to the atomic hypothesis and deeply renewed the field. Moreover, the attributes of mass and charge ascribed to particles allowed the unveiling of the subatomic world. More precisely, these two properties offered physicists the opportunity to conceive what they considered to be the first methods of particle observation. These methods were based on a simple principle of kinematics: in classical physics, two entities with the same mass-to-charge ratio follow an identical trajectory in vacuum when subjected to the same electric and magnetic fields.
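As a concrete illustration of this kinematic principle (a sketch in the spirit of Thomson's cathode-ray measurements, with made-up field and deflection values rather than his actual data), crossed electric and magnetic fields select the velocity \(v=E/B\), and the deflection in the electric field alone then yields the charge-to-mass ratio:

```python
# Illustrative charge-to-mass determination from deflection kinematics.
# All numerical inputs are assumed values chosen for the example only.
E_field = 1.0e4     # V/m, electric field between the deflecting plates
B_field = 5.0e-4    # T, magnetic field used for velocity selection
L       = 0.05      # m, length of the deflecting plates
y       = 1.25e-3   # m, observed deflection at the end of the plates

v = E_field / B_field                        # velocity selected by crossed fields
q_over_m = 2 * y * v**2 / (E_field * L**2)   # from y = q E L^2 / (2 m v^2)

print(v)          # 2e7 m/s
print(q_over_m)   # 4e10 C/kg with these invented numbers
                  # (the modern electron value is ~1.76e11 C/kg)
```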

Rutherford's work had, however, a contradictory character: according to the laws of classical electrodynamics, his planetary model of the atom would turn out to be unstable, with electrons collapsing into the nucleus. Thus, at the very moment when the classical particle concept proved central to exploring the discrete character of matter, its heuristic power was already exhausted with this final proof. However, with the subsequent experimental evidence of the discrete character of light–matter interactions at the beginning of the twentieth century, the meaning of the particle concept was redefined. Non-classical features then allowed the concept to extend its usefulness for research into the foundations of physics.

2.2 The twentieth-century revolution: particles go quantum

2.2.1 Particles as constituents of radiation

Similar to investigations into the structure of matter, the history of the concept of light until the beginning of the twentieth century is also characterized by a form of tension between discrete and continuous pictures, illustrated in particular by the opposition at the turn of the seventeenth and eighteenth centuries between the wave approach of Christian Huygens and the corpuscular approach of Isaac Newton to optics. However, while the atomists met with success during the nineteenth century, it was the wave point of view that prevailed for light, most especially thanks to the interference experiments of Thomas Young around 1800 and their theoretical description by Augustin Jean Fresnel in 1818 [51]. Later, in 1873, developments in the theory of electromagnetism led James Clerk Maxwell to suggest that visible light (as well as invisible infrared and ultraviolet rays by inference) consists of propagating disturbances (radiation) in the electromagnetic field [50].

Nevertheless, it is well known that the discretization of energy and the resurgence of the corpuscular model of light soon became foundations for the development of the quantum theory [92]. One of the first significant deviations from the classical image of the particle thus resulted from Max Planck's work on black-body radiation in 1900. To interpret the color variations of an incandescent body as a function of temperature and to solve different mathematical issues related to this problem, Planck had to suggest that energy is quantized, i.e., that for each frequency it is emitted in packets of energy, also called quanta [118]. This subversive proposition (energy had become discrete!) was taken up by Albert Einstein in 1905. In his explanation of the photoelectric effect, he postulated that all electromagnetic radiation can be divided into a finite number of "quanta of energy." He also clarified that the latter are "localized points in space [that] move without dividing, and can be absorbed or generated only as a whole," dealing with these so-called light quanta—the term photon was introduced only in 1926 by Gilbert Lewis—as elementary particles ([60], p. 133, [96]). In addition to being accepted as building blocks of matter, particles were now also recognized as constituents of radiation. The parallel development of special relativity moreover firmly established the particular nature of the photon as a massless particle that cannot be at rest.

The quantization rules resulting from the discretization of the electromagnetic field led to profound changes in the way physicists conceived of matter and its interactions, and therefore significantly enriched the concept of particle. Of particular importance, Niels Bohr developed in 1913 a new, stable model of the atom that replaced Rutherford's [26, 90]. For this, he suggested that electrons move in discretely distributed stable orbits and assumed that they can only gain or lose energy by jumping from one allowed orbit to another, absorbing or emitting electromagnetic radiation. Later, in the early 1920s, a quantum approach to the anomalous Zeeman effect, discovered in 1898, led various physicists such as Wolfgang Pauli, George Uhlenbeck, and Samuel Goudsmit to forge the notion of spin, which has no equivalent in the classical picture [44].

2.2.2 Wave–particle duality

Non-classical features of the particle concept included not only the discretization of light and the attribute of spin, but also a fundamentally new relationship to wave phenomena. In the annus mirabilis of 1905, the introduction of the notion of photon to account for the photoelectric effect was soon accompanied by an unexpected twist. In his paper that introduced special relativity, Einstein treated the phenomenon of light as a continuous field of waves [61]. This apparent contradiction actually testifies to the fact that he had embraced the idea that light has a dual nature, and it was in 1909 that, for this special case, he formally introduced wave–particle duality into physics [62]. Nevertheless, this idea received little consideration until the observation in 1922 of the Compton effect. Resulting from the inelastic scattering of light by an electron, it made it possible to attribute a particle-like momentum to photons [133].

Two years later, Louis de Broglie thus formally established the relation between wavelength and momentum, λ = h/p, and developed the hypothesis that all matter has a wave nature; that is, each particle can exhibit wavelike behavior [53]. This revolutionary assumption—confirmed experimentally in 1927 by George Thomson and Andrew Reid, but also, independently, by Clinton Davisson and Lester Germer, who observed interference fringes in an electron diffraction experiment [52, 141]—genuinely linked the contradictory aspects of discrete and continuous properties of matter. In addition, wave–particle duality became central to quantum mechanics, as illustrated by its role in the development of Schrödinger's "matter wave" equation and Heisenberg's uncertainty relations, as well as Born's probabilistic interpretation and Bohr's principle of complementarity ([35]; [127], pp. 37–38 and 48–49). From then on, particles were no longer expected to behave experimentally, or to be approached theoretically, like classical ones, but to be characterized by the predominant and intrinsic feature of wave–particle duality.
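As a brief numerical aside (our own illustration, not part of the historical account), the de Broglie relation gives atomic-scale wavelengths for electrons of the energies used in such diffraction experiments, e.g. roughly 54 eV:

```python
# de Broglie wavelength lambda = h / p for a non-relativistic electron
# of 54 eV kinetic energy (roughly the regime of the 1927 diffraction
# experiments); the choice of energy is purely illustrative.
import math

h   = 6.62607015e-34        # Planck constant, J s
m_e = 9.1093837015e-31      # electron mass, kg
E_k = 54 * 1.602176634e-19  # kinetic energy, J

p = math.sqrt(2 * m_e * E_k)   # non-relativistic momentum
print(h / p)                   # ~1.67e-10 m, comparable to atomic spacings
```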

2.2.3 Particles seen as operational

To understand in more depth the impact of wave–particle duality on the concept of particle, one must take a closer look at the Born rule and its interpretation. This elementary postulate of quantum mechanics—developed by Max Born in 1926 in the context of scattering theory—states that the probability density of finding a particle at a given point, when measured, is proportional to the squared magnitude \(|\Psi|^2\) of the particle's wave function at that point [27]. Ψ thus represents the mathematical description of a diffracted wave, and the "rule" makes the link between the mathematical formalism of quantum mechanics and experiment. In his probabilistic interpretation, Born therefore considered the waves that propagate in a system as probability waves, for which the Schrödinger equation predicts the probabilistic distribution of the scattered particles, in other terms, the probability of finding the particles experimentally at any point in space. As advanced by Brigitte Falkenburg, the wave–particle duality is understood in this context as the "duality of probability waves and particle detections […] Operationally, there are particles. Axiomatically, there are fields and waves" [65, p. 271]. This view can be complemented by a specific approach to the uncertainty principle, which asserts a fundamental limit to the accuracy with which the values of complementary variables for a particle, such as position and momentum or time and energy, can be measured. If one gives up considering the particle as a spatially localized object characterized by definite values (of position and momentum, say), and instead takes it as a wave having physical extension in space, it can be represented by a wave function which describes its spatial distribution and contains all the information relating to this "particle." Measurements then consist only in extracting part of this information by means of mathematical operators.
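A minimal numerical sketch of the Born rule (our own toy example, with an arbitrarily chosen Gaussian wave function) makes the operational reading explicit: the probability of detecting the particle in a region is the integral of \(|\Psi|^2\) over that region.

```python
# Toy illustration of the Born rule in one dimension: |Psi|^2 is a probability
# density, so integrating it over a region gives the detection probability.
# The Gaussian wave function below is chosen purely for illustration.
import numpy as np

x = np.linspace(-10.0, 10.0, 4001)
dx = x[1] - x[0]
sigma = 1.0
psi = (np.pi * sigma**2) ** -0.25 * np.exp(-x**2 / (2 * sigma**2))

prob_density = np.abs(psi) ** 2        # Born rule: probability density |Psi|^2
total = np.sum(prob_density) * dx      # normalization over all space
region = (x > -1.0) & (x < 1.0)
p_region = np.sum(prob_density[region]) * dx

print(total)      # ~1.0
print(p_region)   # ~0.84: probability of detecting the particle in [-1, 1]
```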

Born's probabilistic interpretation has long been criticized, in particular for its lack of consistency regarding the notions of measurement and probability. But from the late 1920s it played a fundamental role in the development of a new philosophy of physics related to quantum phenomena [16, 93]. Born's position thus substantially favored an approach to the concept of particle limited to its operational determination, according to which "particles are collections of dynamic properties that may be localized independently in a particle detector" [65, p. xi]. Such an operational definition—which has proved rather influential up to the present dayFootnote 5—can be understood as one possible definition of the quantum particle concept. Ultimately, with the advent of quantum physics, the abolition of the classic antithesis of wave and particle, the inclusion of non-classical variables, and the recognition of particles as constituents of radiation led to an extension of the former classical particle concept and its field of application. Also, the operationalist approach asserted the fundamental experimental role of particles, considered as suppliers of partial information accessible by measurements. In this respect, it must be underlined that the notion of particle observation was initially not affected in itself by the new insights of quantum mechanics. Kinematics remained central to experimental methods. Rather, it was the localizability of particles that changed, being now closely tied to their observation.

However, the situation was about to change. Further developments in quantum theory, coupled with the introduction of new instruments, led scientists to reconsider their approach to matter and its interactions. Discussed in Sects. 2.3 and 2.4, they stress the role of the quantum particle concept beyond its simple operational nature. In particular, they highlight not only a new heuristic power, but also the importance of representations with the help of particles in the physicists’ thinking.

2.3 The birth of quantum electrodynamics: creation and annihilation of particles

2.3.1 Dirac’s hole theory

The solid establishment of quantum mechanics in the late 1920s actually opened an ambivalent period for physicists' approach to particles [33, 108]. On the one hand, the community was convinced that it had achieved a form of stability, illustrated by the widely held belief that there are only three elementary particles: two building blocks of the atom, the electron and the proton, and the photon, constitutive of electromagnetic radiation. This led during the 1930s to heated debates over the postulation or the discovery of new particles.Footnote 6 On the other hand, however, subsequent research in quantum theory, in particular with a view to a better understanding of the interactions of matter, constantly pushed physicists toward the need to question the validity of their model. Their views on particles therefore soon had to be deeply reconsidered in the framework of the new quantum electrodynamics.

Wave–particle duality had not exempted theoreticians from making a choice between a continuous and a discontinuous approach to matter. Therefore, as elaborated at length by Silvan Schweber, the development of quantum electrodynamics from 1927 to 1950 can be understood as "an oscillation between two viewpoints: one which takes fields as fundamental, in which particles are the quanta of the fields; and the other which takes particles as fundamental, and in which fields are macroscopic coherent states" [129, p. xii]. Schweber pointed out that the quantum field-theoretic viewpoint was richer in potentialities and possibilities than its particle counterpart throughout this period, as illustrated by its various successes, such as Fermi's theory of beta decays, Yukawa's meson theory of nuclear forces, and Pauli's spin-statistics relation (to be discussed in further developments). This powerful approach nevertheless suffered from various difficulties, which will be discussed with their solutions in Sect. 2.4.1. We first want to highlight the other side of the oscillation, the particle point of view, to which Schweber's historical account also gives much credit. Mainly embodied in Paul Dirac's hole theory, it forced physicists to reconsider their conception of matter, served as a major aid for explanation and problem-solving, and thus brought out the heuristic power of the concept of particle.

Despite its limited field of application—it only applied to fermions of spin ½—Dirac's hole theory notably helped to explain bremsstrahlung and Compton scattering, and became an effective tool to "calculate" processes in quantum electrodynamics up to energies of the order of \(137\,m{c}^{2}\). Moreover, in many respects, it "changed our whole outlook on atomic physics completely" [80, p. 49]. While Dirac's initial work on radiation and dispersion in 1927 firmly established that photons play the role of force carrier for electromagnetic interactions and introduced the physical basis for conceptualizing the notion of virtual particleFootnote 7 (see Sects. 2.3.2 and 2.4.2), this quote from Heisenberg actually refers to antimatter, postulated as a direct result of the 1928 derivation of the relativistic wave equation for electrons [55,56,57]. The hole theory that Dirac developed to account for this result then established on a solid mathematical and theoretical basis the concept of the creation and annihilation of particle pairs—which later allowed for a completely new description of the vacuum ([30]; [89], chap. 13). The fundamental issue of the creation of matter suddenly found an explanation in how radiation (quanta of the field) could convert into—and result from—matter (pairs of particles). Elementary particles thus lost part of their character as fundamental entities. Although indivisible, they could no longer be considered permanent [108].

2.3.2 Yukawa’s meson theory

Along with quantum electrodynamics, nuclear physics is the other area which, in the early 1930s, profoundly changed particle physics. The main stake was the development of a theory of nuclear forces, which found its successful expression in 1934 with Hideki Yukawa's theory of mesons [32, 154]. Although the calculations were based on field-theoretic techniques, Yukawa's descriptive approach to the phenomena at play was largely based on the narrative of a proton and a neutron interacting by emission and absorption of a heavy particle. This illustrative dimension testifies to the weight, in physicists' thinking, of the particle viewpoint, which, by analogy with the quanta of the electromagnetic field and by virtue of wave–particle duality, led to a one-to-one correspondence between the specific quantum field studied and a new particle, estimated after field quantization at about 200 electron masses. The outstanding prediction of this "heavy quantum," soon to be named meson and known today as the pion, then challenged the commonly shared particle concept in many respects [32, 108].

Primarily, Yukawa's meson is a short-lived particle in the nucleus which briefly violates energy conservation within the limits given by the Heisenberg uncertainty relations—in other terms, it is produced off-shell—and therefore cannot be observed directly. In this sense, the pion was the first particle to be initially postulated as virtual. Also, since its creation and subsequent annihilation in the nucleus provided an explanation for the mechanism of the nuclear force, it secured the idea, already introduced in 1932 by Heisenberg, that massive particles can act as force carriers [79]. This point marked yet another generalization of the role played by particles in modern physics. In addition to building blocks of matter and constituents of radiation, they were now recognized as mediators of interactions. In this framework, one of the pioneering aspects of Yukawa's theory was his postulate that the force range is inversely proportional to the mass of the particle, which drew the attention of physicists to the field of high-energy physics, to the need to access higher energies in order to study smaller scales [32].
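Yukawa's range–mass argument can be made quantitative with one line of arithmetic (our own illustration, using a typical nuclear force range as input): a range \(r \approx \hslash /(mc)\) of about 1.4 fm corresponds to a mediator mass of roughly 140 MeV/\({c}^{2}\), close to the pion mass.

```python
# Order-of-magnitude version of Yukawa's argument: range ~ hbar / (m c),
# so m c^2 ~ (hbar c) / range. The 1.4 fm range is an illustrative input.
hbar_c   = 197.327   # MeV fm
range_fm = 1.4       # assumed range of the nuclear force, in fm

print(hbar_c / range_fm)   # ~141 MeV/c^2, of the order of the pion mass
```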

Finally, the success of the meson theory reinforced Enrico Fermi's idea that elementary particles can decay. In 1933, the Italian physicist had explained \(\beta \)-decay radiation in close analogy with quantum electrodynamics [68]. Relying on the notion of creation and annihilation of quanta, as well as on Pauli's postulate of the neutrino in 1930, he had ruled out the idea that electrons preexist inside the nucleus and introduced the idea that neutrons, and therefore elementary particles, can decay, i.e., that they can be unstable. To summarize, in the words of Jaume Navarro: "The classical concept of 'elementary' particles was gradually replaced by that of 'fundamental' particles, their role in the structure of matter no longer being of a simple mechanistic type" [108, p. 454].

2.3.3 Particle physics as an experimental field: observing tracks

A correct account of the evolution of the concept of particle in the period 1930–1949 cannot do without considerations of experimental physics, which provided essential results to establish and confirm the theoretical models mentioned above ([89], chap. 13; [108]). In this, the particle concept unfolded its heuristic power in numerous discoveries. Remarkably, thanks to a series of experiments consisting in bombarding light elements with \(\alpha \)-particles, the discovery by James Chadwick of the neutron in 1932 greatly impacted nuclear physics [38]. Of particular importance was also the field of cosmic ray physics, which in the 1930s became the paradigmatic scheme for the experimental study of elementary particles. While physicists since the 1910s knew they could observe tracks left by individual stable particles in cloud chambers, the demonstration at the end of the 1920s that cosmic rays were made of high-energy particles brought this tool to the field and confirmed its role as a true particle detector [19]. Accepting tracks as a new means of observing particles was a necessary prerequisite for the discovery of the positron by Carl Anderson in 1932, and of the creation of electron–positron pairs by Patrick Blackett and Giuseppe Occhialini in 1933 [4, 23]. Both cases thus favored the reception and development in the 1930s of Dirac’s hole theory.

In 1936, Seth Neddermeyer and Anderson also discovered the muon in cosmic rays [109]. Although this particle was not tied to any theoretical scheme, its observation was no less impactful than the previous ones. On the one hand, due to its mass, the muon was quickly believed to be the meson postulated by Yukawa, which, having received sufficient energy, would have been emitted by the nucleus—in other words, produced on-shell.Footnote 8 On many points, this error turned out to be beneficial since it shed light on Yukawa's work, which until then had been largely ignored by Western physicists [32]. On the other hand, the muon's inherent instability—its rapid decay into electrons—was directly observed around 1940 through tracks in a cloud chamber by E. J. Williams and G. E. Roberts [105, 151]. This first observation of the spontaneous decay of fundamental particles thus opened new perspectives and led physicists to broaden their criteria for new discoveries [140]. Therefore, in 1947, while Cesar Lattes, Giuseppe Occhialini, and Cecil Powell discovered the charged pion thanks to the tracks it left on emulsion plates, Georges Rochester and Clifford Butler only needed to observe the tracks of decay products in a cloud chamber to infer the existence of the neutral kaon, which itself does not leave a track in the chamber due to its vanishing electric charge [94, 122]. With such "V-events"—so named because of the shape of the observed tracks—the notion of particle observation was extended once again, from visible to invisible tracks, by employing energy–momentum and charge conservation.

The 1930s and 1940s were a period of great fertility for the concept of particle, which helped to profoundly enrich the theories of quantum electrodynamics and nuclear physics. Also, its experimental application, especially in cosmic ray studies, brought successes that gradually contributed to establishing particle physics as an independent field [33]. This period was mainly characterized by major changes in the way physicists thought about matter. In particular, it saw the demise of the immutable character of elementary particles—in their classical understanding—which was replaced by the notions of creation, annihilation, and decay. This revolutionary step then opened new perspectives for the application of the concept of particle and led physicists to reconsider those of particle observation and discovery. Indeed, while the cloud chamber, in its basic principles as a particle detector, relied strongly on the kinematic approach developed at the turn of the nineteenth and twentieth centuries—magnetic fields are applied to observe the deflection of charged particles—the interpretation of tracks, for its part, was pushed further thanks to the extension of the particle concept within the framework of the theoretical developments of quantum electrodynamics. Therefore, despite the different challenges raised by wave–particle duality and observational methods, the ability of the concept of particle to guide physicists toward the prediction and explanation of different phenomena ensured its relevance to modern physics practices. This was soon confirmed by further developments in quantum field theory, which are discussed in the next section.

2.4 Quantum field theory: toward particles as resonances

2.4.1 Infinities and renormalization

Although Dirac's hole theory gave rise to an entirely new perspective on particle physics, it also suffered from various flaws that eventually led to its abandonment. In particular, the initial postulate of the Dirac sea, which views the vacuum as an infinite "sea" of filled negative-energy states and interprets potential "holes" as positrons, was not well received by the community. The field viewpoint in quantum electrodynamics then made it possible in the 1930s to reformulate this model, recovering all its valid predictions. In particular, the quantization of the charged Klein–Fock–Gordon field by Pauli and Victor Weisskopf in 1934 demonstrated the possibility of pair production without the Dirac sea [113]. It was in this theoretical framework that Yukawa developed his meson theory, but also that in 1940 Pauli elaborated his spin-statistics theorem, namely that particles of zero or integer spin obey Bose statistics, while particles of half-odd-integer spin obey Fermi statistics [112]. Nevertheless, the development of quantum electrodynamics, as well as its extension to a broader class of fields, namely quantum field theory, had to face several difficulties from its beginnings—as was also the case for the particle viewpoint, in fact. The local coupling of the charge current density to the electromagnetic field was responsible for divergent integrals and thus undefined results in calculations dealing with the self-energy of the electron and vacuum polarization. Locality here resulted from the use, in the conceptual framework of quantum field theory, of the point-like particle model, initially introduced in quantum electrodynamics to describe the electron. In this model, particles are assumed to have no substructure and no spatial extension [129, pp. 85–88].

The long-standing problems raised by such divergences led to an attempt by Heisenberg in the 1940s to replace quantum field theory with a less microscopic approach, which avoided spatiotemporal considerations and replaced them with abstract mathematical properties of the S-matrix [47, 85]. In this program, the so-called S-matrix theory, particles lose their role as fundamental entities. Geoffrey Chew, one of its main proponents in further developments, even explicitly called on several occasions for the abandonment of the concept of elementary particle [39, 40]. Nevertheless, in the meantime, major theoretical contributions to quantum field theory had superseded Heisenberg's initial proposal. In the late 1940s, four leading physicists stood out: Shin'ichirō Tomonaga, Julian Schwinger, Richard Feynman, and Freeman Dyson [129]. The techniques they developed, named renormalization, aimed to isolate and discard infinities by replacing them with finite measured values. Together, they thus provided covariant and gauge-invariant formulations of quantum field theory, which allowed the computation of observables in principle at any order of perturbation theory. This approach immediately met with success, in particular thanks to the explanation it provided of the electron's anomalous magnetic moment and of vacuum polarization. Subsequently, quantum field theory became the framework underpinning the development of the Standard Model.

2.4.2 Feynman diagrams: virtual particles

In line with Schweber and the oscillation between two viewpoints he put forward, it should, however, be underlined that Tomonaga, Schwinger, and Dyson were all field theorists who favored an operator-based approach, while Feynman, for his part, can be considered a particle theorist [129].Footnote 9 Particles were building blocks for him, and his well-known diagrams, which complemented the theoretical framework of quantum field theory, were initially aimed at developing a space–time approach to quantum electrodynamics when first published in 1949 [69]. Like Rutherford and Born before him, Feynman was working in the context of scattering theories. His two-dimensional diagrams—three-dimensional space is projected onto the horizontal axis and time onto the vertical axisFootnote 10—thus schematically describe physical interactions as a sequence of particle creations and annihilations [153].

Feynman's method provided the first systematized and generalized, but also visual, description of the concept of virtual particle. In 1949, interactions were described as resulting from the exchange of such entities, represented by internal lines in a diagram. This has led to an understanding of virtual particles which occupies a very particular position compared to what is usually described under the concept of particle. According to Falkenburg, although they can be defined by their discrete nature as well as by their properties of mass, energy, charge, and spin, they do not satisfy the energy–momentum relation, i.e., they may be off-shell, and they are not independent—they belong to interactions. Therefore, being unobservable since they cannot be localized by a particle detector, virtual particles do not fit the operational dimension traditionally attributed to particles [65, pp. 233–238]. In contrast, however, they still play a major theoretical role in explaining various phenomena and are today essential parts of the conceptual apparatus of the theories of the weak, strong, and electromagnetic interactions.

Indeed, Feynman diagrams, which initially aimed to simplify the calculations used to describe the dynamics of relativistic quantum systems, commonly provide the calculations for such theories. Mathematical terms can be attached to each element of a Feynman diagram, translating the visual representation of an interaction process into an equation that provides its probability amplitude. Easy to apprehend intuitively, this illustrative, innovative, and elegant approach has thus provided a precise method for calculating physical processes in principle at any order in quantum field theory. Even more, Feynman diagrams have established themselves as a stand-alone theoretical device which is not simply applied to quantum field theory but encodes it, albeit only in the perturbative limit [77]. Soon after their introduction, they gained popularity, including in the fields of nuclear, particle, and solid-state physics, and became standard tools in modern physics [85, 153]. Their suggestive graphical notation, which brings a significant pragmatic simplification of calculations, then highly popularized the view of interactions as proceeding through particle exchange or the creation of intermediate particles. It cemented the physicists' way of arguing, communicating, and also thinking in terms of particles. Building on this success, the relevance of the particle concept for scientific practice was therefore reinforced during the second half of the twentieth century.

2.4.3 The age of accelerators and the development of the Standard Model

Beyond our developments in Sects. 2.3.2 and 2.3.3, Yukawa’s prediction of a new particle in the mid-1930s, followed by the subsequent discoveries of the muon and the pion, marked the opening of a remarkable phase of particle discoveries that extended to the observation of the Higgs boson. The pion and the muon can actually be considered paradigmatic for this phase in the sense that one of them was an expected, the other an accidental discovery.Footnote 11 Indeed, this scheme continued for several decades. Accidental discoveries revealed theoretical structures which implied further discoveries, possibly at higher energies. This motivated new experiments, which in turn allowed for further accidental discoveries, and so on.

In view of this successful interplay between theory development and discoveries, along with the contributions to quantum field theory discussed above, the end of the 1940s was also characterized by an important change in the conception of experimental particle physics. In spite of the various achievements in the field of cosmic ray studies, the natural possibilities it offered soon proved insufficient. Pursuing the goal of mastering particle interactions and achieving higher energies, physicists thus began to invest more and more effort in developing more sophisticated detectors—such as the bubble chamber invented in 1952 by Donald Glaser or the multi-wire proportional chamber designed in 1968 by Georges Charpak—and particle accelerators [64, 99]. The latter, in which high-speed particle beams are projected onto targets or collided with one another, emphasized even more the role of scattering experiments in particle physics practice. They also led to the next major rethinking of particle observation techniques. From the 1930s, important theoretical efforts in nuclear physics and quantum electrodynamics had been devoted to the study of the analytic properties of scattering amplitudes of many-body systems [24]. They helped to establish that anomalously large scattering cross sections result from resonance effects, which were later understood to be caused by the creation of on-shell intermediate particles adding to the cross sections of the particles in the collision. The search for resonances—recognizable as peaks when the measured cross section is plotted against the total energy of the colliding particles—then became the standard method for observing new particles in accelerators. In that context, Feynman diagrams thus provided the theoretical support for the increasingly complex calculations linked to the physics probed by these machines.
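As a schematic illustration of what such a resonance search looks like (our own sketch, using a relativistic Breit–Wigner line shape with mass and width values loosely inspired by the Z boson, and no real data), the cross section peaks when the collision energy matches the mass of the intermediate particle:

```python
# Toy resonance scan: a relativistic Breit-Wigner line shape peaks at the mass
# of the on-shell intermediate particle. Mass and width values are illustrative.
import numpy as np

M, Gamma = 91.2, 2.5                  # resonance mass and width in GeV (illustrative)
E = np.linspace(80.0, 100.0, 201)     # scanned center-of-mass energies in GeV

def breit_wigner(E, M, Gamma):
    """Relativistic Breit-Wigner shape, arbitrary overall normalization."""
    return 1.0 / ((E**2 - M**2) ** 2 + M**2 * Gamma**2)

sigma = breit_wigner(E, M, Gamma)
print(E[np.argmax(sigma)])            # ~91.2 GeV: the peak sits at the mass M
```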

The first particle ever discovered in an accelerator was the neutral pion in 1950 [25]. From this stage onwards, particle physics developed rapidly on a large scale. In particular, the 1950s were marked by the discovery of dozens of hadrons, which were initially considered to be elementary particles in their own right. The confusion caused by this "particle zoo"—so named because of its extent and variety—then prompted physicists to believe in the existence of smaller constituents of matter.Footnote 12 It led to several theoretical predictions (gluon, quark, tau, Higgs, W and Z bosons) that made it possible to establish the so-called Standard Model, whose commonly accepted and widely disseminated illustration today (Fig. 2) is paradigmatic of modern physicists' thinking largely governed by the particle concept [83]. Indeed, the bosons, as force carriers, and the fermions—denominated by their flavors, a notion introduced in the 1960s to distinguish certain classes of particles whose properties are similar—as generations of matter, are all equally qualified as elementary particles and theoretically represented as such. The current consistent formulation of this model was finalized in the mid-1970s with the confirmation of the existence of quarks thanks to deep inelastic scattering experiments—a process developed as an extension of the Rutherford scattering method to higher energies in order to probe the substructure of hadrons [117]. The Standard Model has since been the subject of a quest toward its full validation (see Sect. 3). The discoveries of the W and Z bosons in the early 1980s, the top quark in the mid-1990s, and the Higgs boson in 2012 were thus among the ultimate experimental challenges of modern particle physics [9, 36, 43, 48, 138, 139, 144, 145]. With this, the time of accidental discoveries of fundamental particles was facing its end, which is all the more remarkable as the search for these heavy Standard Model particles led to the construction of very powerful new accelerators and experiments. They required increased technical, human, and financial resources that today raise the question of our capacity to reach ever-higher energies to discover new particles.

Fig. 2 Standard Model of elementary particles (Wikimedia Commons/Fermilab, Office of Science, United States Department of Energy, Particle Data Group)

2.4.4 Notes on some limitations of the particle concept in quantum field theory

If the preceding sections have shown how important a role the concept of particle played for our understanding of matter and its interactions, it must be pointed out that this is all the more remarkable as an interpretation of quantum field theory in terms of particles as fundamental entities seems strictly possible only for non-interacting, i.e., “trivial” theories [71]. The particle approach to the theory remains only approximately valid in the case of a small interaction. The interaction terms, understood as introducing transitions between excited modes of harmonic oscillators, can then be interpreted as scattering effects between particles [148]. In the perturbative regime, where the field operators and states of the interacting theory can be approximated in terms of those of the free theory, the particle concept can be sustained to some extent, which permits the Feynman diagrammatic visualization of the theory in this case. While this situation partly explains why the concept of the particle was able to remain so fruitful during the second half of the last century, it also clearly highlights its limits, some of which have already manifested themselves in the past.

In this sense, quantum field theory, while originally emerging from the necessity to describe relativistic elementary particles, really is a theory of fundamental fields which also incorporates physics that cannot be described by particles. Paradigmatic examples are collective field excitations such as topological defects and instantons. But even in collider physics, non-perturbative and thus non-particle effects of quantum field theory are well known. For example, at low energies, hadronic interactions are too strong to allow for a perturbative treatment in quantum field theory, and thus the Feynman diagrammatic viewpoint of “exchanging particles” becomes invalid. Other means of calculation are required in order to make quantitative predictions for observables. In the 1960s, this very difficulty allowed the S-matrix program to become particularly influential, before it was largely abandoned in the 1970s, when quantum chromodynamics was recognized to solve the problems of the strong interaction within the framework of field theory [47]. More precisely, today’s standard approach in the low-energy regime of strong interactions is lattice gauge theory [46, 152]. It discretizes space–time into a four-dimensional lattice and assigns numerical values of the quark and gluon fields to the sites and links of the lattice, which defines a so-called field configuration. This allows for a numerical evaluation of the path integral, which corresponds to an average over a large number of field configurations. The particle aspect of quarks and gluons is completely irrelevant in this theoretical approach. Phenomenologically, this is manifested by color confinement, i.e., by the fact that quarks or gluons cannot be observed as isolated particles.Footnote 13 Rather, it is a non-perturbative combination of the quark and gluon fields which gives rise to particles, the so-called hadrons, often described as bound states of quarks and gluons which do not carry any overall strong charge (color). Nevertheless, experimentally probing hadrons with, say, high-energy electrons—in a process conceptually similar to Rutherford scattering—does allow for a perturbative description and an interpretation of the observed phenomena in terms of a particle picture of quarks and gluons. Historically, it was such deep inelastic scattering data which first led to the discovery of the constituents of the proton and their subsequent interpretation in terms of quarks. The discovery of the gluon required the interpretation of “jets” in high-energy collisions as originating from the initial production of quarks and gluons [87].
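To make the lattice procedure slightly more concrete, the expectation value of an observable \(O\) is, schematically (in the Euclidean formulation and suppressing the details of the fermionic integration), computed as

$$\langle O\rangle =\frac{1}{Z}\int \mathcal{D}U\,O\left[U\right]\,{e}^{-S\left[U\right]}\approx \frac{1}{N}\sum_{i=1}^{N}O\left[{U}_{i}\right],$$

where the \({U}_{i}\) are field configurations generated numerically with a probability weight proportional to \({e}^{-S\left[U\right]}\), and \(S\) is the (effective) lattice action. The notion of particle appears nowhere in this prescription.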

Yet another illustration of the difficulty of a particle interpretation in an interacting theory is provided by the problems physicists faced in the search for a potentially heavy Higgs boson. In the Standard Model, the Higgs mass \({M}_{H}\) is a free parameter and needs to be determined by a measurement. Consequently, also the width \({\Gamma }_{H}\) of the Higgs (its inverse lifetime) was not known before its discovery, because it depends on its kinematically possible decay modes. For a Higgs mass \({M}_{H}\sim 1\) TeV/\({c}^{2}\) or larger, the Standard Model predicted \({\Gamma }_{H}\sim {M}_{H}\), in which case it would have become questionable to identify a peaked signal on top of the experimental background (see, e.g., [121]). Eventually, the Higgs mass was found to be well below that value, and the Higgs was discovered by a classical identification of a peak structure in the spectrum of the invariant mass of its decay products.
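The origin of this behavior can be made plausible by the tree-level width of a heavy Standard Model Higgs boson into longitudinally polarized gauge bosons, which grows with the third power of the mass; in the heavy-Higgs limit one finds, roughly (in units where \(\hslash =c=1\)),

$$\Gamma \left(H\to WW+ZZ\right)\approx \frac{3{G}_{F}{M}_{H}^{3}}{16\sqrt{2}\pi }\approx 0.5\,\mathrm{TeV}\times {\left(\frac{{M}_{H}}{1\,\mathrm{TeV}}\right)}^{3},$$

where \({G}_{F}\) is the Fermi constant. A “resonance” whose width is comparable to its mass no longer produces a localized bump in any spectrum.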

These episodes highlight the fact that discoveries in fundamental physics may require taking into account aspects of quantum field theory that go beyond its association with particles. In fact, it is notable that the S-matrix program seems to enjoy a renaissance with the development of modern amplitude methods (see, e.g., [104]), albeit mostly in the form of a calculational tool and focused on idealized theories which are rather detached from phenomenology.Footnote 14 In contrast, as a conclusion on the last 70 years, it should be stressed that the concept of particle, bolstered by its operational dimension, has so far proved its capacity to bring together theoretical and phenomenological approaches. In the age of particle colliders, it was thus a rather new understanding of particles as mediators of interactions that was brought to the fore and fully exploited by physicists. The junction of theoretical and experimental efforts then allowed the identification of particles as resonances, offering the community a new means of observation.

2.5 Intermediate conclusion

While the history of physics has been characterized by a lingering tension between discrete (particle) and continuous (field) approaches to matter, it becomes clear from the previous sections that our current understanding of nature still relies heavily on explanations based on the concept of particle. Originally, this is not due to some specific form of observation which may have been targeted at particles. In fact, neither Dalton, nor Thomson, nor Planck, nor Einstein actually observed individual particles. They used specific properties which, by that time, could be associated exclusively with particles, to draw the conclusions that matter is made of atoms, cathode rays are made of electrons, and electromagnetic radiation is made of photons. In other words, they made use of the heuristic power of the particle concept. Later, although challenged by wave–particle duality and by Born’s interpretation of quantum mechanics—which tended to minimize the purely theoretical role of the particle concept in favor of its operationalist dimension—the particle concept continued to play a major formative role in quantum theory and its developments. On the one hand, as episodes such as Yukawa’s descriptive approach to nuclear forces or the development of Feynman diagrams show, it stood out for its illustrative value. As another example, not least due to its classical analog of, say, planets moving in the gravitational field of the sun, the narrative of an electron “moving” around the proton on discretely distributed “orbits” is much more common than the alternative account, where the electron wave function in an energy eigenstate forms a standing wave along a circle around the proton. Representations based on discrete orbits are frequent in atomic and molecular physics. On the other hand, subsequent theoretical developments in quantum electrodynamics have in fact continued to prove the theoretical fertility of the particle concept. Not only did it make it possible to carry out increasingly complex calculations, but it also explained various phenomena. This body of results was then secured and enriched by experimental observations, such as tracks in cloud chambers and resonances in accelerators. In this context, the great success met by the application of Feynman diagrams in the postwar period, as part of the development of a rather field-theoretic model, also stands as paradigmatic proof of the major importance of the particle concept in modern physics.

To sum up, the persistence of the highly formative role played by the particle concept in natural philosophy and physics, in other words its powerful heuristic, illustrative, and—in elementary particle physics—operational values, characterizes what could be defined here as the particle era. Drawing a clear line marking the beginning of such a period is not an easy task, since one can speak of an era of the particle concept in different senses: in the comprehensive sense, which goes back to antiquity, in the restricted historical sense, which begins with the early modern period, and in the sense of the emergence and successes of today’s elementary particle physics. The last sense can be considered to have begun roughly with the first experimental evidence for a discrete character of matter and the applications of the atomic hypothesis in chemistry—the discovery of subatomic structures at the end of the nineteenth and the beginning of the twentieth centuries represents a real turning point. In this sense, Thomson’s “inescapable”Footnote 15 hypothesis of the electron serves as a symbol for the opening of a period in which the concept of particle fully developed its usefulness for research into the very foundations of physics.

The previous sections also sketched a very diverse and progressive path of development concerning the particle properties relevant to observation or discovery. As the particle concept was enriched by theoretical and experimental achievements, the phenomenological perspectives were broadened, giving rise to new interpretations of different events in terms of particles. With the establishment of the concept of the classical particle, properties such as the mass-to-charge ratio allowed stable or at least sufficiently long-lived (quasi-stable) particles, such as electrons, nucleons, or nuclei like α-particles, to be identified in large collections, be it as constituents of matter, as beams, or in some other form. Their properties could be studied directly, for example through their emission, absorption, or reflection of light, or by probing them with beams of other particles. With quantum theory, the field of application of particles was progressively enlarged. From building blocks of matter, they also became constituents of radiation and mediators of interactions. New properties were also attributed to particles. In particular, their mutability—i.e., their capacity to be created, annihilated, and to decay—became central in the 1930s for the interpretation of “tracks” in the context of the reconfiguration of experimental practices with the help of cloud chambers. Observing new particles through their decay products even became the norm for the next generation of discoveries, which saw physicists shift from studying natural phenomena such as cosmic rays to artificial production in particle accelerators. Meanwhile, the theoretical understanding of particle interactions had progressed to such an extent that particles produced on-shell could be put in one-to-one correspondence with resonances (peaks) in scattering cross sections. Therefore, the focus of investigation moved from the individual patterns of tracks to the search for peaks, i.e., an increased frequency of the occurrence of certain final states relative to the background. This “bump-hunting” has remained the canonical way of searching for new particles to this day. Our historical account thus underlines how remarkable it is that each major phase of development in the particle era was accompanied by a genuine reconfiguration of the concept of particle, which in turn, as new properties emerged, modified our perspective on the notion of particle observation.

3 The Standard Model and the mass scale of new physics

To date, the Standard Model is the last link in a chain of developments that has seen particle physics practices structured around the various conceptual evolutions discussed up to now. It is considered our best theory for describing elementary particles and their interactions. Overall, it can be understood as a generalization of quantum electrodynamics, in the sense that it extends the structure of interactions from Abelian to non-Abelian gauge theories. In addition to the photon, which mediates the electromagnetic interaction, the Standard Model incorporates the \(W\) and the \(Z\) bosons, giving rise to the (electro-)weak interaction, as well as the gluons for the strong interaction. However, it can in no way be considered a “final” theory, in particular because it does not take into account the fourth fundamental interaction, gravity. In this sense, it is expected that “new physics” still has to be discovered and that the historical evolution described so far will be extended in some way. A first glimpse of potential forthcoming discoveries can even be provided by an analysis of the Standard Model’s limits and its proposed extensions. Thus, this section discusses a number of theoretical and experimental considerations which could give hints about this new physics. This will lay the foundations for the subsequent discussion of the future of the notions of particle observation and discovery, and consequently allow us to extend our previous reflections on the particle era.

From today’s understanding, any new physics, i.e., fundamental structures not described by the SM, will be related to a mass scale \({M}_{BSM}\), where “BSM” stands for Beyond the Standard Model. According to (perturbative) quantum field theory, this new physics should be associated with new particles which have masses around \({M}_{BSM}\). Their observation, which from our current notion of this term implies on-shell production, would thus require energies \(E>{M}_{BSM}{c}^{2}\). For the continuation of on-shell discoveries, it is therefore crucial that the mass scale of new physics is within the reach of conceivable particle colliders, which, following the rather optimistic estimate given in the Introduction, is limited to the order of 10 PeV. We will nevertheless show in the following that currently there are no compelling reasons to expect the mass scale \({M}_{BSM}\) to be significantly below the Planck mass \({M}_{P}\), which means that there is the possibility that on-shell discoveries already belong to the past. We hasten to add that these considerations certainly do not exclude the possibility of physics at accessible mass scales. As outlined in the Introduction, new physics may also be associated with relatively light particles, which would then have to couple very weakly to the known particle spectrum. But the fact that no new physics is required by theoretical and experimental considerations up to very large mass scales is quite unique in the history of fundamental physics.
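To give a sense of the scales involved (using the standard value of the Planck mass, \({M}_{P}{c}^{2}\approx 1.2\times {10}^{19}\) GeV), even this optimistic collider limit falls short of the Planck scale by roughly twelve orders of magnitude:

$$\frac{{M}_{P}{c}^{2}}{10\,\mathrm{PeV}}\approx \frac{1.2\times {10}^{19}\,\mathrm{GeV}}{{10}^{7}\,\mathrm{GeV}}\approx {10}^{12}.$$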

3.1 Precision tests and theoretical consistency

The Standard Model depends on 19 parameters describing the properties of the fundamental particles/fields, whose numerical values have to be determined by measurement: three gauge couplings, the masses of the three charged leptons and the six quarks, the Higgs mass and its self-coupling, four quark mixing parameters, and the \(\theta \)-parameter which governs CP violation in the strong sector (neutrino masses are assumed to vanish in the Standard Model, see below). Once fixed, these parameters allow one to make theoretical predictions for other observables. Comparison with experimental data then provides internal consistency checks of the Standard Model. The program of such precision tests has been pursued particularly intensively since the 1990s with the start of LEP, an \({e}^{+}{e}^{-}\)-collider at CERN which focused on the physics of the electro-weak gauge bosons \(W\) and \(Z\) [8]. For the fermions, this was complemented by the so-called B-factories, which operated at significantly lower energies and explored the flavor sector of the SM, with a focus on the quark mixing parameters. Notable examples are the BaBar and Belle experiments, located at the SLAC National Accelerator Laboratory and at the High Energy Accelerator Research Organisation (KEK) in Tsukuba, respectively, which operated in the 2000s [20]. Such measurements continue to this day at the LHC, which supplements them with the exploration of the Higgs sector, and would be further enhanced by the projected construction of an electron–positron collider operating as a Higgs factory.

Two of the quintessential plots from these measurements are displayed in Fig. 3. The left plot shows the electro-weak fit, i.e., how well the theoretical expectations derived from the Standard Model fit the electro-weak precision data. The right plot shows the unitarity triangle, which is a graphical representation of a set of numbers called the Cabibbo–Kobayashi–Maskawa matrix, providing information on the strength of the flavor-changing weak interaction. Since no significant tension is observed when over-constraining the Standard Model parameters in this way, no hint of BSM physics can be derived from these plots.Footnote 16

Fig. 3 The electro-weak fit [137] and the unitarity triangle [42]

Recall, as mentioned in the Introduction, that physics at all mass scales \(M\) has an impact on measurements at a particular energy \(E\). Therefore, precision tests not only investigate the internal consistency of the Standard Model, but are also affected by BSM physics at the mass scale \({M}_{BSM}\). However, since these effects are parametrically suppressed by some power of \(E/{M}_{BSM}\), the sensitivity is highly correlated with the experimental uncertainty. Even in very specific BSM models, such as the Minimal Supersymmetric Standard Model (MSSM), the current lower limits on the new mass scale \({M}_{BSM}\) therefore barely exceed the few-TeV range (see, e.g., [12, 13]).
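For illustration (and only as an order-of-magnitude guide), the leading virtual effect of a heavy new particle typically shifts an observable measured at energy \(E\) by a relative amount

$$\delta \sim {\left(\frac{E}{{M}_{BSM}{c}^{2}}\right)}^{2},$$

so that probing \({M}_{BSM}\sim 10\) TeV/\({c}^{2}\) in a measurement at \(E\sim 1\) TeV requires control of theoretical and experimental uncertainties at the percent level, and \({M}_{BSM}\sim 100\) TeV/\({c}^{2}\) at the level of \({10}^{-4}\).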

However, aside from these phenomenological tests of the Standard Model, information about the mass scale of new physics can also be obtained from theoretical considerations. In quantum physics, theoretical expectations are given in terms of probabilities for the outcome of a certain measurement. The probabilities for all possible outcomes of a measurement need to add up to 100%. This so-called unitarity condition implied a “no-lose theorem” for the LHC, which affirmed the need for new experimental discoveries to preserve the unitarity of the Standard Model in the accessible energy range (see, e.g., [78]). The Higgs boson was the simplest option for such a discovery, because it introduces only one additional parameter, the Higgs boson mass \({M}_{H}\). The larger the Higgs mass, the lower the energy at which unitarity violation occurs. This provided an upper limit for the Higgs boson mass of \({M}_{H}\le 1\) TeV/\({c}^{2}\). At the observed Higgs mass of \({M}_{H}= 125 \mathrm{GeV}/{c}^{2}\), the Standard Model preserves unitarity up to energies which are beyond the collision energies of any conceivable particle collider. This means that, for this particular value of the Higgs mass, the unitarity requirement does not provide us with any hints for the mass scale of physics beyond the Standard Model which could be reached by a current or future collider.
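The upper limit quoted above can be made plausible by the classic partial-wave unitarity estimate for the scattering of longitudinally polarized \(W\) bosons, which gives, roughly (in natural units),

$${M}_{H}^{2}\lesssim \frac{8\sqrt{2}\pi }{3{G}_{F}}\approx {\left(1\,\mathrm{TeV}\right)}^{2},$$

where \({G}_{F}\) is the Fermi constant; without a Higgs boson below this scale (or some other new dynamics taking its role), the scattering probabilities computed in perturbation theory would exceed unity.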

Another possible theoretical indication for new physics could come from the stability of the vacuum [135]. In the classical approximation, the vacuum, i.e., the ground state of the Standard Model, is reached for a constant nonzero value of the Higgs field of around 246 GeV. However, if one includes quantum effects, it turns out that this only corresponds to a local minimum of the energy, while the global minimum is at a different value of the Higgs field. This makes the current vacuum unstable. Sooner or later, it will decay, and our universe will tunnel from the local to the global minimum of the energy. This means that the world as we know it would disappear. The tunneling rate for this transition depends again on the value of the Higgs boson mass. If it were around 100 GeV/\({c}^{2}\) or less, the universe would have decayed long ago. This would tell us that new physics at a mass scale \({M}_{BSM}\sim 1\) TeV/\({c}^{2}\) exists which stabilizes the vacuum. For the observed value of the Higgs mass of \({M}_{H}= 125 \mathrm{GeV}/{c}^{2}\), however, even without new physics the predicted lifetime of the universe exceeds its current age by many orders of magnitude. Therefore, the Standard Model does not provide any hint for the mass scale of new physics from the requirement of vacuum stability.

To summarize, the high-precision tests and the self-contained theoretical structure of the Standard Model make it rather difficult to modify. However, we know that the description of nature given by this theory is incomplete, because there are a number of phenomena which it cannot explain. The most important ones and their implications for the mass scale of new physics will be discussed in the following section.

3.2 Shortcomings of the Standard Model

Gravity

One of the most striking shortcomings of the Standard Model is that it incorporates only three of the four known fundamental interactions. After all, physicists have always aspired toward a Final Theory, or Theory of Everything, which would also include gravity [95]. For more than thirty years after introducing the theory of general relativity, Einstein presented himself as a paragon of this quest for unification, which drove many developments in modern physics [58]. To date, however, there exists no consistent four-dimensional quantum formulation of gravity.Footnote 17 Any unified description of the four fundamental interactions thus seems to require a theoretical framework which is beyond regular quantum field theory and would imply new physics at energies of the order of the Planck mass, i.e., \(E\approx {M}_{P}{c}^{2}\). If this scale is related to new particles, their mass is certainly too large to produce them on-shell at any conceivable particle collider. On the other hand, while gravitons, the quanta of the gravitational interaction, are predicted to be massless, their coupling to Standard Model particles is suppressed by \(E/{M}_{P}{c}^{2}\), where \(E\) is the collision energy. This is too small for gravitons to be identified as individual particles in any conceivable detector [125].

For many years, the most popular candidate for a unified description of all fundamental forces has been string theory, which adopts extended objects (strings or branes) as the elementary constituents of nature [106]. This avoids some of the most severe divergences induced by point-like particles in quantum field theory (see Sect. 2.4.1). The spatial extension of the strings, however, is of the order of the Planck length \({l}_{P}=\hslash /\left({M}_{P}c\right)\) and thus unresolvable at accessible energies. From an experimental point of view, strings would be identified as particles. Different particle types correspond to different vibrational states, one of which would be the graviton. Despite intense theoretical research, no phenomenologically viable and testable models based on string theory have emerged up to now.


Dark energy

According to a wide range of observational data, the universe is currently undergoing an accelerated expansion. Theoretically, this effect could be accounted for by introducing a form of energy which possesses negative pressure, usually termed “dark energy” [116]. Its simplest implementation is through a cosmological constant Λ in Einstein’s gravitational field equations. The fundamental nature of dark energy and its coupling to the Standard Model particles is currently unknown.


Dark matter

Both astrophysical and cosmological data are in stark contradiction with the combined assumption that (a) all matter in the universe is described by the particles of the Standard Model, and (b) their gravitational interaction is governed by general relativity. Thus, either Einstein’s field equations have to be modified, or a new type of matter has to be introduced which does not interact (or hardly interacts) with light, which is why it was generically termed “dark” [15]. Current observational data cannot distinguish between these two options. Even though a number of theoretical models which account for dark matter exist, the phenomenological consequences of these models are at the moment too diverse to guarantee the discovery of a dark matter particle at particle colliders.


Neutrino masses

Formally, the Standard Model assumes vanishing neutrino masses. With the experimental confirmation of neutrino oscillations, however, this assumption has been falsified [149]. It is possible to include the neutrino masses in the Standard Model in the same way as for the other fermions. This would require introducing right-handed neutrinos and coupling them to the Higgs field. But the fact that neutrinos could be their own antiparticles, as hypothesized by Ettore Majorana in 1937, would imply additional terms [100]. Currently, it is unclear which of the two options is realized in nature, or whether it is a mixture of both.

It is clear, though, that the neutrino masses are several orders of magnitude smaller than the masses of other fermions. This has inspired ideas for neutrino mass generation via BSM physics which could explain this hierarchy, most prominently the so-called seesaw mechanism. A rough estimate of the mass scale related to the BSM physics is given by the square of the electro-weak mass scale O(100 GeV), divided by the neutrino mass. This leads to \({M}_{BSM}\sim {10}^{13}\) GeV/\({c}^{2}\), i.e., well beyond the reach of a conceivable particle collider (see, e.g., [132]).
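Taking, purely for illustration, a neutrino mass of order 1 eV/\({c}^{2}\), this estimate reads

$${M}_{BSM}{c}^{2}\sim \frac{{\left(100\,\mathrm{GeV}\right)}^{2}}{{m}_{\nu }{c}^{2}}\approx \frac{{10}^{4}\,{\mathrm{GeV}}^{2}}{{10}^{-9}\,\mathrm{GeV}}={10}^{13}\,\mathrm{GeV};$$

smaller neutrino masses push the scale even higher.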


CP violation

It can be shown that a theory which describes fundamental interactions must not be invariant under the simultaneous inversion of charge (C) and parity (P) in order to be able to account for the baryon–antibaryon asymmetry of the universe, i.e., the fact that all macroscopic objects in the observable universe consist of matter rather than antimatter [21]. Indeed, the Standard Model is not symmetric under a CP transformation. This manifests itself through particle properties like particular decay modes of neutral kaons. But the observed degree of violation of this symmetry is too small to describe the observed baryon–antibaryon asymmetry. Thus, there must be CP-violating interactions which are not described by the Standard Model, but they may very well occur only at energies which are far beyond any foreseeable particle accelerator technology.

In fact, the smallness of CP violation associated with Standard Model interactions is remarkable in the sense that the gauge symmetries of the theory allow for a large CP-violating term from strong interactions governed by the \(\theta \)-parameter (see Sect. 3.1). Experimentally, \(\theta \) turns out to be very small and compatible with \(\theta =0\). This is seen by some as a violation of naturalness (see Sect. 3.3) which requires an explanation and has been dubbed the “strong CP problem” of the Standard Model. The most popular solution is the Peccei–Quinn mechanism (and its many variants) which in 1977 postulated the existence of a particle with a mass of the order of \({10}^{-5}\) to \({10}^{-3}\) eV, the so-called axion [114, 115]. Despite its small predicted mass and several decades of experimental search, no conclusive indication for the existence of the axion has been found to this day. There is no compelling reason to expect that this will change in the foreseeable future.

3.3 Naturalness

The gap between the heaviest particle of the Standard Model (the top quark, \({M}_{t}\approx 173\) GeV/\({c}^{2}\)) and the Planck mass amounts to 17 orders of magnitude. This remarkable separation leads to the so-called naturalness problem which, over the past few decades, was considered a major issue and a guiding principle in the field [28, 150]. It arises because quantum field theory implies a cross-correlation among physical parameters, such that the measured value of a particle mass formally depends on all fundamental scales of a theory. This dependence is particularly strong for a scalar particle such as the Higgs boson and thus ties its mass closely to the mass scale of new physics \({M}_{BSM}\). Since the 1970s, physicists have argued that a Higgs mass below 1 TeV/\({c}^{2}\) (as expected from other arguments such as unitarity, and later experimentally confirmed) is “unnatural” if \({M}_{BSM}\) is of the order of \({M}_{P}\). The main direction explored to resolve the naturalness problem was supersymmetry, which postulates that each Standard Model particle has an associated “superpartner” whose spin differs by half a unit [123]. Additionally, alternatives to a fundamental Higgs field have been developed, most prominently technicolor or composite Higgs models, which speculate that the Higgs boson is a bound state of new particles and interactions (see, e.g., [67, 86]).
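Schematically, and glossing over many technical subtleties, the quantum corrections that tie the Higgs mass to the scale of new physics behave as

$$\delta {M}_{H}^{2}\sim \frac{{g}^{2}}{16{\pi }^{2}}{M}_{BSM}^{2},$$

where \(g\) denotes a generic coupling between the Higgs boson and the new heavy states. For \({M}_{BSM}\sim {M}_{P}\) and \(g\sim 1\), this correction exceeds the observed value \({\left(125\,\mathrm{GeV}/{c}^{2}\right)}^{2}\) by roughly 30 orders of magnitude, so that independent contributions would have to cancel to this accuracy; it is this cancellation which is deemed “unnatural.”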

Common to essentially all attempts at solving the naturalness problem is the prediction of new particles with masses around 1 TeV/\({c}^{2}\). The absence of signals for such particles at the LHC, combined with the discovery of a Higgs boson at a mass of 125 GeV/\({c}^{2}\), has cast serious doubt on the naturalness argument, supporting earlier objections against it based on its metaphysical character (see, e.g., [73]). On the other hand, it may well be that the original formulation of the naturalness criteria by Leonard Susskind, Gerard ’t Hooft, and Martinus Veltman was ill-conceived, in particular because the Higgs mass is not the only—and by far not the most severe—violation of naturalness [84, 134, 146].Footnote 18 Perhaps an improved conception of the nature of physical laws will lead to more solid arguments for physics in the accessible energy range.Footnote 19 At this moment, however, we do not consider naturalness a compelling indicator for physics at accessible mass scales.

3.4 Experimental anomalies

As discussed in Sect. 3.1, the large majority of experimental measurements are in impressive agreement with theoretical predictions based on the Standard Model. Given the sheer number and variety of experiments, however, it is quite common at any point in time that a few measurements deviate from expectations to some degree. So far, most such effects could be explained by statistical fluctuations or by mistakes in the theoretical description or the experimental setup. In this section, we discuss the currently most prominent experimental deviations from the Standard Model.


Anomalous magnetic moment of the muon

The magnetic moment of an elementary particle is given by

$$\overrightarrow{m}=g\frac{q}{2m}\overrightarrow{s},$$

where \(q\), \(m\), and \(\overrightarrow{s}\) are the charge, mass, and spin of the particle, respectively, and \(g\) is a dimensionless factor. Neglecting quantum effects, the Dirac equation predicts \(g=2\) for a fundamental fermion such as an electron or muon. The deviation from this value due to quantum fluctuations is called the “anomalous” contribution and can be calculated very precisely within the Standard Model. In the case of the muon, the result disagrees with the latest measurement of this observable at the level of \({10}^{-9}\). Potential BSM contributions to the anomalous magnetic moment of the muon are expected to behave as \({\lambda }^{2}{m}_{\mu }^{2}/{\Lambda }^{2}\), where \(\Lambda \) is the scale of new physics, and \(\lambda \) is the strength with which it couples to the muon. Thus, an explanation of the discrepancy requires \(\Lambda \lesssim \lambda \cdot 2 \mathrm{TeV}\), which even for \(\lambda \approx 1\) is well within the currently accessible range [11].
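The quoted bound follows from simple dimensional arithmetic: equating the scaling above (with masses expressed as energies) to a discrepancy of order \(\Delta {a}_{\mu }\sim 2.5\times {10}^{-9}\) gives, very roughly,

$$\Lambda \lesssim \lambda \,\frac{{m}_{\mu }{c}^{2}}{\sqrt{\Delta {a}_{\mu }}}\approx \lambda \times \frac{0.106\,\mathrm{GeV}}{5\times {10}^{-5}}\approx \lambda \times 2\,\mathrm{TeV}.$$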


Flavor anomalies

The theoretical description of bound states of quarks (hadrons) such as B-mesons, which are composed of a bottom quark and a lighter quark, is notoriously difficult. However, one may define ratios of hadronic observables where most of the theoretical (and also experimental) uncertainties cancel. One such quantity is given by

$${R}_{D}= \frac{\mathrm{Br}(B\to D\tau {\overline{\nu }}_{\tau })}{\mathrm{Br}(B\to Dl{\overline{\nu }}_{l})}, \quad l\in \left\{e,\mu \right\},$$

where \(D\) is a charmed meson and \(\mathrm{Br}\) denotes the branching ratio. Over the past decade, a number of experiments have consistently found discrepancies with the Standard Model expectation for this and related quantities, which combine to a deviation of about \(3\upsigma \).

A rather promising aspect of these discrepancies is that they can be accounted for consistently by adding only two new parameters to the Standard Model Lagrangian [2]. The numerical values of these parameters hint at new physics at a scale of about \({M}_{BSM}\sim 1\) TeV/\({c}^{2}\). From today’s perspective, this would imply the potential existence of a new particle (or several) which one should be able to discover at the LHC or at a collider of the next generation [76].

Increasing the amount of data for the measurements described in this section might tell us whether these experimental anomalies are indeed signs of physics beyond the Standard Model. If so, new on-shell discoveries at the LHC or at future particle colliders can be expected. However, the anomalies may equally well turn out to be statistical fluctuations, as was the case for the bump at 750 GeV in the di-photon spectrum observed by the LHC experiments in 2015, which had caused enormous excitement in the physics community [45]. In the context of this paper, we assume that the anomalies will indeed disappear with increased statistics. Otherwise, our discussion may have to be repeated in a few decades or so.Footnote 20

3.5 Summary

Our developments on the Standard Model and the mass scale of new physics show that, with the discovery of the Higgs boson in 2012, particle physics has entered an exceptional state. We know that the Standard Model is not a complete description of nature, but there is no compelling indication that the mass scale of BSM physics is significantly below the Planck mass. To a large extent, this situation is explained by the specific value of the Higgs boson mass, which is compatible with the Standard Model precision tests, guarantees unitarity at all accessible energies, and defers vacuum decay to an unobservably distant future. In fact, for these very reasons, discovering nothing but a Higgs boson in the mass range of 120–140 GeV had been termed the “nightmare scenario” of the LHC even before it started operation [41]. It completed the experimental confirmation of the Standard Model’s particle content and left no signpost of where to search for answers to the still open questions of fundamental physics. This situation led to the analogy of standing at the beginning of an unexplored desert, as employed in the Introduction of this article.

4 Exploring the desert ahead

4.1 Current orientations of particle physics

As discussed in Sect. 2, since Hideki Yukawa established in the mid-1930s that the range of a force is inversely proportional to the mass of the particle acting as its carrier, the attention of physicists has been directed to the field of high-energy physics. This resulted in a continual increase in collision energy in order to study heavier particles and smaller scales. Not least due to the technical evolution of particle colliders, this has been a successful path of research, culminating in the Standard Model, which today is arguably the most encompassing and most precisely tested theory of nature ever developed. In view of this successful history, it is of course fully justified to argue for continuing along this path. Indeed, the 2020 Update of the European Strategy for particle physics names the Future Circular Collider, a “hadron collider with sensitivity to energy scales an order of magnitude higher than those of the LHC,” as one of the high-priority future initiatives [63]. However, as we have argued in the Introduction, the inflection of the Livingston plot and the presence of technical challenges, such as synchrotron radiation, point to the existence of an upper limit for the maximal energy of particle accelerators. It therefore seems inevitable that the time of particle discoveries as we know them today will come to an end sooner or later. In fact, according to our developments in Sect. 3, we may have already reached this stage, because it is possible that there is no detectable new physics above the electro-weak scale for several orders of magnitude in energy. In this case, no conceivable particle collider will ever produce new particles on-shell anymore. In the words of our analogy, the desert is too large for its far end ever to be reached. Of course, it cannot be excluded that there is new physics “around the corner,” as the anomalies discussed in Sect. 3.4 may suggest. They might be signals of an oasis within the desert—but also just mirages which will disappear with higher statistics of the measurements. But even if they turn out to be real effects of physics beyond the Standard Model, this would most likely only postpone the end of on-shell discoveries. An embedding of the associated new physics into a theoretical framework may lead to new insights, maybe even far beyond explaining any existing experimental anomalies. However, it is hard to believe that this next step forward will allow us to construct the “final” theory which is able to answer all the open problems discussed in Sect. 3 (and others that may arise in the meantime). The gap between the energies accessible at any conceivable particle collider and the scale of gravity is most likely too large to actually traverse the desert.

Let us remark that, as already pointed out in the Introduction, even though the main goal of going to higher energies is to raise the potential for on-shell discoveries, it also increases the impact of virtual effects due to new physics, as their suppression by powers of \(E/{M}_{BSM}\) is alleviated. Nevertheless, the key requirement for studying virtual effects is precision, which can be improved by increasing the statistics. Therefore, another path of development in current particle physics is toward colliders with high event rates, in particular so-called Higgs factories. They would make it possible to study this most recently discovered particle with very high precision and thus potentially provide information about as yet undiscovered virtual effects.

4.2 The future of discoveries

From our previous developments it follows that, at some point in the foreseeable future, for all we know, the hunt for bumps in some kinematical spectrum can no longer be successful, because the undiscovered particles would be too heavy to be produced on-shell at particle colliders. Rather than being the end of our quest for understanding the phenomena of higher-energy physics, this may mean that, in order to cross the desert ahead, physicists will have to accept and develop other means of observation. One concrete orientation for such a development is, for example, the search for heavy particles through their imprints on the primordial cosmological fluctuations (see, e.g., [7]). In any case, such a situation indicates that we may soon be witnessing yet another turn in how we define discoveries in the field. As discussed in Sect. 2, the notion of particle observation has already evolved several times in history, resulting from changes in the way physicists conceived of the concept of particle. As such, methods of observation were reconfigured with the occurrence of new phenomena, such as V-events in cosmic ray detection (see Sect. 2.3.3). Their interpretation was based on the fact that particles can decay, and allowed for the discovery of the kaon, for example. In the future, physicists may naturally face similar situations.

But it must be underlined that reconfigurations of the concept of particle also led to reassessments of former practices. Recall how Thomson’s conclusion that cathode rays consist of particles was quite bold in light of his experimental data, and heavily relied on hypothetical properties attributed to individual particles, such as mass and charge. It led to the definition of a first notion of particle observation, even though the actual observational status of electrons had not changed. After all, cathode rays had been “seen” before [6]. Similarly, positron-like tracks had already been observed in cloud chambers before Anderson’s experiments in 1932. But they could not be interpreted as positrons without the theoretical acceptance of Dirac’s antimatter postulate [124]. It was the theoretical insights that guided the different actors on the way to what is considered the “discoveries” of the electron and the positron. Again, such a situation could very well be experienced by physicists in the future.

The current state of theoretical physics even signals possible directions for such a future development. Indeed, the bumps that are associated with particle observation are nothing but slices along the real axis of a function of generalized complex kinematical variables, a function defined in terms of the quantum fields (rather than particles). The peak position (associated with the mass of the particle) and the width of the peak (corresponding to its inverse lifetime) are just two possible characteristics of this function. From today’s perspective, it is the direct measurement of these two parameters which establishes the observation of a particle, but actually this is used as an indicator of the existence of the associated quantum field. However, as we argued above, this paradigm has to be given up, because the energy of particle colliders will be insufficient to exhibit the peak structure, and our attention may have to be turned toward other properties, in particular of the quantum fields, in the exploration of the desert. From a purely theoretical perspective, the distinctive role of the actual peak for a particle observation is in any case difficult to argue for. Consider a hypothetical future collider with a sharp upper limit on its energy reach, and imagine the somewhat idealized situation that it reveals the lower tail of a new peak structure with very high precision, but not the peak itself. The unique mathematical shape of the resonance curve would allow for a precise reconstruction of the peak. Despite the fact that the associated particle could not be produced on-shell, the particle physics community would certainly (have to) accept this as the discovery of a new particle. One of the central questions is how far one will be able to move away from this idealized situation and still be able to claim a discovery.
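For a narrow resonance, for instance, the relevant function is, schematically, a relativistic Breit–Wigner in the squared invariant mass \(s\),

$$\sigma \left(s\right)\propto \frac{1}{{\left(s-{M}^{2}{c}^{4}\right)}^{2}+{M}^{2}{c}^{4}{\Gamma }^{2}},$$

whose continuation to complex \(s\) has a pole at \(s={M}^{2}{c}^{4}-iM{c}^{2}\Gamma \): the peak position and the width read off from data are simply the real and imaginary parts of this pole, and the “tail” invoked above is the same function evaluated below the peak.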

The imprints of quantum fields on observables aside from the peak structure are of course well known. Typically, they are referred to as effects of virtual particles, or simply “virtual effects.” They have proved useful for learning about then-undiscovered physics before. For example, the mass of the top quark could be predicted rather precisely through such effects before this particle was discovered in 1995, i.e., produced on-shell. This was achieved by comparing precision measurements at LEP and other colliders to calculations of the associated quantum effects, based on the Standard Model (see, e.g., [14]). In some sense, we can speak here of an indirect measurement of the mass of a particle through virtual effects. However, it should be noted that this measurement was based on the assumption that the top quark actually exists, and that at the time, the virtual effects it had caused were considered insufficient to “inescapably” (Thomson) infer its actual existence. Similar analyses led to predictions of the Higgs boson mass before the actual discovery of this particle (see, e.g., [70]). And finally, the fact that the cross section for Higgs boson production is so close to the value predicted by the Standard Model implies, for example, that there is no fourth generation of quarks.

In fact, the particle physics community has been preparing for this situation for several years now. Virtual effects of unknown heavy physics can be parameterized in a theoretically consistent way in the form of effective field theories (EFTs). They represent an expansion of some unknown extension of the Standard Model Lagrangian, \({L}_{BSM}\), in terms of powers of \(E/{M}_{BSM}\). The leading term in this expansion is just the Standard Model Lagrangian. Going to the next order in \(E/{M}_{BSM}\), the expansion of an arbitrary \({L}_{BSM}\) in terms of an effective field theory (the so-called SMEFT) involves more than 2000 parameters, which are called Wilson coefficients, and this number grows rapidly when going to higher powers of \(E/{M}_{BSM}\) [81, 101]. On the other hand, one would expect that a specific Standard Model extension \({L}_{BSM}\) depends only on a small number of parameters. All of the thousands of Wilson coefficients are thus determined by just a few parameters of the overarching theory \({L}_{BSM}\). Consequently, a possible path toward new discoveries could look as follows: some experiment finds a deviation from the Standard Model prediction; the deviation can be accounted for by assuming nonzero values for a few of the Wilson coefficients; this restricts the range of possible candidate theories \({L}_{BSM}\) to those that are compatible with these Wilson coefficients; on this basis, one will be able to make suggestions for new experiments which give access to other, possibly sizable Wilson coefficients, and so on. Ideally, in the end there will be only one candidate theory left. It can be further tested in this way, until there is “no escape” but to accept this theory as correct. Just as the prince in the fairy tale found Cinderella by her shoe, physicists could find their Cinderella theory by its Wilson coefficients.
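Schematically, the expansion referred to here takes the form

$${L}_{SMEFT}={L}_{SM}+\sum_{i}\frac{{c}_{i}}{{M}_{BSM}^{2}}{O}_{i}^{\left(6\right)}+O\left({M}_{BSM}^{-4}\right),$$

where the \({O}_{i}^{\left(6\right)}\) are dimension-six operators built from Standard Model fields and the dimensionless \({c}_{i}\) are the Wilson coefficients (the lone dimension-five neutrino-mass term is omitted, and factors of \(c\) and \(\hslash \) are suppressed).

The matching step described above, in which a handful of model parameters must reproduce many measured Wilson coefficients, can be illustrated by a deliberately simplified numerical sketch; every ingredient (the hypothetical coupling g_new, the mass scale M_new, the operator prefactors, and the pseudo-measurements) is invented for the purpose of illustration and does not correspond to any real analysis or data set.

```python
# Toy sketch: a hypothetical two-parameter "BSM model" predicts four dimension-six
# Wilson coefficients, all scaling as g_new^2 / M_new^2 times model-dependent O(1)
# prefactors (invented here).  A least-squares fit to pseudo-measured coefficients
# then over-constrains the model, as described in the text.
import numpy as np
from scipy.optimize import least_squares

PREFACTORS = np.array([1.0, -0.5, 0.25, 2.0])   # illustrative O(1) matching coefficients

def wilson_coefficients(g_new, m_new):
    """Hypothetical tree-level matching onto four Wilson coefficients, in TeV^-2."""
    return PREFACTORS * g_new**2 / m_new**2

# Pseudo-measurements of the four coefficients and their uncertainties (TeV^-2), invented
c_measured = np.array([0.041, -0.018, 0.009, 0.083])
c_sigma    = np.array([0.010,  0.008, 0.004, 0.020])

def residuals(params):
    g_new, m_new = params
    return (wilson_coefficients(g_new, m_new) - c_measured) / c_sigma

fit = least_squares(residuals, x0=[1.0, 5.0])    # start near g_new = 1, M_new = 5 TeV
g_fit, m_fit = fit.x
chi2 = float(np.sum(fit.fun**2))
print(f"fitted combination g^2/M^2 = {g_fit**2 / m_fit**2:.3f} TeV^-2, chi2 = {chi2:.2f}")
# With this simple scaling only the combination g^2/M^2 is pinned down; additional
# (e.g. loop-induced) coefficients or direct searches would be needed to separate the
# coupling from the mass scale.  A good chi2 for a model with few parameters describing
# many measured coefficients is what would single out a "Cinderella" candidate theory.
```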

Admittedly, this is a very optimistic view, and we are well aware that measurements of the Wilson coefficients are tremendously difficult (see, e.g., [10]). However, we use this example in order to illustrate that future discoveries may concern not so much individual particles as whole theories, or parts of theories. It is also just one possibility for how future discoveries in fundamental “particle” physics could evolve. Similar to the transition from fluorescing beams to tracks to bumps in the cross section, this does not need to imply a degradation of the ontological status of the fundamental entities under consideration, although the notion of discovery will incorporate new properties of quantum fields that go beyond our current approach to the concept of particles.

5 Conclusion: the fate of the particle concept

From our historical considerations, we have shown that the particle era was characterized by the persistence of the highly formative role played by the concept of the particle in natural philosophy and physics. Specifically, we highlighted its powerful illustrative, heuristic and—in elementary particle physics—operational values. Based on our developments on the current and potential future state of high-energy physics, it remains to be assessed whether this observation will continue to hold.

With particle colliders reaching their limits, the notion of particle observation as we know it today will no longer be successful. Experimental access to the physical characteristics that define particles in current practices will probably cease to be possible for future discoveries. As we argued above, one possibility is that the operational focus will shift from particles to fields. After all, quantum field theory incorporates much richer physics than can be described on the basis of a pure particle concept. Its basic theoretical entities are quantum fields, and particles are only one of their phenomenological manifestations. This goes hand in hand with the decrease, indicated in our discussion, of the importance of the heuristic value of the particle concept. Its unsuitability for theories of strong interactions already reveals important limitations. Thus, if the concept of particle deployed its heuristic power during the development of quantum field theory, it was at the cost of significant approximations and a limitation to small interaction strengths. On the other hand, the concept of particle has proved, and may continue to prove, fruitful in everyday scientific language. It imposed itself at different stages of the development of modern physics, even though it was not always the most epistemically adequate choice. This is shown, among other things, by Yukawa’s descriptive approach to the theory of mesons or by the representation of the Standard Model. But whether a concept that stands out solely for its illustrative value can remain sufficiently relevant in science is questionable.

In summary, it follows from our considerations that one can expect a profound weakening of the operational and heuristic values of the particle concept. This situation raises many questions, some leading to far-reaching hypotheses. Was the concept of particle a long-lasting workaround in particle physics, a last remnant of classical ideas? Could this be the end of the particle era? If today no answer to these questions can be ruled out, it is present and future physicists who will write the main lines of the answer. Standing in front of a desert, they may then be tempted to look back, as we have done in this paper. Some could then envision that, after all, it may not be necessary to actually cross the desert in order to discover new fruitful territory. While Columbus’ discovery of a new continent required him to actually set foot on it, Galileo was able to discover the true nature of planets from his home. All he needed was the proper tools (telescope) and the correct interpretation of his observations (moons orbiting Jupiter). Similarly, we may not need to produce particles on-shell anymore in order to learn about the fundamental laws of nature, as we have argued with the example of measuring Wilson coefficients and comparing them to the predictions of specific theories.

The strong malleability of the particle concept across history is remarkable. Throughout the path of development of the particle era, the way particles were conceived in scientific practices has already undergone significant changes. There is ultimately very little commonality between classical particles and resonances, although our historical outline shows that, up to now, each step in this evolution is a generalization of the previous step, and thus they are all connected to the original idealized notion of a discrete classical particle. The wave function of quantum mechanics still has a connection to the localized particle through the measurement process, and a sufficiently highly boosted Higgs boson could travel a measurable distance in the detector before it decays into tracks. It is nevertheless possible that even this connection can no longer be established at some point in the future; after all, alternative concepts have already been proposed, albeit without experimental confirmation (see, e.g., the concept of “unparticle” discussed in [72]). In any case, the end of on-shell particle discoveries in high-energy collisions implies that our current notion of particle observation, strongly tied to the concept of “resonances,” will no longer play the central role in particle physics. The concept of particle would then have to be reinvented once again, and yet another chapter in its history would have to be opened.