1 Introduction

“Simplicities and Complexities” was an international, interdisciplinary conference held in Bonn, Germany, from May 22 to 24, 2019. It was organised by Miguel Ángel Carretero Sahuquillo (University of Wuppertal), Dennis Lehmkuhl (University of Bonn), Martina Merz (University of Klagenfurt), Gregor Schiemann (University of Wuppertal), Michael Stöltzner (University of South Carolina), and the authors on behalf of the DFG- and FWF-funded research unit “The Epistemology of the Large Hadron Collider” (Grant FOR 2063), and brought together thinkers from across the humanities and sciences. The conference explored the assumption that simplicity is an epistemic ideal in the sciences, particularly physics, while recognising that natural phenomena are highly complicated and that the practices used to discover and describe them are often equally complex. Simplicity has never been a clear-cut concept, in either the humanities or the sciences; the many attempts to define exactly what it means in scientific contexts have produced a plurality of understandings. Likewise, there is a variety of notions of complexity across the natural and social sciences, as well as in the humanities. The aim of the conference was to analyse, differentiate, and connect these various notions and practices of simplicity and complexity, guided by questions concerning the roles of simplicity, the ways complexity in nature is addressed, the complexity of the epistemic strategies of science, the relation between simplicity and complexity, and the extent to which different disciplines take simplicity (or complexity) as an epistemic ideal.

The research unit invited eleven established researchers from across the sciences and humanities to deliver keynote lectures over the three days of the conference. We also put out an interdisciplinary call for abstracts to be presented in parallel sessions held each day, and selected twenty-eight abstracts written by thirty-seven researchers. Some of these authors were additionally chosen to take part in an essay competition, the prize being the opportunity to present their research to all attendees during an Award Lecture on the first evening of the conference. The parallel sessions themselves were roughly organised by discipline, with the major thematic categories comprising the philosophy of mathematics and computer science, biology and ecology, and sociology and STS; other talks in the philosophy of science, chemistry, and linguistics appeared where they fit best. The conference participants represented a diverse group of disciplines from across Europe and the Americas.

The research unit that organised the conference, “The Epistemology of the Large Hadron Collider”, was founded in 2016 to investigate the philosophical, historical, and sociological implications of the activities at the world’s largest research machine, the Large Hadron Collider in Geneva, Switzerland. Its general question is whether the quest for a simple and universal theory, which has motivated particle physicists for several decades, is still viable at a time when there are no clear indications for physics beyond the Standard Model and all experimental evidence increasingly comes from a single large and complex international laboratory. The research unit runs in two phases of three years each. The first phase, now drawing to a close, involves twelve principal investigators, six postdoctoral researchers, and five doctoral students from the philosophy of science, history of science, and science studies. The unit comprises six subprojects that investigate smaller aspects of the larger research question, each with at least one principal investigator from physics and one from the humanities or science studies.

This report examines three of the dominant views—according to the speakers—concerning the status of simplicity as an epistemic ideal, taken from the disciplines of physics, ecology, and biology. In physics, simplicity is often seen as an ideal to be pursued; in ecology, by contrast, one finds a tendency to treat simplicity as a potential obstacle on the path to understanding, one that needs to be overcome; and in biology, despite a history of turning away from the pursuit of simplicity in favour of an acknowledgement of the complexities of biological phenomena, there now seems to be a shift back towards simplification. Although speakers examining these three sciences sometimes deviated from these positions, the positions were sufficiently uniform to warrant special focus. There was also a host of interesting talks from a diverse set of disciplines, including anthropology, chemistry, computer science, history, linguistics, mathematics, medicine, science and technology studies, sociology, and, of course, philosophy. This report therefore looks in turn at highlights from talks on physics, ecology, and biology, before turning to a small sample of other positions on simplicity and complexity, including the topic of the award-winning lecture. Unfortunately, we are not able to go into detail on the many other excellent presentations given at the conference. We encourage those who are interested to seek out more information about the conference, including the book of abstracts, at the research unit’s webpage.[1]

2 Physics

Physicists generally treat simplicity as an epistemic ideal while recognising that the phenomena they study are incredibly complex, and that the practice of theorising and experimentation requires complex calculations, models, computer simulations, and experimental setups. More than one speaker discussed the tension between the ideal of simplicity and the complexity of the phenomena and working conditions in physics, especially at the LHC. For instance, Robert Harlander (RWTH Aachen) provided two case studies to illustrate this tension. The first involved the use of Feynman diagrams, which significantly simplify the mathematical description of particle behaviour by graphically representing possible interactions using a small number of rules. At the same time, higher-order calculations of interesting physical processes can involve tens of thousands of these diagrams. Harlander noted that these pictorial representations remain important even when such massive calculations are performed by computers (and so no actual Feynman diagrams are produced). So the question arises: are Feynman diagrams truly as simple in their implementation as they seem at first glance? His second case touched on the tension between relatively simple theoretical concepts like supersymmetry (a symmetry extending the Poincaré group that relates fermions and bosons) and the complexity of the models generated using these simple ideas. Although supersymmetry can be called simple, experimental findings have ruled out its simplest expressions, leading to ever more complex specific models to match the more complicated picture of reality implied by experiments.
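To give a rough sense of the scales involved (our illustration, not drawn from the talk): consider the electron’s anomalous magnetic moment in quantum electrodynamics, computed as a perturbative series in the fine-structure constant,

    \[
      a_e \;=\; \frac{1}{2}\,\frac{\alpha}{\pi}
            \;+\; C_2\left(\frac{\alpha}{\pi}\right)^{2}
            \;+\; C_3\left(\frac{\alpha}{\pi}\right)^{3}
            \;+\; \cdots
    \]

Each coefficient is a sum over Feynman diagrams, and the number of diagrams grows explosively with the order: 1 diagram at first order, 7 at second, 72 at third, 891 at fourth, and 12,672 at fifth. The rules are simple; their implementation decidedly is not.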

Similarly, Beate Heinemann (DESY Freiburg) detailed the challenges of retrieving the desired (simple) information from the LHC, given the complexity of the machine and its detectors. She described the material components of ATLAS, with layers of structures designed to detect the different products of the proton-proton collisions resulting from millions of beam crossings every second. A trigger system captures signals with certain properties and stores them, while the vast majority of the raw data is discarded. What remains are billions of recorded signatures, which must be sifted for events of interest (based on an analysis’s objectives). Complex software then reconstructs the particles from the signatures. Along the way, numerous quality control checks are made to ensure the detector and analysis software are collecting data correctly. At the end of this massively complicated process, measurements of simple physical quantities (mass, spin, etc.) are not only produced but deemed highly reliable, owing to the great deal of knowledge and experimental rigour operating in the background of the LHC and its experiments.
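The scale of this reduction can be conveyed with a back-of-the-envelope figure (ours, using publicly quoted approximate rates): bunches cross at roughly 40 MHz, while only on the order of 1 kHz of events can be written to permanent storage,

    \[
      \frac{4\times 10^{7}\ \text{crossings/s}}{\sim 10^{3}\ \text{events stored/s}}
      \;\approx\; 4\times 10^{4},
    \]

so only about one crossing in tens of thousands survives the trigger; everything else is discarded in real time.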

Physicist-turned-philosopher Richard Dawid (Stockholm University) meanwhile argued that pursuing the ideal of simplicity in fundamental physics today depends on projections of future physics. He made the case that the current pursuit of simplicity differs in character from its initial role as unification in theory building. As a general rule in science, unification creates a trade-off between generality and accuracy. However, in physics, unification acts as a guiding principle, leading from multiple theories of smaller domains to a unified theory that covers a wider domain with higher precision. The trade-off becomes one between accuracy and calculability. Dawid argued that, since simplicity can only be required to the extent that it is enforced by more fundamental phenomena, there are three different contexts to consider in modern physics. The first case includes effective theories, which are embedded in more fundamental theories. Here, a degree of simplicity can be inferred from conspicuous correlations in data hinting at the more fundamental theory. Next, there are theories that are not considered final and which cannot be embedded in a more fundamental theory. For this case, simplicity does not assist in discriminating between competing theories. Finally, there are theories that can be assumed to be final (i.e. there are no theories that are more fundamental). Dawid suggested that in string theory, his example of a final theory candidate, a specific kind of simplicity can be invoked to discriminate between it and competitors: a lack of free dimensionless parameters. Dawid concluded by noting that, in each step towards more fundamental theories of physics, different notions of simplicity must be invoked.

3 Ecology

In ecology, important examples can be found that stand in stark contrast to the situation in physics. Although many ecologists have followed the physicists’ lead in taking simplicity to be an epistemic ideal, several speakers argued that this has not been fruitful for them, but rather a hindrance to progress. Volker Grimm (UFZ Leipzig) argued that models in classical theoretical ecology often fail to be sufficiently predictive and explanatory once environmental conditions are widened, since they do not fully embrace the complexity of actual ecological systems. Even though emergent simple patterns are sometimes observed, the complexity of the underlying model is essential to understanding how simplicity emerges and is maintained. He therefore advocated “pattern-oriented modeling”: rather than focusing on a single pattern observed at one level of observation—whether behaviour, population dynamics, community composition, or ecosystem function—a multi-scope view is needed, one that takes these multiple patterns into account simultaneously, thereby capturing the inseparable micro–macro link and reducing the risk of oversimplification. While ecology has traditionally focused on avoiding false inclusions, Grimm argued that more importance should be placed on avoiding false exclusions; only then do we stand a chance of explaining biodiversity, where even seemingly unimportant factors can become essential under certain conditions. Simplicity may be the ultimate ideal, but it should not be the intermediate goal.
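As a crude illustration of the filtering idea behind pattern-oriented modeling (our sketch, not Grimm’s code; all names and numbers are hypothetical), candidate parameterisations of a model are retained only if they simultaneously reproduce observed patterns at several levels:

    def run_model(params):
        """Stand-in for an agent-based simulation returning summary patterns."""
        return {
            "mean_dispersal_distance": 2.0 * params["mobility"],        # individual level
            "population_growth_rate": 1.0 + 0.1 * params["fecundity"],  # population level
            "species_richness": 30.0 + 5.0 * params["mobility"],        # community level
        }

    def matches(simulated, observed_value, tolerance):
        """Crude pattern match: relative deviation within a tolerance band."""
        return abs(simulated - observed_value) <= tolerance * abs(observed_value)

    # Observed patterns at three levels, each as (value, relative tolerance):
    observed = {
        "mean_dispersal_distance": (2.4, 0.20),
        "population_growth_rate": (1.08, 0.05),
        "species_richness": (35.0, 0.10),
    }

    candidates = [{"mobility": m / 10, "fecundity": f / 10}
                  for m in range(5, 16) for f in range(5, 16)]

    # Keep only parameterisations that reproduce *all* patterns at once,
    # rather than fitting any single pattern in isolation.
    survivors = [p for p in candidates
                 if all(matches(run_model(p)[name], value, tol)
                        for name, (value, tol) in observed.items())]
    print(len(survivors), "of", len(candidates), "candidates survive")

The point of the design is that each added pattern prunes the space of acceptable models further, so structural detail is retained exactly where it is needed to reproduce the multi-level behaviour.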

Tina Heger (University of Potsdam) provided the methodological analogue to Grimm’s story. She pointed out that one reaction to the complexity of the research subject has been the formation of many independent sub-disciplines, each focusing on smaller, simpler sub-topics within ecology. While this strategy has its advantages, it has hindered the exchange of results and knowledge, and it facilitates false exclusions (cf. Grimm’s talk). Heger’s proposed solution is the hierarchy-of-hypotheses approach, which clarifies how individual tests from different sub-disciplines relate to a single overarching idea, counterbalancing the drawbacks of isolated sub-disciplines. This hierarchical representation of complexity can help build a bridge towards simplicity.
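The underlying data structure is easy to picture. In the hypothetical sketch below (ours, not Heger’s; the example hypotheses are loosely inspired by invasion ecology), an overarching hypothesis is refined into more specific sub-hypotheses, each carrying its own empirical tests, and support can then be aggregated over the whole hierarchy:

    from dataclasses import dataclass, field

    @dataclass
    class Hypothesis:
        claim: str
        tests: list = field(default_factory=list)     # outcomes of individual tests (True = support)
        children: list = field(default_factory=list)  # more specific sub-hypotheses

        def all_tests(self):
            """Collect test outcomes from this node and all refinements below it."""
            results = list(self.tests)
            for child in self.children:
                results.extend(child.all_tests())
            return results

        def support(self):
            """Fraction of supporting tests across the whole sub-hierarchy."""
            results = self.all_tests()
            return sum(results) / len(results) if results else None

    # Hypothetical example: one overarching idea, two sub-disciplinary refinements.
    root = Hypothesis(
        "Introduced species benefit from leaving their natural enemies behind",
        children=[
            Hypothesis("Invaders carry fewer parasites than native competitors",
                       tests=[True, True, False]),
            Hypothesis("Invaders are released from specialist herbivores",
                       tests=[True, False]),
        ],
    )

    print(f"Overall support: {root.support():.0%} across {len(root.all_tests())} tests")

Individual tests stay attached to the specific sub-hypothesis they actually probe, while their bearing on the overarching idea becomes explicit through the tree structure.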

Paul Grünke (Karlsruhe Institute of Technology) and Katharina Brinck (Imperial College London) claimed that the most prominent simplification in ecology is the identification of species as equal basal units. This choice of central observable entails several problems, including skewed data sets arising from an observational bias towards larger species, the ambiguous definition of the term across scales (e.g. mammals vs. bacteria), and the interdependency of roles (e.g. bacteria as both a species in their own right and as part of the species, or rather ecosystem, “human being”). They argued that theoretical (macro-)ecology needs to move away from this flawed implementation of reductionism: ecosystems are instead shaped in a co-adaptive dialogue of bottom-up and top-down control (cf. Grimm’s talk). In order to formalise and understand the complexity of ecological systems and natural phenomena across scales, a new framework is needed, with building blocks that are invariant across contexts and roles.

4 Biology

The contemporary situation in biology appears to be the opposite of that in ecology, at least in some respects.[2] The default is to accept that biological target systems are essentially complex and to give up on the idea of finding simplifying laws—coherent narratives are the best we can hope for. Several speakers pushed back against this trend. For instance, Victor Luque (University of Valencia) argued that evolutionary biology is actually much closer to physics in this respect. He identified the Price equation as the fundamental equation of evolution, a direct analogue of Newton’s second law and the Schrödinger equation. Such laws are what Kuhn calls ‘guiding principles’ or ‘generalisation-sketches’, and what Sober calls ‘consequence laws’. Adding further specifications to the Price equation (cf. filling in the F-slot of Newton’s second law with different force laws) allows the derivation of other evolutionary theorems, covering such topics as gene selection, phenotype selection, adaptation, and heritability. Hence, the Price equation integrates the simplifying approach of mathematical abstraction with the complexity of the biological world. It is thus possible after all for evolutionary biologists to follow physicists in wearing t-shirts summarising their respective fields with a single equation.
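For reference (our gloss; the equation itself is standard), the Price equation partitions the change in the population average \(\bar{z}\) of a trait between generations into a selection term and a transmission term:

    \[
      \bar{w}\,\Delta\bar{z} \;=\; \operatorname{Cov}(w_i, z_i) \;+\; \operatorname{E}\!\left(w_i\,\Delta z_i\right),
    \]

where \(w_i\) is the fitness of type \(i\), \(z_i\) its trait value, and \(\bar{w}\) the mean fitness. The covariance term captures the effect of selection, while the expectation term captures transmission effects. Substituting specific assumptions into these terms (e.g. perfect transmission, \(\Delta z_i = 0\)) recovers more specialised results, much as substituting particular force laws into Newton’s second law recovers specific equations of motion.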

Marta Bertolaso (University Campus Bio-Medico of Rome) was somewhat more accepting of the received view, but nevertheless insisted that within scientific practice—especially in the case of cancer, a complex biological phenomenon par excellence—complexity and simplicity are two sides of the same coin. Carcinogenesis and cancer’s progression are related to a multitude of causes and mechanisms, operating at different levels (cf. Grimm’s talk), with a strong dependence on contextual factors. Despite this plurality of compromised processes, cancers robustly exhibit a set of relatively simple (dis)functional characteristics; thus, no unique explanatory tool is either necessary or sufficient. Robustness is an ontologically emergent property of relative simplicity, irreducible to the myriad complex architectural features that set the conditions for its emergence. Current philosophical models of explanation do not adequately cover the case of cancer. In light of both the complexity of the plethora of compromised processes that generate the same robust set of (dis)functional characteristics and the lack of appropriate models of explanation, biologists may be forgiven for having turned away from simplicity. To remedy this attitude, we need a re-thinking of the concepts of biological causality and scientific explanation that expands these accounts to accommodate robust causation of (relatively) simple global structural changes (i.e. the robust recurrence of various developmental stages) despite a complex plurality of underlying possibilities.

Mike Buttolph (University College London) gave a historical analysis of the initial lack of recognition of Mendel’s theory of heredity, the theory which eventually gave rise to the science of genetics. Buttolph rejected noncognitive explanations, such as Mendel himself being unknown, and specific cognitive explanations, such as his analysis being incomprehensible. Instead, he put forward a different cognitive explanation: Mendel’s work was initially rejected because of the implied conclusion that patterns of biological inheritance are incredibly, and unacceptably, simple (in the sense of being quantitatively parsimonious).
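To spell out the kind of quantitative parsimony at stake (our illustration of the standard Mendelian picture, not material from the talk): in a monohybrid cross of two heterozygotes, a single pair of discrete factors suffices to predict the distribution of offspring,

    \[
      Aa \times Aa \;\longrightarrow\; \tfrac{1}{4}\,AA \,:\, \tfrac{1}{2}\,Aa \,:\, \tfrac{1}{4}\,aa,
    \]

which, with \(A\) dominant, yields the famous 3:1 ratio of dominant to recessive phenotypes. That such a simple combinatorial scheme could govern inheritance was, on Buttolph’s account, precisely what contemporaries found hard to accept.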

5 Additional Perspectives

Although the views from physics, ecology, and biology at the conference largely aligned along the three positions sketched above, scholars from a number of other disciplines also weighed in on the conference’s thematic questions. Several of these talks focused primarily on the conference’s other central theme, complexity, though many also discussed the interplay between simplicity and complexity. Unfortunately, we can only provide a short sampling here, beginning with the winners of the essay competition.

The Award Lecture by Giulia Terzian and María Inés Corbalán (Universidade Estadual de Campinas) focused on generative linguistics. They argued that the dominant ‘Minimalist Program’ tends to conflate two groups of notions of simplicity: ontological variants that apply to the object of investigation (i.e. the human language faculty) and methodological variants attributed to the linguistic theory adopted to describe that target system. Once these are picked apart, it becomes apparent that there is currently no justification for any of these notions, let alone for their convergence—nor is it plausible that such a justification will arise from within linguistics. Terzian and Corbalán advocated a naturalistic, interdisciplinary solution: only by drawing on recent results in cognitive science (especially empirical evidence supporting an evolutionary standpoint) and in the philosophy of science (specifically views that see theoretical simplicity as a precondition for scientific understanding) can a single coherent story be told that justifies the double role of simplicity in generative linguistics.

Christian Feldbacher-Escamilla (University of Düsseldorf) linked several dominant notions of simplicity to the assumption that simplicity provides some sort of epistemic value, that it is somehow truth-apt. One way to vindicate this assumption is to show that simpler models are less likely than more complicated ones to overfit noisy data. Because this argument relies on a particular notion of simplicity—that of having fewer parameters—it is not immediately clear how to relate it to other views of simplicity, such as a model’s ontological or axiomatic parsimony. Feldbacher-Escamilla used structural equations (borrowed from Forster and Sober) to argue that the axiomatic and ontic conceptions of simplicity can be reduced to parametric simplicity, thus putting us in a position to assess the truth-aptness of various classical conceptions of simplicity.
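The overfitting argument gestured at here is standardly made precise in the Akaike framework that Forster and Sober made prominent in the philosophy of science (our gloss, not necessarily the form used in the talk): for a model with \(k\) adjustable parameters and maximised likelihood \(\hat{L}\), predictive accuracy is estimated by

    \[
      \mathrm{AIC} \;=\; -2\ln\hat{L} \;+\; 2k,
    \]

so that, for a comparable fit to the data, every additional adjustable parameter worsens the estimated predictive performance. In this precise sense, parametrically simpler models are less prone to mistaking noise for signal.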

Talia Dan-Cohen (Washington University in St. Louis) discussed the politicisation of opposing views concerning simplicity and complexity from the perspective of American anthropology. She considered the way complexity has been treated almost as a self-evident virtue in the humanities and social sciences in the wake of criticisms of reductionism, leading to the perspective that experts should accept, and even champion, complexity. However, since a preference for complexity acts in part to counter earlier, positivist-led efforts towards simplification, she suggested that complexity is not ideologically pure in its deployment against simplicity. She argued that we must better attend to the underlying logics at work in evaluations of scientific complexity, and she provided two case studies to do just that: the use of complexity in cultural anthropology and in archaeology.

Finally, Lisa Kressin (University of Lucerne) described the conflict between two epistemic communities in sociology. The first, quantitative community aims at formalised explanations and causal laws, reducing complexity to allow the use of statistical analyses and standardised surveys. The second, taking a qualitative, interpretive approach, argues against simplifying procedures, claiming that they interfere with understanding the actual social world and introduce categories without proper reflection; this community instead uses less standardised data collection methods and emphasises understanding over explanation. Kressin noted that each approach has different criteria for assessing the quality of its data collection and analysis, with corresponding weaknesses (the quantitative approach faces reproducibility worries, while the qualitative approach must legitimise its data collection methods as scientific). Kressin’s empirical research into the discipline of sociology allowed her to show that, at least in the German-speaking world, these different attitudes create both symbolic and social boundaries between participants. What is particularly interesting is that similar divides exist within and across other disciplines: the preceding sections show such dividing lines, particularly between ecology and biology. Further, perspectives on simplicity continue to evolve historically in the various disciplines represented at the conference, as is especially evident in biology. Ultimately, Kressin’s work shows that considerations of simplicity and complexity can have wide-ranging consequences beyond the merely academic.

6 Conclusion

At the conference, physics, ecology, and biology each represented a different approach to the use of simplicity as an epistemic ideal and its contrast with complexity. Physics tends to treat simplicity as something to strive for, even while acknowledging the complexity both of the world and of our means of studying and describing it. Ecology, on the other hand, tends to treat the pursuit of simplicity as a stumbling block when it comes at the expense of proper consideration of the complexity of the different levels in an ecosystem. And biology, which has long set aside simplification in favour of an open acknowledgement of complexity, may be recognising a role for simplicity again.

An important lesson arising from this conference is that, when dealing with simplicity in the face of highly complicated natural phenomena (as well as complex scientific practices) in one’s own field of expertise, one would do well to compare one’s own field with perspectives from other sciences and the humanities. Different fields gave different answers to questions such as “which notions of simplicity are relevant?” and “is simplicity, and should it be, an epistemic ideal?” In light of this, the reader is encouraged to browse the book of abstracts available at the conference webpage (see footnote 1) in order to reflect upon the different ways one might engage with this tension between simplicity and complexity. For the same reason, the reader may want to keep an eye out for a special journal issue related to the topic of this conference—including contributions from several speakers—that is currently being put together by “The Epistemology of the Large Hadron Collider” research unit.