In the twenty-first century, computer simulations have become ubiquitous. It is hard to think of any field of knowledge, from the natural and social sciences to the life sciences and the humanities, that has not developed, in one way or another, methodologies involving computational tools and, in particular, computer simulations.
But what are computer simulations? Surveying the increasingly vast literature in the field, the predominant impression is twofold. First, there seems to be a unified understanding of what computer simulation consists of: in research practice, in the scientific literature, and in the public realm we find undiscriminating talk of “simulations”, as a collective noun, implying that they involve roughly the same approach regardless of the knowledge domains, applications, or research aims to which they are applied. Second, computer simulations are seen as technological agents of change: a radically new technology that came into the world in the wake of the humanitarian and intellectual devastations of the Second World War and almost immediately transformed both knowledge production in the sciences and the way we act in the world.
Research has been devoted to the many questions raised by computer simulations, be they epistemological, political, social, or economic.[1] Yet historical studies of computer simulation are strikingly absent from the field’s growing corpus. To remedy this, the aim of this special issue is to critically reassess the predominant view of computer simulations as a disruptive and unified computational technology by setting them in historical perspective.
This special issue collects four historical case studies that focus on exemplary instances of what are today regarded as computer simulations: mathematical problem-solving with the ENIAC (Electronic Numerical Integrator and Computer) in the 1940s; the introduction of Monte Carlo simulations in the particle physics of the 1960s; the development of the Paris-Durham shock model in astrophysics from the 1980s to today; and the history of digital modeling in design and architecture in the 1980s and 1990s.
The examples given in the papers demonstrate that, in the second half of the twentieth century, there was no common understanding of “simulation” technology across diverse knowledge domains but rather a highly fragmented landscape of different actors, questions, goals, and practices. A wide spectrum of mutually unrelated practices, often but not always associated with electronic computers, emerged in various disciplinary contexts and was only much later subsumed under the general heading of “computer simulation”. Perhaps even more important is another finding: the “same” computational practices took on distinct epistemic qualities depending on the context of use, disciplinary standards, and the background of the protagonists involved.
In the first part of this introduction, we give a brief overview of current research on computer simulations in general and their history in particular. The second part is devoted to a more detailed account of the case studies documented in this issue and their authors’ methodological approaches. We sum up the findings of these case studies in a brief conclusion.
What Are Computer Simulations and Where Do They Come From?
Computer simulations can be found in virtually any field of knowledge production today.
Arguably, the best-known methods are those of “Monte Carlo” and “agent-based” simulations. A typical Monte Carlo simulation uses random numbers to provide a stochastic model of a physical phenomenon.[2] Agent-based models, in contrast, simulate the behavior of large numbers of independent agents—be they defined as individuals or collectives—as they perform actions in response to mutual interactions and changing external conditions. Ultimately, the purpose of agent-based modeling is to understand the emergent behavior of whole systems. Processes as varied as the spread of epidemics, evolution, flows of traffic, unemployment, and the response to different monetary policies are simulated in this way.[3]
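The basic logic of a Monte Carlo computation can be conveyed in a few lines of code. The following Python sketch is our own illustration, not an example drawn from the historical cases discussed in this issue: it estimates π by drawing random points in the unit square and counting the fraction that falls inside the quarter circle, which approximates π/4.

```python
import random

def estimate_pi(n_samples: int, seed: int = 42) -> float:
    """Estimate pi by uniform random sampling in the unit square."""
    rng = random.Random(seed)  # seeded for reproducibility
    hits = 0
    for _ in range(n_samples):
        x, y = rng.random(), rng.random()
        # A point lies inside the quarter circle if x^2 + y^2 <= 1.
        if x * x + y * y <= 1.0:
            hits += 1
    # The hit ratio approximates the quarter-circle area pi/4.
    return 4.0 * hits / n_samples

print(estimate_pi(100_000))  # approaches 3.14159... as n_samples grows
```

The statistical error shrinks only with the square root of the sample size, which is why such computations became practical only with electronic computers.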
That said, contemporary computer simulations comprise a much broader range of computer-aided practices that vary within and across disciplinary fields, research aims, and contexts of application. In physics, chemistry, or climate science, for example, the term simulation can designate different kinds of computer models (Wise 2017). Some are based on discretized versions of the mathematical laws assumed to govern the natural processes in question. Others are programs which do not implement any physically or chemically significant algorithm but reproduce the behavior of a system on a phenomenological or statistical basis. In the life sciences and medicine, computer simulations take various forms, such as computational models of cells, neurons, and brains, or three-dimensional digital visualizations and animations of growth processes, allowing researchers to perform experiments not only “in vivo” and “in vitro” but also “in silico” (Wellmann 2018a). Early on, engineers and architects started to make use of computer programs to design, build, and test artifacts from cars and airplanes to houses and nuclear weapons. They developed a broad range of modeling and visualization techniques based on the discretization of geometrical surfaces and solids, which are referred to as computer simulations or digital models (Johnson 2004). Not least, in today’s popular culture the entertainment industry makes heavy use of computer simulation and artificial reality to design shared-world computer games and technologies of sensory immersion (Pias 2017).
Inventories and overviews of the simulation practices and tools, methods, programs, and software packages in use in different disciplines are scarce (two exceptions are Varenne & Silberstein 2013 and Varenne et al. 2014). Despite the variety of disciplines involved, and the wide variety of applications and questions approached with the help of computer simulation, scholarship has almost univocally put forward the claim that simulations constitute a radical innovation in scientific methodology and a marked technological advance. Philosophers of science were among the first to argue that computer simulations were substantially changing scientific practices and methodologies (Hartmann 1996; Humphreys 1990, 2004; Rohrlich 1990; Winsberg 2010). In particular, they discussed the question of whether and how computer simulations could be considered a new form of experimentation (Duran & Arnold 2013; Gramelsberger 2010). Equally, the notoriously difficult relationship between modeling and simulating has been the subject of long-standing debates. A recent extensive study makes a valuable contribution by analyzing this relationship in depth and across a whole range of knowledge domains (Varenne & Silberstein 2013; Varenne et al. 2014). Of late, the historian of science Norton Wise has introduced another thought-provoking view, arguing that understanding in the exact sciences takes on a narrative character due to the dynamical structure of computer simulations (Wise 2017). Scholars in the social sciences, and in architecture especially, have highlighted that computer simulations are innovative in these fields because they generate connections between architecture, technology, culture, and the public, and thereby new ways of participation (Gleiniger & Vrachliotis 2012; Loukissas 2009). Criticism of the narrative of radical change has been voiced by relatively few scholars.
Roman Frigg and Julian Reiss, for example, argue that computer simulations do not constitute a fundamental novelty in the scientific and technological landscape but only create new versions of old problems. Instead of trying to pinpoint the allegedly innovative aspects of computer simulations, they advocate investigating the diverse practices and contexts in which simulations play a role (Frigg & Reiss 2009). A more nuanced view has been put forward by Johannes Lenhard, who considers computer simulations a new development, albeit within the traditional field of mathematical modeling (Lenhard 2019).
Of particular interest for this special issue is the scholarship on the history of computer simulations. It is surprisingly scarce and fragmented: the subject has not yet been a sustained topic of focus within the wider field of the history of computing.[4] Isolated studies and volumes have been devoted to simulations in specific fields and subfields of science, such as plant biology (Varenne 2010, 2018), climate science (Edwards 2010; Heymann, Gramelsberger & Mahony 2017), architecture (Cardoso Llach 2015), the life sciences (Wellmann 2017, 2018b), and engineering (Johnson 2004).
With regard to the origins of computer simulations as a particular modus operandi in science, reference is usually made to the only study that has tracked down the roots of this approach. In his book Image and Logic, Peter Galison investigated the emergence of the Monte Carlo method in the context of the development of the H‑bomb at Los Alamos and its later diffusion into other areas of science and technology from the 1950s onward (Galison 1996, 1997). Galison offered a highly illuminating, yet relatively coarse-grained, overview of the diffusion of both Monte Carlo methods and computing machines. He argued that they gave rise to an “artificial reality” at the border of theory and experiment. The Monte Carlo method and its implementation on the ENIAC, in this view, constituted not only a paradigmatic simulation but also a template upon which later computer simulations were built. Galison thus portrayed computers and simulations as a unique and homogeneous agent which, from the 1950s onwards, disrupted traditional epistemic orders in science and technology.
Largely unchallenged, this view has been adopted by many authors. This is especially true of the now popular idea that simulations emerged from the “military-industrial complex” (Pias 2011; Schroeter 2004). Some studies, however, suggest the need for revision and for a more nuanced picture. Paul Edwards, in his seminal 1997 book The Closed World: Computers and the Politics of Discourse in Cold War America, describes a specific historical constellation in which computers and simulations were crucial in creating a “closed world”: a worldview that governed U.S. international politics and military action in the 1960s. In his account, simulations based on game theory and the methods of operational research, as well as hybrid simulations of human-machine interactions, were central (Edwards 1997). In her discussion of cellular automata, Evelyn Fox Keller points to the diversity of methods used in biology during the 1940s (Keller 2003), as does Gabriele Gramelsberger in her historical reconstruction of the distinction between an analytical (symbolic) and a numerical approach to computation (Gramelsberger 2011a).
Four Historical Case Studies
The case studies presented here were chosen to demonstrate that, in the second half of the twentieth century, the trajectory of computer simulations was characterized by an enormous diversity of practices and approaches, actors and fields, contexts and traditions. In particular, the studies’ topics highlight both differences in the simulative practices of neighboring disciplines such as astrophysics and particle physics, and similarities between more remote disciplines such as architecture and the natural sciences. The papers’ shared point of departure is the approach of historical epistemology. To understand computer simulations, to assess their epistemic status, to interrogate claims of novelty and innovation, and, importantly, to gain a nuanced view of how contemporary notions of computer simulation were formed, we need to learn about the various practices, historical actors, cultural contexts, and disciplinary fields that have shaped computational knowledge, framed scientific practices, and codified disciplinary research. Looking at specific historical instances, the four case studies investigate the following questions: (1) How did scientists describe practices which we today tend to unify under the heading of computer simulations? (2) What fostered the introduction of computational tools, and did historical actors consider this engagement with computers novel and innovative? (3) What changes—epistemological, experimental, disciplinary—were associated with the introduction of these computer-aided practices in specific fields and contexts?
Liesbeth De Mol, in her contribution, looks closely at how three mathematicians engaged with the ENIAC computer in the 1940s: number theorist Derrick H. Lehmer, logician Haskell B. Curry, and the icon of early computer science, John von Neumann. Changing perspective from simulation as an object of knowledge to simulation as a practice, De Mol provocatively claims that there was no simulation in the early ENIAC context, only various practices engaging humans with the machine. Each scientist developed computational tools and programming techniques that matched specific problems and methods in his respective field: reducing a problem to simpler components, for instance, or framing it in axiomatic terms. Her detailed comparative analysis demonstrates that the relationship between machine, code, and user was flexible at the time, allowing all three actors to shape it according to their own questions, aims, and ways of thinking. From this perspective, von Neumann’s use of the ENIAC as a tool for the Monte Carlo method—the view predominant today—is only part of a much more complex story.
Arianna Borrelli’s contribution is devoted to high-energy physics. She shows that Monte Carlo computations were not directly imported from nuclear research into particle physics during the 1950s: this transfer failed, and Monte Carlo methods only became relevant in the latter discipline in the early 1960s, as event generators. At that point, Monte Carlo practices were considered neither surrogates for experimentation nor artificial realities; they were more often described as forms of computation or simulation that allowed for the production of fake events. Once Monte Carlo event generators were widely employed in the late 1960s and 1970s, however, they not only helped physicists in their search for new particles but also triggered an unforeseen epistemic shift in the concept of “particle”.
Nathalie Bredella shifts the perspective from early physics and computation to the domain of architecture and design. Her contribution follows the introduction of digital modeling techniques in architecture from the 1980s onwards and the transformations in design practices and concepts that accompanied the introduction of the computer. Bredella brings to light the various factors that motivated and enabled the rise of what is today referred to as Building Information Modeling (BIM), which she analyzes as a digital construction kit allowing the simulation not only of buildings but of the whole design process. Among other examples, she investigates the first graphical user interfaces developed at MIT in the 1960s, the visions of the architect Chuck Eastman in the 1970s, and the early adaptation of software in the office of Gehry Partners in the 1970s and 1980s. Bredella describes these various constellations as interdisciplinary landscapes and argues that these case studies demonstrate how digital modeling served as a new communication tool, connecting architects with the many other social actors involved in planning and construction.
Finally, Sibylle Anderl follows the emergence, development, and use of a specific code in astrophysics. The Paris-Durham model of the propagation of interstellar shock waves was created around 1980 and is still in use today. Anderl shows how the code, while remaining essentially unchanged over a long period, nonetheless took on different epistemic functions in various research practices, depending on the aims of its users and, especially, on the transformation of observational techniques in astrophysics. She argues that the code was initially considered a theoretical tool for gaining a qualitative understanding of interstellar shock waves. Later on, it became a means of producing “synthetic data”, which could be compared visually with direct observations. Eventually, the Paris-Durham model came to be seen as the most complete attainable simulation of the propagation of an interstellar shock wave. Epistemologically, she claims, this amounts to an extension of the physical concept of shock waves: traditionally defined as discontinuous phenomena, they were now, in light of this simulation, understood to also be continuous.
Diverse Practices, Interdisciplinary Landscapes and a Protean Concept
The close historical studies in this volume undermine the prevailing view that simulation had its origins in the Monte Carlo method as it developed in the 1940s as an ancillary to the United States’ war effort. Rather, these papers reveal a diversified and fragmented landscape of practices often, but not always, associated with the introduction of the computer. In a manner reminiscent of Peter Galison’s notion of trading zones, networks connected actors across disciplines, spaces, domains of basic research and applied knowledge, backgrounds, and interests; these networks were crucial in bringing forward novel applications, ways of programming, and uses of computational simulations and related practices.
This historical reassessment of the supposed paternity of computer simulations has direct implications for our understanding of the present: computer simulations are not one practice but many, a set of computational, mathematical, and modeling practices, approaches, and uses. They were multiple in the past and remain so in the present. Rather than a single, unified technique, simulations can be identified by their protean nature. The novelty attributed to the use of simulations appears to reflect their bringing together of hitherto separated spheres. Knowledge, people, thinking, and doing were grouped together in new ways, which has fueled the spread of computer simulations and has in turn been reshaped by these new constellations. Computer simulations establish connections that were not offered by preexisting environments such as universities and the conventional disciplines they foster, or by prevailing mindsets, such as the opposition between theoretical and applied sciences, fostered by such conditions. What is more, the configurations simulation allows for are not stable but undergo ongoing reconfiguration, with agents constantly reshuffling and forming novel, perhaps even unlikely, matches.
This structural, combinatorial nature of simulations lies at the heart of the epistemic relevance they have acquired over a relatively short period of time. The most important finding of the case studies presented here is that the introduction of computer-aided simulation transformed central concepts in the respective fields: Arianna Borrelli shows how Monte Carlo event generators eventually changed the concept of “elementary particle” in physics; Sibylle Anderl demonstrates how a new type of shock wave emerged in astrophysics through the employment of digital models; and Nathalie Bredella reconstructs how the understanding of what constitutes the design process was fundamentally reframed by modeling buildings not with pen and paper but digitally.
To end on a provocative note: the case studies in this volume suggest that the “black-boxing” so often associated with the use of computers in general, and simulations in particular, might lie not (only) in the complexity of the technology but in the complexity of the infrastructures, people, power relations, and historical and cultural circumstances that came together in a contingent historical moment, the dynamics of which are difficult and time-consuming to trace. In other words, looking into the black box of computer simulation means looking out into the world.
This research was funded by the Institute for Advanced Study on Media Cultures of Computer Simulation (MECS), Leuphana Universität Lüneburg (DFG grant KFOR 1927). We thank our authors and colleagues, as well as the editors of NTM for the time and patience they devoted to the project, their support and ongoing discussion of the subject.
[2] Today, any computer-aided technique using random numbers can be described as a Monte Carlo simulation. Such techniques are used, for example, to model climate change, molecular dynamics, or the dynamics of finance; see Liu (2004).
[3] General introductions to agent-based modeling are lacking, but there is a vast literature on its many applications in a variety of fields.
Bazzanella, Liliana 2012. The Future of Cities and Regions: Simulation, Scenario and Visioning, Governance and Scale. Dordrecht: Springer.
Cardoso Llach, Daniel 2015. Builders of the Vision: Software and the Imagination of Design. New York, NY: Routledge.
De Mol, Liesbeth and Maarten Bullynck 2018. Making the History of Computing. The History of Computing in the History of Technology and the History of Mathematics. Revue de Synthèse (139):352–372.
Duran, Juan M. and Eckhart Arnold (eds.) 2013. Computer Simulations and the Changing Face of Scientific Experimentation. Newcastle upon Tyne: Cambridge Scholars Publishing.
Edwards, Paul N. 1997. The Closed World: Computers and the Politics of Discourse in Cold War America. Cambridge, MA: MIT Press.
Edwards, Paul N. 2010. A Vast Machine: Computer Models, Climate Data, and the Politics of Global Warming. Cambridge, MA: MIT Press.
Floridi, Luciano 2019. What the Near Future of Artificial Intelligence Could Be. Philosophy & Technology (32):1–15. https://doi.org/10.1007/s13347-019-00345-y
Frigg, Roman and Julian Reiss 2009. The Philosophy of Simulation: Hot New Issues or Same Old Stew? Synthese (169):593–613. https://doi.org/10.1007/s11229-008-9438-z
Galison, Peter 1996. Computer Simulations and the Trading Zone. In: Peter Galison and David J. Stump (eds.). The Disunity of Science: Boundaries, Contexts, and Power. Stanford: Stanford University Press: 118–157.
Galison, Peter 1997. Image and Logic: A Material Culture of Microphysics. Chicago: University of Chicago Press.
Gleiniger, Andrea and Georg Vrachliotis (eds.) 2012. Simulation: Präsentationstechnik und Erkenntnisinstrument. Basel: De Gruyter.
Gramelsberger, Gabriele 2010. Computerexperimente: Zum Wandel der Wissenschaft im Zeitalter des Computers. Bielefeld: Transcript.
Gramelsberger, Gabriele 2011a. From Science to Computational Science. A Science History and Philosophy Overview. In: Gramelsberger, Gabriele (ed.). From Science to Computational Sciences: Studies in the History of Computing and its Influence on Today’s Sciences. Zurich: Diaphanes: 19–44.
Gramelsberger, G. (ed.) 2011b. From Science to Computational Sciences. Studies in the History of Computing and its Influence on Today’s Sciences. Zurich: Diaphanes.
Gugerli, David and Daniela Zetti 2019. Computergeschichte als Irritationsquelle. In: Martina Heßler and Heike Weber (eds.). Provokationen der Technikgeschichte: Zum Reflexionszwang historischer Forschung. Paderborn: Verlag Ferdinand Schöningh: 193–228.
Hartmann, Stephan 1996. The World as a Process. In: Rainer Hegselmann, Ulrich Mueller, and Klaus G. Troitzsch (eds.). Modelling and Simulation in the Social Sciences from the Philosophy of Science Point of View. Dordrecht: Springer Netherlands: 77–100.
Heymann, Matthias, Gabriele Gramelsberger, and Martin Mahony (eds.) 2017. Cultures of Prediction in Atmospheric and Climate Science: Epistemic and Cultural Shifts in Computer-Based Modelling and Simulation. London: Routledge.
Humphreys, Paul 1990. Computer Simulations. PSA: Proceedings of the Biennial Meeting of the Philosophy of Science Association: 497–506.
Humphreys, Paul 2004. Extending Ourselves: Computational Science, Empiricism, and Scientific Method. Oxford: Oxford University Press.
Johnson, Ann 2004. From Boeing to Berkeley: Civil Engineers, the Cold War, and the Origins of Finite Element Analysis. In: M. Norton Wise (ed.). Growing Explanations. Historical Perspectives on Recent Science. Durham: Duke University Press: 133–158.
Keller, Evelyn Fox 2003. Models, Simulation, and “Computer Experiments.” In: Hans Radder (ed.). The Philosophy of Scientific Experimentation. Pittsburgh: University of Pittsburgh Press: 198–215.
Lengauer, Thomas (ed.) 2011. Computermodelle in der Wissenschaft – zwischen Analyse, Vorhersage und Suggestion. Halle (Saale): Wissenschaftliche Verlagsgesellschaft.
Lenhard, Johannes 2019. Calculated Surprises: A Philosophy of Computer Simulation. Oxford: Oxford University Press.
Liu, Jun S. 2004. Monte Carlo Strategies in Scientific Computing. New York: Springer.
Loukissas, Yanni A. 2009. Keepers of the Geometry. In: Sherry Turkle (ed.). Simulation and its Discontents. Cambridge, MA: MIT Press: 153–201.
Menges, Achim and Sean Ahlquist (eds.) 2011. Computational Design Thinking. Chichester: Wiley.
Morrison, Margaret 2015. Reconstructing Reality: Models, Mathematics, and Simulations. Oxford: Oxford University Press.
Pias, Claus 2011. On the Epistemology of Computer Simulation. Zeitschrift für Medien- und Kulturforschung (2): 29–54. https://doi.org/10.28937/1000107521
Pias, Claus 2017. Computer Game Worlds. Zurich: Diaphanes.
Rohrlich, Fritz 1990. Computer Simulation in the Physical Sciences. PSA: Proceedings of the Biennial Meeting of the Philosophy of Science Association: 507–518.
Schroeter, Jens 2004. Computer/Simulation. Kopie ohne Original oder das Original kontrollierende Kopie. In: Gisela Fehrmann et al. (eds.). OriginalKopie – Praktiken des Sekundären. Cologne: Dumont: 139–155.
Turkle, Sherry (ed.) 2009. Simulation and its Discontents. Cambridge, MA: MIT Press.
Varenne, Franck 2010. Formaliser le vivant: lois, théories, modèles? Visions des sciences. Paris: Hermann.
Varenne, Franck 2018. From Models to Simulations. London: Routledge.
Varenne, Franck and Marc Silberstein 2013. Modéliser & simuler: épistémologies et pratiques de la modélisation et de la simulation. Vol. 1. Paris: Ed. Matériologiques.
Varenne, Franck, Marc Silberstein, Sébastien Dutreuil and Philippe Huneman 2014. Modéliser et simuler: épistémologies et pratiques de la modélisation et de la simulation. vol. 2. Paris: Ed. Matériologiques.
Vehlken, Sebastian, Isabelle Schrickel, Claus Pias and Anneke Janssen 2016. Computersimulationen. In: Benjamin Bühler and Stefan Willer (eds.). Futurologien. Ordnungen des Zukunftswissens. Munich: Fink: 181–192.
Wellmann, Janina 2017. Animating Embryos: The in toto Representation of Life. The British Journal for the History of Science (50): 521–535. https://doi.org/10.1017/S0007087417000656
Wellmann, Janina 2018a. Gluing Life Together. Computer Simulation in the Life Sciences: an Introduction. History and Philosophy of the Life Sciences (40):70. https://doi.org/10.1007/s40656-018-0235-9
Wellmann, Janina (ed.) 2018b. Computer Simulation in the Life Sciences. Topical Collection of History and Philosophy of the Life Sciences. https://link.springer.com/journal/40656/topicalCollection/AC_cc03f75be6e9d27375d913f91f9fa421/page/1
Winsberg, Eric B. 2010. Science in the Age of Computer Simulation. Chicago: University of Chicago Press.
Wise, M. Norton 2017. On the Narrative Form of Simulations. Special Issue of Studies in History and Philosophy of Science Part A (62, Supplement C): 74–85.
Borrelli, A., Wellmann, J. Computer Simulations Then and Now: an Introduction and Historical Reassessment. N.T.M. 27, 407–417 (2019). https://doi.org/10.1007/s00048-019-00227-6