As a concept inflected by the legacy of postmodern theory, simulation’s epistemic nuance and historical significance are easily underestimated. This is particularly true for cinema and media studies. Baudrillard’s theory that we have constructed “models” of reality so extensive that signs have begun to refer to them, rather than reality itself, applied all too readily to the emergence of new digital tools, especially ones that seemed to create images at once alarmingly photorealistic yet also alarmingly artificial. The 1994 release of the English translation of Simulacra and Simulation coincided almost perfectly with the landmark special and visual effects of films like Jurassic Park (1993) and Forrest Gump (1994) and with the emergence of the first computer-animated feature, Toy Story (1995). Although the influence of postmodern theory has waned significantly since, critics continue to describe many forms of digital animation and VFX in terms of Baudrillard’s concept of hyperrealism, and simulation continues to be used as a negative term to refer to artificiality.

This disposition toward digital VFX and animation has been buttressed by changes in film production that have seen a shift to post-production and technical work, where filmmaking increasingly involves engineering software, building plug-ins, programming, and writing software scripts. This less geographically and temporally specific form of media labor, where workers sit rank-on-rank in front of computer screens making and using software, has in many cases replaced more immediate (and unionized) forms of production work. New businesses driving growth in the industry, like Pixar, Sony Pictures Imageworks, and Netflix, are as much tech companies as they are studios. They fund research and development (R&D) and defend their technical intellectual property to get an edge on the competition.

What if we took these concepts of simulation and technological artificiality and treated them more complexly? For historians and theorists of technology, an artifice is a made thing: a tool, an apparatus, or a machine. This is a definition that dispenses with negative connotations of the word relating to deception, focusing instead on the Latin roots of ars (skill or craft) and facio (to make a thing). Herbert Simon, Mario Bunge, and Walter Vincenti have all developed ways of theorizing the knowledge produced by making artifices such as simulations. This approach upends the typical construction of engineering as the mere application of scientific knowledge. What would happen if we used an example like nonlinear animation to apply such an approach to cinema and media in general? We could look at the scientists and technologists developing animation tools and treat their work seriously as a particular way of seeing the world with its own potentialities. We could introduce new institutional and industrial R&D histories to add nuance and detail to a moving image culture too often defined by the telos of “the digital.” We would uncover a richness of meaning in many of global Hollywood’s most apparently hyperreal products.

Nonlinear simulations are models used to test theories about how unpredictable phenomena work. I use this term in a non-technical sense, to refer to any simulation of nonlinear phenomena. Some of these simulations may not be truly nonlinear themselves. A nonlinear simulation might test a theory about the flow of air or water, the way crowds behave, changes in the weather, or changes in the stock market. These tools have become central to different disciplines and industries, including management science, financial mathematics, meteorology, video games, computer art, and cinema. Studying nonlinear simulation’s use in cinema and revealing the connections with other uses in other fields offers insights into an epistemic paradigm that has slowly begun to shape innumerable facets of modern society. Nonlinear simulation is a way of thinking about contingency and control that is deeply embedded in military and industrial applications and in cybernetic discourse. Its practitioners often use it to manage and exploit the unpredictable. But nonlinear simulations can also be built for fictional uses, to speculate and imagine.

The various nonlinear simulation tools used in the film industry thus offer a range of complex meanings that requires extensive historical and theoretical unpacking, and their study uncovers overlooked connections between cinema and other forms of simulation-based media such as video games and computational art. Rather than serving as a negative term for mere symptomatic postmodern artifice, the concept of simulation is vital for understanding a mode of image making that has become increasingly popular on cinema screens and that increasingly defines the way businesses and institutions see the world.

Simulation and Nonlinearity

On its own, simulation is a rather imprecise term. Its Latin root, similis, means likeness or similarity. In common speech, simulation can simply refer to representation by what C.S. Peirce would call an “iconic” form, that is to say, the representation of something by means of similitude, or as Peirce puts it, by “a mere community in some quality.”Footnote 1 A person might, for example, simulate the barking of a dog by attempting to make a sound like barking. Simulation refers to a much more specific concept in the context of twentieth century science and engineering though. Here simulation refers to making a functioning model of some kind of system or process in order to understand it. A model is a description of a theoretical mechanism. For example, the theory that water is recycled between land and ocean is supported, in part, by a model that accounts for water evaporating from the oceans, condensing in the atmosphere, and running off back into the ocean. Scientific simulation refers to putting a model into motion to test the validity of the underlying theories. For example, you could build a glass box with water and earth in it to test the aforementioned water cycle model. In other words, you make an artificial mechanism to represent a real mechanism. Stephan Hartmann defines simulation as “the imitation of a process by another process.”Footnote 2 Simulations mostly model physical processes, but the parameters of a simulation need not necessarily be physics. For one, they can be guided by some sort of exotic physics on an astronomical or subatomic scale, but simulations can also be used to model processes like animal behavior, traffic jams, or evolution. They can even be fictional or speculative.
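
To make the idea of putting a model into motion concrete, here is a minimal sketch in Python (the choice of language and all quantities are illustrative assumptions, not drawn from the sources above): a toy water-cycle model with three reservoirs, advanced step by step.

```python
# A toy "water cycle" simulation: a model (three reservoirs and fixed
# transfer rates) put into motion step by step. All values are illustrative.
ocean, atmosphere, land = 1000.0, 10.0, 50.0   # arbitrary units of water

EVAPORATION = 0.02   # fraction of ocean water evaporating per step
PRECIP_LAND = 0.30   # fraction of atmospheric water falling on land
PRECIP_OCEAN = 0.60  # fraction falling back into the ocean
RUNOFF = 0.10        # fraction of land water running off to the ocean

for step in range(50):
    evaporated = ocean * EVAPORATION
    rain_on_land = atmosphere * PRECIP_LAND
    rain_on_ocean = atmosphere * PRECIP_OCEAN
    runoff = land * RUNOFF

    ocean += rain_on_ocean + runoff - evaporated
    atmosphere += evaporated - rain_on_land - rain_on_ocean
    land += rain_on_land - runoff

print(f"ocean={ocean:.1f}, atmosphere={atmosphere:.1f}, land={land:.1f}")
```

Because the transfer rules here are fixed fractions, this toy is entirely deterministic and linear; the simulations discussed below complicate exactly that predictability.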

Simulations do not necessarily offer the same forms of evidence as experiments. This is why some philosophers of science, like Eric Winsberg, argue for the need to see simulation as a new form of science that is neither theory nor experiment.Footnote 3 If philosophers of science have observed the need to theoretically grapple with simulation and its evidentiary value, what theoretical and epistemic issues might it raise for media? These questions have become particularly pressing in light of the proliferation of simulations in science, engineering, and media over the past seventy years, and in light of the increased complexity afforded to them by computer technology.

A scientific simulation can be rather simple, especially when it seeks to model a regular, predictable process. For example, an orrery, a physical model of the solar system, simulates how the planets move around each other. This is a relatively predictable, linear simulation. The movements of the planets are, after all, as predictable as the rising and setting of the sun or the movement of the tides. Most processes in the world are not so predictable. Changes in weather, for example, are nonlinear. In a nonlinear system, you may start with relatively simple conditions, but small variations in those conditions can produce wildly different results. You cannot deterministically predict a single outcome based on initial conditions. As computer scientist Melanie Mitchell puts it, “A linear system is one you can understand by understanding its parts… A nonlinear system is one in which the whole is different from the sum of the parts.”Footnote 4 Many different research disciplines have used this concept of nonlinear simulation as a tool for understanding unpredictable things in the world, and different industries have sought to benefit from this research for both the purposes of prediction and control, seeking a way to understand and in some way shape the unpredictable. Starting in the 1980s, animation and VFX studios began using these nonlinear simulations to make animations (Fig. 2.1). Like their nonlinear simulation predecessors and contemporaries, these new forms of nonlinear animation mediate concepts like chance, risk, contingency, emergence, and control in a very particular way. And like scientific uses of simulation, which create a new space between experiment and theory, these new forms of media confound existing media categories, opening a space in-between automated capture and manual manipulation.
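
The sensitivity that defines nonlinear systems can be illustrated with a standard textbook example, the logistic map; the equation and parameters below are a generic illustration rather than any of the simulations discussed in this chapter.

```python
# The logistic map, a standard illustration of nonlinear sensitivity:
# two runs that start almost identically drift apart until they no longer
# resemble each other at all.
def logistic(x, r=4.0):
    return r * x * (1.0 - x)

a, b = 0.200000, 0.200001   # nearly identical initial conditions
for step in range(1, 61):
    a, b = logistic(a), logistic(b)
    if step % 20 == 0:
        print(f"step {step:2d}: run A = {a:.6f}, run B = {b:.6f}")
```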

Fig. 2.1 Nonlinear simulation: techniques, tools, and applications, charted across four periods: before 1944, 1945–1969, 1970–1994, and 1995 to the present

Stochastic Simulation

There are two primary forms of nonlinear simulation that each take different approaches to modeling unpredictability: stochastic and dynamic. The history of each reveals its own epistemic complexity, the varied subjects they have been used to understand, their fictional and imaginative use, and their eventual adoption as an important part of the animation and VFX industries. These histories both start in early modernity with exotic concepts formed by mathematicians, but nonlinear simulation only really took hold as an influential way of seeing the world in the context of Second World War and Cold War state-sponsored R&D. From there it spread from within the nascent field of computer science to fundamentally transform numerous facets of society. Though computer technology and nonlinear simulation were both shaped by the R&D institutions that supported them, and their histories are deeply interlinked, these two technologies are not the same. Stochastic and dynamic simulations identify a more precise way of seeing certain forms of contingency and they are not always necessarily digital in nature. Thus, they offer a new historical context and demand a new theoretical framework for studying related film technologies and production practices from the past few decades.

Stochastic simulation is slightly simpler, so it is the more logical place to start. A stochastic simulation is, put simply, a simulation with a random variable in it. Stochastic simulations use randomness to model the non-determinism of a nonlinear system. While it may sound absurdly reductive to model an unpredictable process through the use of proverbial dice rolls, it is a concept that has had a profound effect on how society sees the unpredictable and complex.

It would be difficult to determine the first time someone used randomness as a stand-in for unpredictability, but this concept took on a particular meaning and utility in industrial modernity. Several versions of this concept appeared in the first decade of the twentieth century. The earliest was a model for understanding a phenomenon called Brownian motion. In the early nineteenth century, botanist Robert Brown observed that pollen suspended in water moved on an unpredictable and seemingly random path. What he was seeing was the effect of water molecules bouncing around and unpredictably hitting the pollen from different directions, imparting different vectors of momentum.Footnote 5 Some seventy years later French mathematician Louis Bachelier was the first to formulate a model for this unpredictable movement in 1900.Footnote 6 Brownian motion posed a particular problem because it was the result of a process too complex to completely model. Bachelier modeled it using a random walk where, rather than calculating the collisions of myriad particles, a random direction is given to the pollen at a given or random interval. In other words, if you take a moving point and choose random directions for it, you will produce a path like pollen suspended in water without having to simulate millions of molecular collisions.
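
Bachelier’s insight is simple enough to sketch in a few lines of Python; the step length and number of steps here are arbitrary illustrations rather than his actual formulation.

```python
# A two-dimensional random walk in the spirit of Bachelier's model of
# Brownian motion: instead of computing millions of molecular collisions,
# the particle is simply nudged in a random direction at each interval.
import math
import random

x, y = 0.0, 0.0
for _ in range(1000):
    angle = random.uniform(0.0, 2.0 * math.pi)  # a random direction
    x += math.cos(angle)
    y += math.sin(angle)

print(f"final position after 1000 steps: ({x:.1f}, {y:.1f})")
print(f"distance from origin: {math.hypot(x, y):.1f}")
```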

Bachelier believed this concept could be applied to other unpredictable phenomena. He attempted to use it to model the change in stock prices, for example. If you take what you know about the factors influencing the change in value of a commodity, and then put a random variable in to simulate the unpredictability of the real world, you can model its movement and predict a range of possible outcomes. At the time, Bachelier’s economic theories garnered little interest, but they essentially describe contemporary approaches to options pricing in the field of financial mathematics.

Stochastic simulation exploded in popularity in the context of a surge in federal R&D funding during the Second World War and Cold War that also produced the reprogrammable computer and cognate concepts like systems theory. In the mid-1940s, Los Alamos National Laboratory researchers Nicholas Metropolis, Stan Frankel, and Stanislaw Ulam were trying to predict the paths of neutrons in a nuclear fission reaction, a problem that they found could not be solved through linear means. The issue was that neutrons bounce around in unpredictable ways, not unlike Brown’s pollen grains. Nuclear scientist Enrico Fermi suggested they try a randomized method, where they would simulate numerous paths based on a random factor, generating a wide variety of outcomes that could be statistically analyzed in aggregate. The process would not produce a single deterministic answer, but a range of statistical likelihoods. Fermi had actually been attempting this technique using a mechanical device of his own invention back in Italy.Footnote 7 Los Alamos consultant John von Neumann suggested they use the Electronic Numerical Integrator and Computer (ENIAC), the first programmable electronic computer, to run these simulations as a sort of test for the new machine. The ENIAC was designed and built for the purpose of calculating firing tables for the Ballistic Research Laboratory, but it seems von Neumann was keen to explore its potential. The team at Los Alamos dubbed their new process the “Monte Carlo method,” based on the idea that it employed randomness like that of casino games.
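
The logic of the Monte Carlo method can be suggested with a deliberately simple stand-in for the neutron problem: run many random walks and summarize the endpoints statistically. The walk below is a generic illustration, not the Los Alamos model.

```python
# A minimal Monte Carlo experiment: run many one-dimensional random walks
# and summarize the outcomes statistically, rather than computing a single
# deterministic answer.
import random
import statistics

def random_walk(steps=100):
    position = 0
    for _ in range(steps):
        position += random.choice([-1, 1])
    return position

endpoints = [random_walk() for _ in range(10_000)]
print("mean endpoint:", statistics.mean(endpoints))
print("standard deviation:", statistics.stdev(endpoints))
print("share of walks ending within 10 steps of the start:",
      sum(abs(p) <= 10 for p in endpoints) / len(endpoints))
```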

Stochastic simulation has since become a key tool for prediction and control in several disciplines, shaping society’s relation to risk and uncertainty. Two fields that have made extensive use of stochastic simulation since the 1940s are management science and financial mathematics. Management science grew from these early activities at Los Alamos and from contemporaneous research in the field of logistical “operations research.” Large institutions such as the United Steel Company, the U.S. Air Force, and General Electric were keen to explore the potential benefits of these concepts and supported early research.Footnote 8 These institutions valued nonlinear simulation both for its predictive capacity and for its ability to test systems against unpredictable events. The use of concepts like the Monte Carlo method and stochastic discrete-event simulation allowed organizational structures to cope with unpredictability in coordinated systems like supply chains. Nonlinear simulation is thus a powerful tool for testing management systems against the unpredictability of reality. These disparate experimental programs eventually crystallized into the field of management science (not to be confused with scientific management) in 1959.Footnote 9
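
A sketch of this kind of stochastic testing might look like the following; the warehouse, demand range, and reorder policy are invented for illustration and do not correspond to any of the historical systems described above.

```python
# A sketch of the kind of stochastic simulation used in management science:
# a single warehouse with random daily demand, tested against a simple
# reorder policy. All quantities and rules are invented for illustration.
import random

def simulate(days=365, reorder_point=30, order_size=100, lead_time=5):
    stock, on_order, stockouts = 100, [], 0
    for day in range(days):
        # receive any orders that have arrived by today
        arrived = [o for o in on_order if o <= day]
        stock += order_size * len(arrived)
        on_order = [o for o in on_order if o > day]
        # random daily demand stands in for an unpredictable market
        demand = random.randint(0, 20)
        if demand > stock:
            stockouts += 1
            stock = 0
        else:
            stock -= demand
        # reorder when stock falls below the reorder point
        if stock < reorder_point and not on_order:
            on_order.append(day + lead_time)
    return stockouts

# test the policy against many random "futures" and aggregate the results
runs = [simulate() for _ in range(1000)]
print("average days out of stock per year:", sum(runs) / len(runs))
```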

These developments in management science were soon followed by developments in other fields such as finance, where Louis Bachelier’s idea for calculating financial risk using stochastic simulation finally found broad acceptance in the 1970s, some twenty-five years after his death, with the development of the Black-Scholes model.Footnote 10 Just as Bachelier reduced the complexity of water molecules to a random factor, so too did Fischer Black and Myron Scholes condense the unruly unpredictability of the world to randomness. Like the Monte Carlo method, the Black-Scholes approach treats price movement as a stochastic process, deriving a range of future outcomes from the distribution of many possible random price paths rather than from a single prediction. This new approach to determining options pricing changed the face of modern financial economics.Footnote 11 The Black-Scholes model effectively gave speculators a range of outcomes that they could reasonably expect. It could not tell speculators exactly what would happen, but it could give them a range. Risky wagers that were once considered tantamount to gambling suddenly became quantifiable. While fields like management and finance once abhorred the unpredictable, stochastic simulation transformed their approach.
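
The underlying logic can be sketched as a Monte Carlo estimate of a European call option’s value under a random-walk (geometric Brownian motion) model of the stock price; the figures are invented, and as the number of simulated paths grows the estimate converges toward the closed-form Black-Scholes price.

```python
# A sketch of option pricing in this stochastic tradition: a Monte Carlo
# estimate of a European call option's value, assuming the stock follows a
# random walk (geometric Brownian motion). Not a real pricing engine.
import math
import random

def monte_carlo_call(spot=100.0, strike=105.0, rate=0.05,
                     volatility=0.2, years=1.0, paths=100_000):
    total_payoff = 0.0
    for _ in range(paths):
        # simulate one random terminal price
        z = random.gauss(0.0, 1.0)
        terminal = spot * math.exp((rate - 0.5 * volatility ** 2) * years
                                   + volatility * math.sqrt(years) * z)
        total_payoff += max(terminal - strike, 0.0)
    # discount the average payoff back to the present
    return math.exp(-rate * years) * total_payoff / paths

print(f"estimated option value: {monte_carlo_call():.2f}")
```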

While stochastic simulation has grown as a useful, functional part of contemporary industry, its use in fictional and imaginative applications is just as historically significant. Both of these parallel histories inform the use of nonlinear animation in contemporary cinema. Indeed, these two genealogies reveal new connections between forms of media like computational art, video games, and cinema. The fictional use of stochastic simulation is even older than Bachelier’s work on Brownian motion. Games of chance that use dice or cards have a history that stretches back well beyond modernity, all the way to the ancient use of sheep knucklebones as proto-dice. Yet games of chance are not necessarily simulations. The first example of a game that sought to model real-world phenomena while also using dice was likely the Prussian “kriegspiel.” In the late eighteenth century, Prussian entomologist Johann Hellwig came up with the idea of using a game like chess to model famous historical battles as a sort of hobby. Later, Georg Reiswitz Jr., a former artillery officer and son of game designer Georg Reiswitz Sr., introduced the idea of using dice. His reasoning was based on his experience with the “uncertainty” of artillery accuracy.Footnote 12 Even the best artillery crew will sometimes hit and sometimes miss their target. If a simulation of battle seeks to reflect reality, he reasoned, it must therefore simulate this uncertainty.Footnote 13 During the Napoleonic Wars, the Prussian state embraced this concept as having legitimate strategic value as part of military science. The adoption of the kriegspiel was part of a combination of new technologies that were drawing the attention of modern militaries, including recent advances in cartography, the application of statistics, and Daniel Bernoulli’s work on the principles of probability.Footnote 14

While the kriegspiel oscillated back and forth between more fictionalized, playful uses and more practical, serious ones, it laid the conceptual groundwork for a world of fictionalized stochastic simulation games. Elizabeth Magie’s The Landlord’s Game from 1904, for example, applied the economic theories of Henry George and incorporated the randomness of dice rolls to simulate a real estate market. While it was intended as a learning tool and was largely used in universities at first, a later iteration, Monopoly, defines the board game for many today. The Prussian kriegspiel also acted as inspiration for tabletop battle games published by Avalon Hill starting in 1952, as well as role-playing games like Gary Gygax’s Dungeons & Dragons in 1974.Footnote 15 These games would in turn inspire the first mainframe-based role-playing video games of the 1970s such as Moria and Avatar. Stochastic calculations continue to play an important role in digital games of all kinds, now popularly referred to as RNG (random number generator) by players, who facetiously pray to “RNJesus” when confronted with simulated chance in a game.

Stochastic simulation has also influenced different forms of generative art. In the 1950s, John Cage used the concept of chance as a part of musical composition. As early as 1965 artists such as Georg Nees and Frieder Nake were using stochastic computer programs to create what they termed generative art or artificial art (künstliche Kunst).Footnote 16 Since then stochastic concepts have inspired a great deal of artistic experimentation with computers, including fractal art, which reached a wide range of audiences in the 1980s and 1990s. While academics and researchers were generally responsible for early generative computer art because they had access to computers, more recent open-source software such as Processing and Context Free has brought it to millions of users.
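
A small generative sketch in the spirit of this early plotter art, offered as an illustration of the general stochastic approach rather than a reconstruction of Nees’s or Nake’s actual programs, might introduce growing randomness into a regular grid of squares and write the result to an SVG file.

```python
# A generative sketch in the spirit of early plotter art: a grid of squares
# whose rotation and position grow more random toward the bottom of the
# image, written out as a simple SVG file.
import math
import random

CELL, COLS, ROWS = 20, 12, 22
shapes = []
for row in range(ROWS):
    disorder = row / ROWS          # more randomness further down the grid
    for col in range(COLS):
        cx = (col + 0.5) * CELL + random.uniform(-1, 1) * disorder * 8
        cy = (row + 0.5) * CELL + random.uniform(-1, 1) * disorder * 8
        angle = random.uniform(-1, 1) * disorder * math.pi / 4
        half = CELL / 2
        corners = []
        for dx, dy in [(-half, -half), (half, -half), (half, half), (-half, half)]:
            x = cx + dx * math.cos(angle) - dy * math.sin(angle)
            y = cy + dx * math.sin(angle) + dy * math.cos(angle)
            corners.append(f"{x:.1f},{y:.1f}")
        shapes.append(f'<polygon points="{" ".join(corners)}" '
                      'fill="none" stroke="black"/>')

svg = (f'<svg xmlns="http://www.w3.org/2000/svg" '
       f'width="{COLS * CELL}" height="{ROWS * CELL}">'
       + "".join(shapes) + "</svg>")
with open("generative_sketch.svg", "w") as f:
    f.write(svg)
```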

In 1982, these diverse creative, imaginative, fictional, institutional, military, and industrial uses of stochastic simulation found their way onto cinema screens for the first time via computer graphics in Lucasfilm’s “Genesis sequence” in Star Trek II: The Wrath of Khan. This animated VFX sequence depicts a planetary-scale explosion and subsequent terraformation within the diegetic frame of a computer visualization. Stochastic simulation played a role in two aspects of this effect. First, the film features a seemingly endless mountain range where each peak and valley looks different. These 3D models were created using a technique invented by Boeing engineer and Pixar co-founder Loren Carpenter.Footnote 17 Rather than repeating the same shapes over and over, or spending endless hours modeling the landscape by hand, Carpenter’s technique used stochastics to create the landscape automatically. The film also features clouds of individual particles meant to look like an explosion, where each individual particle travels on its own unpredictable path. This was the result of a stochastic technique developed by another Pixar co-founder, William T. Reeves.Footnote 18
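
The flavor of such stochastic modeling can be conveyed with a one-dimensional midpoint-displacement sketch, a simple cousin of the fractal subdivision Carpenter used; the parameters are illustrative rather than Carpenter’s.

```python
# A sketch of stochastic terrain generation by midpoint displacement:
# repeatedly split each segment of a terrain profile and nudge the new
# midpoint by a random amount that shrinks at each level of detail.
import random

def midpoint_displacement(levels=8, roughness=0.5):
    heights = [0.0, 0.0]              # endpoints of the terrain profile
    displacement = 1.0
    for _ in range(levels):
        refined = []
        for left, right in zip(heights, heights[1:]):
            mid = (left + right) / 2 + random.uniform(-1, 1) * displacement
            refined.extend([left, mid])
        refined.append(heights[-1])
        heights = refined
        displacement *= roughness     # finer details get smaller bumps
    return heights

profile = midpoint_displacement()
print(f"{len(profile)} height samples, peak = {max(profile):.2f}")
```

Because every level of subdivision adds smaller, random detail, no two generated profiles look alike, which is precisely the effect of an endless mountain range with no repeated peaks.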

Pixar would continue to develop these and other techniques based on stochastic simulation. The following year they would begin to use it in ray tracing, a key early technology for 3D rendering.Footnote 19 Ray tracing simulates the way light interacts with materials by calculating factors like reflection, refraction, scattering, and dispersion, throwing in randomness to substitute for the subtle details and variations that affect these phenomena. In some cases, stochastic simulation has been overshadowed by more complex simulations that model the dynamic forces at work. Yet it continues to be useful for certain applications, especially ones where detail can be traded off for efficiency. Ray tracing is still a fundamental part of 3D rendering, and stochastic simulation is still used in the more recent spectacular effects such as the shattering of rigid objects, which requires creating the random path of a crack in the uniform digital surface of a breakable object like a vase or a statue.Footnote 20
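
A minimal sketch of stochastic sampling in rendering, assuming a toy scene that consists of a single circle: rather than computing exactly how much of a pixel the shape covers, the renderer averages a handful of randomly jittered samples.

```python
# A sketch of stochastic sampling as used in rendering: instead of solving
# exactly how much of a pixel a shape covers, fire a handful of randomly
# jittered samples through the pixel and average the hits.
import random

def covered(x, y):
    """True if the point falls inside a circle of radius 0.8 at the origin."""
    return x * x + y * y <= 0.8 ** 2

def pixel_coverage(px, py, samples=16):
    hits = 0
    for _ in range(samples):
        # jitter each sample randomly within the 1x1 pixel footprint
        sx = px + random.random()
        sy = py + random.random()
        hits += covered(sx, sy)
    return hits / samples

# estimate how much of the pixel spanning [0,1] x [0,1] the circle covers
print(f"estimated coverage: {pixel_coverage(0.0, 0.0):.2f}")
```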

For over a century people have been using stochastics to model unpredictable phenomena. This concept took on particular meaning in industrial modernity, within the new scientific and organizational fields that were supported by governmental and industrial R&D during the Second World War. It has since been enshrined as a vital part of fields like financial mathematics and management science. At the same time, artists and media industries have adopted it for fictional uses. Both scientific and fictional uses of this concept entail a certain way of mediating contingency and complexity. While mathematical probability seeks to quantify the unpredictable, stochastic simulations embrace it, if only to seek further control in the end. This is particularly evident in examples like the VFX in The Wrath of Khan, where an important part of the visual appeal of the effects is their unpredictability. This logic is also at work in the case of dynamic simulation.

Dynamic Simulation

While stochastic simulation deals with complexity by substituting it with randomness, dynamic simulation instead models that complexity. To understand the nature of dynamic simulation one must go back to Bachelier’s mentor Henri Poincaré and his solution for the “three-body problem.” This classic problem sees three planets in each other’s gravitational fields, with each body influencing the others in turn. The difficulty of the problem stems from the fact that every force exerted from one body onto another feeds back to the first body via the mediation of the third. The problem cannot be solved in the traditional analytic sense, with a closed-form expression that predicts future positions directly from initial conditions. This is a case of dynamic complexity. Poincaré’s solution was to describe the range of possible outcomes, but another way to model this problem is to use a continuous dynamic simulation. A continuous dynamic simulation would constantly take the resulting forces and re-input them into the problem, continuously revising the conditions. Dynamic simulation has become influential in physics, engineering, meteorology, and sociology, transforming different facets of society, just like stochastic simulation. And just like stochastic simulation, dynamics have given rise to new forms of media with new ways of imagining the unpredictably complex.
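
A continuous dynamic simulation of this kind can be sketched as a simple numerical integration of three gravitating bodies; the masses, starting positions, and time step below are arbitrary illustrations.

```python
# A sketch of a continuous dynamic simulation of three gravitating bodies:
# at every small time step the forces are recomputed from the current
# positions and fed back into the system. Units and values are arbitrary.
import math

G, DT = 1.0, 0.001              # gravitational constant and time step
# each body is [mass, x, y, vx, vy]
bodies = [[1.0, 0.0, 0.0, 0.0, -0.3],
          [1.0, 1.0, 0.0, 0.0, 0.5],
          [0.5, 0.0, 1.0, -0.4, 0.0]]

for step in range(10_000):
    # recompute the gravitational pull on every body from current positions
    accelerations = []
    for i, body in enumerate(bodies):
        ax = ay = 0.0
        for j, other in enumerate(bodies):
            if i == j:
                continue
            dx, dy = other[1] - body[1], other[2] - body[2]
            distance = math.hypot(dx, dy)
            ax += G * other[0] * dx / distance ** 3
            ay += G * other[0] * dy / distance ** 3
        accelerations.append((ax, ay))
    # feed the resulting forces back in, revising velocities and positions
    for (ax, ay), body in zip(accelerations, bodies):
        body[3] += ax * DT
        body[4] += ay * DT
        body[1] += body[3] * DT
        body[2] += body[4] * DT

for mass, x, y, _, _ in bodies:
    print(f"body of mass {mass}: final position ({x:.2f}, {y:.2f})")
```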

A good way to identify a nonlinear system is to ask whether the past predicts the future. Financial markets are an example of this. Though we might use past information to build predictive models, the market’s past behavior does not tell us what it will do tomorrow. Another classic example of this is the weather.Footnote 21 Founder of modern meteorology Vilhelm Bjerknes identified this challenge in 1904, when he likened it to Poincaré’s three-body problem.Footnote 22 It was not until the 1940s, however, that researchers began to engage this problem through dynamic simulation. The complex problem that Bjerknes laid out was too great a temptation for researchers working with the earliest computers and the backing of a wartime government. John von Neumann described it as “the most complex, interactive, and highly nonlinear problem that had ever been conceived of.”Footnote 23 In 1946, von Neumann and fellow researcher Jule G. Charney organized a research group to explore computer weather simulation at the Institute for Advanced Study in Princeton using grants from the Navy and Air Force, and soon after the ENIAC (the same computer used for the Monte Carlo method) was producing short predictions with relative accuracy.Footnote 24 This research led to leaps in the understanding and modeling of weather systems and was followed by many other developments by the likes of Edward Norton Lorenz, who would develop his “chaos theory” based on weather simulations.Footnote 25

At the same time computational meteorology was taking shape as a research discipline, other scientists were forming a parallel branch of research at Los Alamos named the T3 Group, which focused on the dynamic modeling of fluids of all kinds. This research took much the same approach as weather modeling because the problem was basically the same. Both approached dynamic phenomena by breaking them into cells and calculating the vectors of movement of those cells based on factors like momentum and pressure difference.Footnote 26 Research into computational fluid dynamics (CFD) promised benefits for engineering and design. For example, while high-speed aircraft designs were tested using physical models and wind tunnels at that time, CFD promised the ability to virtually test designs. CFD also allowed for the simulation of combustion in internal combustion, jet, and rocket engines. CFD was eventually packaged into multipurpose engineering software that could be used by a variety of industries, such as Klaus-Jürgen Bathe’s ADINA software, developed in 1974 while he was at MIT, and applications like PHOENICS, Star-CD and Fluent, all developed by scholars who had worked at Imperial College’s CFD Unit in the late 1970s.
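
The shared cell-based logic can be conveyed with a deliberately reduced sketch: a one-dimensional puff of “smoke” carried along a row of cells by a constant wind. Real CFD solvers work in two or three dimensions with momentum and pressure terms; this toy, with invented values, only illustrates the cell-by-cell update.

```python
# A toy grid-based simulation in the spirit of CFD: a one-dimensional puff
# of "smoke" density carried along by a constant wind, updated cell by cell.
CELLS, DX, DT, WIND = 100, 1.0, 0.5, 1.0

# initial condition: a blob of smoke near the left end of the domain
density = [1.0 if 10 <= i < 20 else 0.0 for i in range(CELLS)]

for step in range(100):
    new_density = density[:]
    for i in range(1, CELLS):
        # upwind scheme: each cell exchanges material with its neighbor
        # according to the wind speed and the density difference
        flux = WIND * DT / DX * (density[i] - density[i - 1])
        new_density[i] = density[i] - flux
    density = new_density

peak = max(range(CELLS), key=lambda i: density[i])
print(f"after 100 steps the densest cell is at index {peak}")
```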

As fluid simulation was becoming enshrined as standard practice in engineering, forms of visual media began to take a similar epistemic approach to fluidity. In the 1990s, studios and software companies began to adapt CFD technologies for animation and VFX, with tools like Arete’s Digital Nature Tools, Next Limit’s Real Flow, Exotic Matter’s Naiad, and Digital Domain’s FSIM and STORM. All these pieces of software animated the motion of nonlinear phenomena like splashing water or clouds of smoke. These animation tools basically share the same user environment and principles of simulation as the ones designed for commercial design and engineering. The one subtle distinction between media and engineering applications is that engineering puts a strong emphasis on fidelity, empirical reliability, and prediction, while animation and VFX tools are more preoccupied with simulation speed and the “directability” of simulations. As a result, simulation software for VFX and animation diverged somewhat from engineering and scientific research tools. This is not to say that there are no more connections between the film industry and scientific research though. As Chap. 3 will demonstrate, there are constant transactions between film and other industries through professional organizations, academic institutions, and the circulation of researchers. Scientific applications often employ spectacular and cinematic images, while media industry applications often promote their scientific realism.

Another parallel line of dynamic simulation research was focused not on fluids or weather but on agents and evolving systems. This type of simulation developed into a different set of nonlinear animation tools. The point of origin for this research was John von Neumann’s concept of cellular automata, which he formulated in the same era he was directing research groups on weather simulation. The principles of cellular automata are deceptively simple: a given grid has either black or white squares, and the squares change state depending on rules about the state of their neighboring squares. This is an example of a dynamic system because one square can change the conditions of the others, which in turn change the conditions of the initial square. As with weather and fluid simulation, simple conditions lead to dynamic complexity. This concept would later be developed into the conceptually similar technologies of agent-based simulation and evolutionary simulation (genetic algorithms), which would be used for sociological research, infrastructure planning, architecture, and certain kinds of nonlinear animation.

An agent-based simulation entails putting many virtual agents with a given set of behaviors into a world with specific rules and seeing how they will interact with each other and their environment. Researchers employed this as a way of studying the dynamic behavior of populations. If you believe certain rules govern behavior, you can run a simulation to see what sort of behavior results from those rules. For example, in 1971 T.C. Schelling published a study where he attempted to understand neighborhood racial segregation by designing a grid with two different kinds of agents that were programmed to move if a certain number of the other kind of agent lived next to them. This research was conducted at Harvard and sponsored by the RAND Corporation.Footnote 27
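
A sketch modeled loosely on Schelling’s study, with the grid size, tolerance threshold, and movement rule simplified for illustration, shows how segregation-like clustering can emerge from individually mild preferences.

```python
# An agent-based sketch loosely modeled on Schelling's segregation study:
# two kinds of agents on a grid, each moving to a random empty cell if too
# few of its neighbors are of its own kind.
import random

SIZE, THRESHOLD = 20, 0.4   # grid size; minimum share of like neighbors
grid = [[random.choice(["A", "B", None]) for _ in range(SIZE)]
        for _ in range(SIZE)]

def unhappy(x, y):
    kind = grid[y][x]
    neighbors = [grid[(y + dy) % SIZE][(x + dx) % SIZE]
                 for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                 if (dx, dy) != (0, 0)]
    occupied = [n for n in neighbors if n is not None]
    if not occupied:
        return False
    return occupied.count(kind) / len(occupied) < THRESHOLD

for step in range(50):
    movers = [(x, y) for y in range(SIZE) for x in range(SIZE)
              if grid[y][x] is not None and unhappy(x, y)]
    empties = [(x, y) for y in range(SIZE) for x in range(SIZE)
               if grid[y][x] is None]
    random.shuffle(movers)
    for x, y in movers:
        if not empties:
            break
        nx, ny = empties.pop(random.randrange(len(empties)))
        grid[ny][nx], grid[y][x] = grid[y][x], None
        empties.append((x, y))

print("unhappy agents remaining:",
      sum(unhappy(x, y) for y in range(SIZE) for x in range(SIZE)
          if grid[y][x] is not None))
```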

An important event in the development of this type of research was the formation of the Santa Fe Institute. Founded in 1984, predominantly by researchers from the Los Alamos National Laboratory, the Santa Fe Institute funds and facilitates research on various topics that employ complexity as a research paradigm. While the institute has contributed to subjects relating to physics and theoretical mathematics, it has also sponsored several agent-based and evolutionary simulations. For example, the institute co-sponsored a project called Sugarscape by Joshua M. Epstein and Robert Axtell, a simulation where agents gather, trade, consume, and excrete a consumable commodity. Through their research, they hoped to learn about humanity’s consumption of natural resources and the development of societies more generally.Footnote 28

In cases such as Sugarscape, a complex and intricate system with unpredictable shapes and behaviors emerges from what was once a relatively simple set of rules. While this type of emergence can be useful for understanding those initial conditions, it can also be used to analyze the process of development and change, in other words, its evolution. A good example of this is John Conway’s influential “Game of Life” simulation from 1970. Based directly on von Neumann’s original concept of cellular automata, Conway’s version sees pixels on a grid forming into self-sustaining entities that interact with each other. Examples of these entities include the “glider,” which seems to fly across the grid like a bird, and the “glider gun,” which emits a steady stream of gliders. Conway’s work fostered the idea that one could create virtual living organisms through relatively simple simulations, a discourse this book will address in more detail in Chap. 6. After this, researchers began testing to see if simulated agents could optimize their behavior through evolution and perhaps even learn. In 1975, John Holland laid out the fundamentals of genetic algorithms and learning in his book Adaptation in Natural and Artificial Systems. Examples from his book included a robot that could learn the most efficient way to pick up cans through a process of trial-and-error learning.
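
Conway’s rules are simple enough to state in a few lines of code; seeded with a glider, the pattern appears to crawl across the grid. This is a generic implementation of the published rules, not Conway’s own program.

```python
# A sketch of Conway's Game of Life: cells on a grid live or die according
# to how many of their neighbors are alive. Seeded with a "glider," the
# pattern appears to crawl diagonally across the grid.
def step(cells, size=20):
    new_cells = set()
    for y in range(size):
        for x in range(size):
            neighbors = sum((nx, ny) in cells
                            for nx in (x - 1, x, x + 1)
                            for ny in (y - 1, y, y + 1)
                            if (nx, ny) != (x, y))
            # a cell lives if it has 3 neighbors, or 2 and is already alive
            if neighbors == 3 or (neighbors == 2 and (x, y) in cells):
                new_cells.add((x, y))
    return new_cells

cells = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}   # the classic glider
for generation in range(8):
    cells = step(cells)
print("live cells after 8 generations:", sorted(cells))
```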

Much like weather and fluid simulation, both agent-based simulation and genetic algorithms have been taken up and further developed by VFX and animation. Multiple Agent Simulation System in Virtual Environment (MASSIVE), developed by VFX industry veteran Stephen Regelous at Weta Digital, draws specifically on technology and concepts from agent-based simulation.Footnote 29 Weta developed MASSIVE as a way of animating hordes of moving and fighting creatures, which were too numerous to stage in real life. Past efforts to render large groups suffered from the appearance of patterns. It looks unnatural if every character behaves the same. If you vary characters amongst a finite set of animated behaviors, uniform patterns still emerge, which spectators perceive as too regular and artificial. MASSIVE produced a far more naturalistic effect by setting simple behavior rules for agents and running a dynamic simulation where they reacted to each other. Weta put MASSIVE to use in The Lord of the Rings: The Fellowship of the Ring (2001). Regelous has since left Weta Digital and formed his own independent company that now sells MASSIVE 2.0. MASSIVE has also spawned an engineering tool of its own, and the company markets its products to engineers and architects, not just as a visualization tool but also as a way to simulate the movement of people through a building.Footnote 30
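
The general principle, simple individual rules plus interaction producing complex group behavior, can be illustrated with a flocking sketch in the tradition of Craig Reynolds’s “boids.” This is not a reconstruction of MASSIVE, and all of the rule weights below are invented.

```python
# An illustrative agent-based crowd sketch in the spirit of flocking "boids":
# each agent steers by simple rules, staying apart from close neighbors,
# aligning with them, and drifting toward their center. Group motion emerges
# from the interaction rather than from any scripted choreography.
import random

random.seed(1)
agents = [{"x": random.uniform(0, 100), "y": random.uniform(0, 100),
           "vx": random.uniform(-1, 1), "vy": random.uniform(-1, 1)}
          for _ in range(50)]

def neighbors(agent, radius=15.0):
    return [o for o in agents if o is not agent
            and (o["x"] - agent["x"]) ** 2 + (o["y"] - agent["y"]) ** 2 < radius ** 2]

for step in range(200):
    for a in agents:
        near = neighbors(a)
        if near:
            cx = sum(o["x"] for o in near) / len(near)    # cohesion target
            cy = sum(o["y"] for o in near) / len(near)
            avx = sum(o["vx"] for o in near) / len(near)  # alignment target
            avy = sum(o["vy"] for o in near) / len(near)
            a["vx"] += 0.01 * (cx - a["x"]) + 0.05 * (avx - a["vx"])
            a["vy"] += 0.01 * (cy - a["y"]) + 0.05 * (avy - a["vy"])
            for o in near:                                # separation
                if (o["x"] - a["x"]) ** 2 + (o["y"] - a["y"]) ** 2 < 9.0:
                    a["vx"] -= 0.02 * (o["x"] - a["x"])
                    a["vy"] -= 0.02 * (o["y"] - a["y"])
    for a in agents:
        a["x"] += a["vx"]
        a["y"] += a["vy"]

print("first agent ends at "
      f"({agents[0]['x']:.1f}, {agents[0]['y']:.1f})")
```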

Genetic algorithms have similarly proven useful for animating characters. This technology was developed by a third-party software company called Natural Motion, founded by software engineer Colm Massey and researcher in evolutionary and adaptive systems Torsten Reil. Natural Motion developed the concept of “stuntbots,” which are simulated characters that are programmed to learn to stand upright and to react to external forces. They will attempt to steady or right themselves in response to getting knocked over or thrown. Natural Motion’s Endorphin software has been used in a variety of Hollywood movies such as Troy (2004) and The Lord of the Rings: The Return of the King (2003). Like MASSIVE, Endorphin solves the problem of how to naturalistically animate the movement of characters when it would be unfeasible to do it with key-frame animation or motion capture.Footnote 31

Many of these technologies have been the subject of breathless promotional bluster. Therefore, one should exercise a certain degree of caution when studying the connections between VFX, animation, and scientific research. One way a studio might promote the realism of its work is by emphasizing its connection to scientific realism. A clear example of this can be found in the promotion of the film Interstellar (2014). During its release, there were numerous stories in the press about how the animation of the black hole in this film was made with the aid of a simulation designed by prominent astrophysicist Kip Thorne. It was “the most accurate simulation ever of what a black hole would look like,” according to a story in Wired.Footnote 32 The fact that some VFX and animation tools use the same concepts used in other domains of science and technology does not tell us that they are realistic though. Instead, it tells us that they share a certain epistemic frame, a certain way of seeing the world and making sense of it. Nonlinear animations are fictionalized versions of scientific simulations.

Over the past century stochastic and dynamic simulations have brought forth a different way of seeing the world. This has been a relatively gradual transition, but since the 1940s this way of understanding and controlling the unpredictably complex has spread from exotic science, to functional tools, to popular culture. Many of the examples of nonlinear animations like The Wrath of Khan or The Fellowship of the Ring are treated by the industry as landmarks in digital filmmaking. But there is another history at work here. These visual effects are not just examples of digital technology; they are examples of nonlinear simulation. We can look to these images as historical indexes of the role nonlinear simulation was beginning to play in our lives, whether we knew it or not.

Stochastic and dynamic forms of nonlinear animation often share in the myths and cybernetic discourses of the research tools that preceded them, similar to the way Patrick Crogan sees a Cold War cybernetic discourse designed to predict the future as having influenced the shape of video games.Footnote 33 One such myth, which Philip Mirowski terms “cyborg science,” conflates the ontology of real phenomena with the computational simulations used to understand them.Footnote 34 We can see this sort of thinking at work in examples like Conway’s Game of Life, which seems to ignore the difference between the emergent behavior of virtual patterns and the immense unknown complexity of real life. Paul Edwards similarly critiques the discourse of “central command and control” at work in Cold War computing, where politicians and the military were seduced by the idea that they could attain totalizing control by computationally merging different sources of information.Footnote 35 Again, the way nonlinear animation tools create computational uncertainty only to further control it suggests a similar way of thinking, especially when these tools are used in place of recording images of real phenomena. Yet we should not reduce nonlinear animation tools to these Cold War discourses. Researchers and practitioners do not always conflate the ontologies of simulation and reality, and they do not always seek totalizing control. Even military-driven R&D itself can be viewed in a more nuanced way.

R&D: Engineering and Science in Modernity

The nonlinear animation technologies developed for the film industry and the nonlinear simulation technologies developed for various other scientific and industrial applications both resulted from a very specific historical, institutional, and epistemic context, defined by the concept of research and development. R&D is, in essence, a theory of how technology is created in modernity. This follows from a modern definition of technology that first emerged in the late nineteenth century in Germany in the form of the term “Technik,” meaning “industrial arts,” and more generally the application of scientific thought, both for the construction of artifices and for the purposes of organization or systemization.Footnote 36 Rather than limiting engineering and design to existing knowledge, R&D attempts to marshal new knowledge, directing it toward technical application. This way of thinking was central to American Cold War policy, which sought new technologies that could provide strategic advantages, but its influence extends well beyond this. The discourse of R&D continues to shape countless industries, including film industries. The fact that contemporary films feature screen credits such as “R&D,” “R&D Artist,” and “Principal Graphic Scientist” gives us some sense of the concept’s influence.

Up until very recently, analysis of animation and VFX since 1980 has focused on digital technology as the key subject of change: what happens to film when it is reduced to ones and zeros? While this is a valid question, R&D points us to different theoretical questions and directs our attention to other historical factors.Footnote 37 Rather than focusing on the telos of numerical discretization and calculation, we can instead look to the institutional conditions that produced technologies like computers and simulation. Applying the history of R&D to nonlinear animation additionally draws attention to the fact that these technologies were not external forces that had deterministic effects on film production. Instead, it shows how the development of these tools was an internal process, shaped and facilitated by studios and software companies. This shifts agency from the abstraction of technology to the practical reality of institutional and industrial organization. When Raymond Williams makes the case, contra technological determinists like Marshall McLuhan, that society and culture determine the shape of media technologies, he highlights R&D as a site where this process takes place.Footnote 38 Television was not discovered as some technological concept external to society, and neither was nonlinear animation.

Implied in the words research and development is a relation between theoretical and practical knowledge, basic science and applied engineering, or “knowing that” and “knowing how.”Footnote 39 Although scientists and policy makers have critiqued R&D as anathema to the cause of basic science and freedom of inquiry, the concept opens new avenues for understanding how nonlinear simulation is used by scientists, engineers, and artists to make sense of the world. Simulations are artifices built to learn about the world. While scholars like Paul Edwards and Patrick Crogan argue that the roots of simulation can be found in Cold War governments’ drive to anticipate the future in an uncertain and dynamic world, this more basic epistemic topic allows us to uncover an older genealogy. It also allows us to recognize a method for understanding the world that is not defined by its military application, even if the military played an important role in its history. The history of R&D therefore offers a fuller picture of the fictional and creative use of nonlinear simulation by artists and the film industry.

Thomas Edison’s Menlo Park research laboratory is a historically significant archetype for modern R&D in North America.Footnote 40 This new form of institution brought scientists together to conduct research that was directed toward developing specific technologies. It was made possible by changes in patent laws that saw employers retaining the rights to discoveries made by researchers.Footnote 41 Following the Menlo Park paradigm, the drivers of American industry soon all had research labs. Not long after industry recognized the potential of R&D, the U.S. government sought to take advantage. In 1915, the Navy formed a science advisory committee, chaired by Edison. The following year the United States government formed the National Research Council with a board also populated by industry figures.Footnote 42 The marriage of science and engineering was now recognized as a matter of national importance. This was also the period when educational institutions such as MIT and Caltech, which merged science and engineering, began to flourish with the help of government funding for R&D-oriented projects.Footnote 43 Funding for R&D would one day raise technology-oriented educational institutions such as these to the point where they rivaled the old elite universities, some of which resisted the integration of engineering and technical training.Footnote 44 “Research and development” took hold as a common term to describe this logic in the early 1940s, with the formation of the United States OSRD (Office of Scientific Research and Development).Footnote 45

The concept of scientific simulation is deeply connected to the logic of R&D. Even before computer simulation, R&D institutions funded considerable work on material simulations. For example, NACA (National Advisory Committee for Aeronautics), a contemporary to Edison’s Navy Committee and predecessor to NASA, made simulation a key component of its mission via the wind tunnel. NACA-sponsored researchers at Stanford made important early advances in propeller design through wind tunnel work. The construction of new wind tunnels such as the VDT (Variable Density Tunnel) in 1922 and the FST (Full Scale Tunnel) in 1931 led to significant discoveries that put the United States at the forefront of aeronautic research. The wind tunnel was a tool for physical simulation. It created artificial conditions that were meant to mimic real-world conditions. It offered the opportunity to better understand the dynamic properties of air in motion, but it also allowed the practical testing of aerodynamic designs. Computer simulation would be developed for the same uses. Some of the examples of nonlinear simulations in the preceding section serve a similar testing function. Indeed, technologies such as fluid simulation have effectively replaced the wind tunnel’s role in R&D, making these expensive physical facilities far less common than they used to be. Nonlinear animation is a product of this history, and understanding the stakes of these issues will help us to understand changes in the film industry and its technology.

Institutions like NACA and the OSRD demonstrate that R&D took shape at a nexus between military and industrial interests, and that it represented a merger between academic science and engineering. As historian Stuart Leslie puts it, this configuration “blurred distinctions between theory and practice, science and engineering, civilian and military.”Footnote 46 As early as 1945 influential research policy maker Vannevar Bush expressed concern over how military R&D directives were transforming science and limiting scientific “freedom of inquiry.” Bush published two important texts at the end of the Second World War on this topic: “Science: The Endless Frontier,” a report requested by President Roosevelt, and “As We May Think,” an article published in The Atlantic.Footnote 47 Bush had been responsible for administering funding for scientific research during the war. He led NACA and directed government funding during the Second World War as head of the OSRD, the forerunner of ARPA (Advanced Research Projects Agency) and the NSF (National Science Foundation). He was also an active cyberneticist and a co-founder of Raytheon, a major military R&D company. Yet in these papers, he calls for an end to the “intermediate” science that was being done during the war and a return to basic research.

While R&D raises the issue of how scientific research is influenced and directed toward certain negative ends, many philosophers of science have cautioned against interpreting engineering and technology in negative terms as contrasted with a purified, idealized definition of science.Footnote 48 The concept of simulation plays a key role in this philosophy. In an influential essay from 1966, Mario Bunge argues that “technology is as theory laden as science” and that there should be a distinction made between “substantive theories,” such as the principles of airflow, and “operative theory” such as how airplanes are designed and how airports are organized.Footnote 49 These new forms of inquiry produce new forms of knowledge. Many historians and philosophers would investigate this issue further, giving greater consideration to the many forms of knowledge produced by engineering and technology.Footnote 50

Aeronautical engineer and historian Walter Vincenti similarly questions why we construct science as the site of knowledge and engineering as the mere application of that knowledge. As a corrective, he seeks to theorize the type of knowledge produced through engineering. Using examples from his field of aeronautics, he argues in his work that engineering produces empirical knowledge through the testing of designs.Footnote 51 For example, testing a new wing design in a wind tunnel is a kind of experiment that produces knowledge. Although Vincenti and Bunge are not directly discussing simulation but rather engineering and technology more broadly, it is worth noting that their key examples do involve simulation.

Herbert Simon makes this connection explicit, arguing that computer simulation is in essence a form of engineering epistemology: it understands the world through the design and testing of models.Footnote 52 You make a model (material or digital) based on a theory and you see what will happen under certain conditions. Simulation provides a form of knowledge that is not exactly empirical (it does not come from actual events), yet it is not entirely theoretical either. The R&D paradigm, which was born of an industrial and governmental desire for technological advance, thus produced a new form of knowledge. While we should remain critical of the way R&D is often employed for militaristic or politically suspect ends, this is not a reason to ignore the theoretical complexity of this way of thinking. This is a new form of knowledge we can look for in digital VFX and animation, especially in examples of nonlinear animation, and we should seek to understand it. Indeed, while digital technology has received the majority of attention as an agent of change in the past few decades of film history, R&D and R&D institutions like the field of computer science have themselves been important agents of change.

Computer Science and R&D

As a research discipline, computer science is perhaps the most paradigmatic example of the institutional logic of R&D. It is a field where making things and doing research are one and the same. While other scientists try to understand the physics of weather patterns or the behavior patterns of people, computer scientists study an artifice, a made thing. The institutional context of this new discipline embraced and expanded the epistemic logic of simulation as the “science of the artificial.” It also became the site where research for new media industry tools like nonlinear animation took place. Computer science organizations like the Association for Computing Machinery (ACM) and its special interest group SIGGRAPH have become imbricated with media industries since 1980, particularly Hollywood, as the following chapter will explain in detail. Thus, the rise of the logic of R&D in film industries like animation and VFX offers a different context for changes in the past few decades. Rather than looking for the effects of digital technology on filmmaking, one can see how R&D produced the conditions for those technologies to be developed and used.

There are two possible definitions for computer science that posit two different starting points. On the one hand, one might include all research that was in retrospect conceptually relevant to the computer avant la lettre. Here we would include things such as Alan Turing’s theoretical work in the 1930s and the early calculating machines of Lord Kelvin, Ada Lovelace, and Charles Babbage, and so forth. Lev Manovich’s genealogy of new media, for example, privileges Babbage as the starting point of computers.Footnote 53 The second definition of computer science would instead be limited to the institutional formation of computer science departments in universities. This latter institutional definition prompts us to consider the external conditions that shaped the discipline and reveals how intimately linked computers are to the logic of R&D. Rather than interpreting nonlinear animation (or computer graphics in general) as extending from the fundamental properties of numerical computation laid out by the likes of Babbage, we can instead look to the institutional and epistemic frame of R&D.

Computer science was not necessarily destined to be an R&D discipline. In its earliest days there were many who imagined it as part of more traditional academic pursuits. A good example of this is early British computer science. In the British context, the computer was seen as a theoretical and philosophical tool. Nineteenth century experiments with differential analysis machines in Britain had very different goals than the computer science research conducted by people like Vannevar Bush in the US in the 1930s and 1940s. Mark Bowles observes these to be cultural differences; while the “technological style” of American computer science was one of engineering and optimism, the British approach was mathematical and theoretical.Footnote 54 While Bowles’ cultural observations are intriguing, there are also important historical reasons for these differences. The context that produced the American version of computer science was a military-industrial-academic complex that saw research funds earmarked for things that might yield geopolitical strategic advantage. As Paul Ceruzzi argues, the U.S. struck a balance between state and private involvement in academics, unlike Europe or the USSR.Footnote 55

Though the idea that only the U.S. could have invented the programmable electronic computer sounds like a step too far toward exceptionalism, it is true that different cultures sought to make use of the same concepts for different purposes. Different contexts led to different kinds of computer sciences. American computer science, the version of the discipline that came to dominate globally as a paradigm, is oriented toward developing new technologies that might benefit industrial and national interests.

The first reprogrammable electronic computers in the United States demonstrate the logic of R&D and the cooperative unions formed between government, industry, and research institutions. The ENIAC was designed and constructed for the United States Ballistic Research Laboratory by two members of the University of Pennsylvania’s Moore School of Electrical Engineering, physicist John Mauchly and electrical engineer J. Presper Eckert. Mauchly and Eckert decided to go into business for themselves after the ENIAC, selling their second computer, the UNIVAC (Universal Automatic Computer), commercially. Yet their first customers were the United States Census Bureau and Northrop Aviation, a major Air Force contractor.Footnote 56

Once the computer became a product, demand for trained professionals to design and maintain systems grew. Products like the IBM 650 were sold to universities if, and only if, those universities agreed to teach a course in computer science.Footnote 57 The fact that a private company would offer a special deal to universities demonstrates the synergistic logic of R&D. IBM had an interest in increased academic use of computers for several reasons. First, researchers might discover new uses for the computer, thus leading to future products. Second, IBM ensured that future workers who would go on to jobs in government and industry would be familiar with its equipment. Third, IBM had an interest in hiring new researchers and engineers, and developing relationships with universities ensured it would have access to the best minds of the future. This logic continues in the relationship between VFX and animation studios and research institutions.

There was, however, resistance to the R&D nature of computer science in some of the more prominent and established universities in the United States. The idea that technicians, people who build things, would be rubbing shoulders with professors was objectionable to traditionalists.Footnote 58 Harvard, for example, was resistant to the inclusion of any kind of engineering field at first.Footnote 59 They imagined themselves as educating the leaders of tomorrow, not technical workers. Although the University of Pennsylvania’s Moore School was pivotal in inventing the programmable computer, when computer science first started to take shape as an academic discipline the school chose to outsource the operation of the actual devices.Footnote 60 Building technical things was not part of the traditional liberal arts education.

Rather than treating it as a discipline that advanced the design of ever bigger and better technologies in concert with industrial and governmental interests, universities such as these imagined computer science more as an extension of mathematics, much the way it was imagined in the British context. This approach protected the new discipline from undue direction and shaping.Footnote 61 If computer science is theoretical and largely useless to the military and industry, then it is free to explore the potentiality of computation in any direction it sees fit, without influence from directed funding: computer science as basic science. As computer scientist Michael R. Fellows later put it, “computer science is not about machines, in the same way that astronomy is not about telescopes.” The engineers responsible for building and running computers were mere service people, hardly the peers of the mathematical researchers. These criticisms have endured in the form of philosophical discussions about the epistemic role of computer science. Is it a science? Is it bad science?Footnote 62 Today, though, the role of R&D in computer science is beyond question. To today’s computer scientists, this resistance sounds like pure elitism, the product of stodgy, out-of-touch professors. The merger of knowing and knowing how, of science and engineering, of research and development, was irresistible in the end. Furthermore, as Simon, Bunge, and Vincenti argue, there is no reason to privilege science as the only source of knowledge. We learn a great deal by making artifices, especially simulations.

Computer science and R&D created the institutional context that gave rise to the many different uses of nonlinear computer simulation, from financial mathematics to management science to the nonlinear animation tools employed in Hollywood blockbusters. This context suggests that nonlinear animation offers a vision of power and control much like that of management science or finance. While stochastics and dynamics seem to introduce a little bit of anarchy and chaos into calculations, they create chaos in order to contain it. The following chapters will further study nonlinear animation in this context. But limiting study of this subject to this one angle misses some of the epistemic nuance of nonlinear animation. Cinema was a paradigmatic product of the episteme of industrial modernity, but it was also a medium that gave rise to a range of different visions. Nonlinear animation should be approached just as holistically.

Speculative Simulation

Though they share a great deal with the tools used by managers and investors, nonlinear animation tools open potentialities for fictional, imaginative, and speculative use. Some work in this direction has already been done in the field of game studies. The initial push to define game studies as a field discrete from film or literature studies, the so-called ludology versus narratology debate, centered on the concept of games as a simulation-based medium that should be interpreted based on the rules and causal structures programmed into them. In his introduction to the very first issue of the online academic journal Game Studies, Espen Aarseth writes that the concept of simulation is “crucial” to ludology as a “radically different alternative to narratives.”Footnote 63 Gonzalo Frasca, another foundational ludologist, treats games and simulations as virtually synonymous. In several essays, he insists on the difference between simulation, which models the mechanical function of systems, and “representation,” which he associates with painting and film.Footnote 64 The field of game studies has since made peace with narrative and visual analysis, and expanded into numerous other methodologies, including the ethnographic study of players.Footnote 65 But these early ludology concepts continue to be important to the field, and they provide a starting point for understanding the way nonlinear animations make meaning as simulations.

Without specifically naming nonlinearity, Aarseth and Frasca both note the unpredictable outcomes of games and the dynamics of play as defining qualities. Aarseth writes that “The complex nature of simulations is such that a result can’t be predicted beforehand; it can vary greatly depending on the player’s luck, skill and creativity.”Footnote 66 Frasca discusses games as “dynamic systems” with unpredictable outcomes.Footnote 67 This is key to the way they differentiate games from linear narrative forms. Narrative media tell a story, while games see players participate in the writing of a story. The essence of ludology is the writing of rules that govern the mechanisms of unpredictable and dynamic play.

As I noted earlier in this chapter, nonlinearity became a key part of gaming in the Prussian war games of the 1800s. The desire to quantify every aspect of war led to the idea of using random dice rolls to simulate the uncertainty of the battlefield. These simulations combined the dynamics of a chess game with the naked stochastic randomness of games of chance, and used both to model the mechanisms of real-world events, in order both to better understand them and to anticipate the future. Although early ludologists like Frasca emphasize the transformative effects of the computer, many scholars have since noted the longue durée of this genealogy. Jon Peterson traces the connection between war games and tabletop role-playing games like Dungeons & Dragons, and William Uricchio identifies these features at work in computer games with large historical sweeps like Sid Meier’s Civilization series. Here again the concept of nonlinear simulation is key. Uricchio describes how these historical games set conditions, like a computer simulation of some past event, but the unpredictable unfolding of gameplay leads to different outcomes.Footnote 68 So, for example, a player might start with the same historical conditions as the Roman Empire, but history might unfold in a completely different way. Uricchio argues that these games open up master narratives of history and focus our imagination on the possibilities of a contingently unfolding history.
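To make this combination of rule-governed dynamics and dice-driven chance concrete, the short sketch below resolves a toy series of skirmishes from identical starting conditions. It is only a minimal illustration, not a reconstruction of any historical Kriegsspiel ruleset: the unit strengths, terrain bonus, and victory threshold are invented for the example.

```python
import random

def resolve_skirmish(attacker_strength, defender_strength, terrain_bonus, rng):
    """Resolve one encounter: deterministic rules plus a stochastic die roll."""
    # Deterministic part: compare forces, with a terrain modifier favoring the defender.
    margin = attacker_strength - (defender_strength + terrain_bonus)
    # Stochastic part: a six-sided die perturbs the outcome, as in dice-based war games.
    roll = rng.randint(1, 6)
    return "attacker" if margin + roll >= 4 else "defender"

def campaign(rounds, rng):
    """Play a fixed sequence of encounters; the tally depends on accumulated luck."""
    return sum(
        resolve_skirmish(attacker_strength=5, defender_strength=4,
                         terrain_bonus=1, rng=rng) == "attacker"
        for _ in range(rounds)
    )

if __name__ == "__main__":
    # The same starting conditions produce different campaigns on different runs.
    for seed in range(3):
        rng = random.Random(seed)
        print(f"seed {seed}: attacker won {campaign(10, rng)} of 10 encounters")
```

The point of the sketch is simply that the rules are fixed while the outcomes are not; the model has to be run, and run repeatedly, to see what it can produce.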

Clearly, VFX and animation differ from games in some significant ways. Those early ludologists would certainly shudder at the idea of the two being described together. Nonlinear animations for film eventually produce a fixed sequence of images that is the same every time it is played back. Thus, they do not create the open-ended user experiences theorized by game studies. Yet both nonlinear animation and games are premised on the concept of building models as a way of representing the world. Broader conceptualizations of fictional simulation provide a possible theoretical framework for thinking about nonlinear animation together with other fictional forms of simulation such as games.

Animation scholars have already noted the importance of recognizing the connection between animation and games. Almost all games are graphic in nature, and thus rely on animated sequences of images. As Chris Pallant argues, Johan Huizinga’s concept of the “magic circle,” so valued by ludologists for the way it theorizes “temporary worlds,” applies readily to animation and other forms of visual media and performance.Footnote 69 Focusing on simulation and experimentation uncovers yet more common ground between the two.

Gertrud Koch argues that animation is “isomorphic” with scientific experimentation, in the sense that both work at the threshold of our understanding of the world and invent theories for what lies beyond.Footnote 70 In other words, animation and the experiment both speculate about reality in an iterative, contingent, and ongoing process. Nonlinear animation is a paradigmatic case of this common ground between animation and experimental science. To build a simulation is to attempt a new way of understanding the world. A simulation is but one attempt to model the mechanism behind some real phenomenon. It is speculative. Indeed, simulation has proven an unlikely ally in the development of speculative materialist ontologies concerned with the “mind independence” of reality.

How exactly do we think about mind independence? What is philosophy without humans or thought? Manuel DeLanda offers an answer to this question that utilizes the concept of simulation. DeLanda’s primary project is to interpret the ontology of Gilles Deleuze in a realist context, articulating his own version of process ontology. Focusing only on Deleuze’s Difference and Repetition, he argues that Deleuze “grants reality full autonomy from the human mind, disregarding the difference between the observable and the unobservable, and the anthropocentrism this distinction implies.”Footnote 71 DeLanda takes examples from the nonlinear sciences and argues for their compatibility with a process ontology that sees things becoming actual within a space of possibility. This is an effort to formulate a realist position without locking things down into naïve scientific realism. He is highly critical of scientific positivists who only believe in the mind-independence of things that can be schematized within their established laws.

The key ontological gesture of this approach is that it allows us to see anything in the world as having been composed of an assemblage of interacting factors. DeLanda uses the example of a storm to illustrate this point. In a way it is obvious that a storm is composed of an assemblage of factors: it is an event that emerges from things like temperature, airflow, and moisture. His larger argument, however, is that all things in the world are in fact ontologically the same as a storm.Footnote 72 Animals and even rocks are material things that came to be as the result of an assemblage of constantly changing, contingent factors. Simulation therefore confronts us with the indeterminacy of reality and the impossibility of schematizing it using stable laws. It demonstrates reality’s continued capacity to surprise us, to assert its autonomy. His approach undermines our ability to schematize reality by emphasizing contingency and the singularity of every individual thing or event.

So how could we ever mediate the world in this way? How could we represent reality as the result of a non-deterministic process of becoming that is autonomous from human perception and understanding? The answer DeLanda offers is computer simulation. In his book Philosophy and Simulation: The Emergence of Synthetic Reason he runs through a variety of examples of how simulation can be useful as a tool for realist philosophy. He argues that simulation allows us to conceive of things, not merely in terms of their properties but also in terms of the virtual qualities of their tendencies and capacities. Simulation defines things in terms of what they may become. Simulation does “justice to the creative powers of matter and energy.” It is a way to explore the “structure of the space of possibility.”Footnote 73
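One way to picture what exploring the “structure of the space of possibility” can mean in practice is to run a simple nonlinear model across a range of starting points and watch where it ends up. The sketch below is only an illustrative toy, not an example drawn from DeLanda’s book: it iterates the logistic map, a standard textbook case of nonlinear dynamics, from nearly identical initial conditions and prints the very different states that result.

```python
def logistic_step(x, r=3.9):
    """One step of the logistic map, a standard textbook example of nonlinear dynamics."""
    return r * x * (1 - x)

def trajectory(x0, steps=30):
    """Iterate the map from an initial condition, recording the path actually taken."""
    xs = [x0]
    for _ in range(steps):
        xs.append(logistic_step(xs[-1]))
    return xs

if __name__ == "__main__":
    # Sweep a small neighborhood of the space of possible starting points.
    # Nearly identical initial conditions diverge, so the only way to learn
    # what the model can become is to run it.
    for x0 in (0.200, 0.201, 0.202):
        print(f"x0 = {x0:.3f} -> final state {trajectory(x0)[-1]:.4f}")
```

Nothing about this little model settles DeLanda’s philosophical claims; it simply shows the sense in which a simulation defines a thing by what it may become rather than by a fixed set of properties.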

In order to imagine a speculative realist philosophy, DeLanda effectively merges mediation and philosophy; he makes thought “synthetic.” This sounds contradictory at first. Mediation is human, after all. A medium is what sits between the world and us. Mark Hansen and W.J.T. Mitchell note the importance of the human in theorizing media, writing, “Before it becomes available to designate any technically specific form of mediation, linked to a concrete medium, media names an ontological condition of humanization – the constitutive operation of exteriorization and invention.”Footnote 74 But this definition of media is not as far from DeLanda’s approach as it may seem at first. DeLanda, too, is describing a “humanization” through invention. Simulations are a sort of translator: they can think without us, yet they are also ultimately our “inventions.”

There is good cause to be a little skeptical of DeLanda’s use of simulation. A few critics have noted how uncritically he treats it. Matthijs Kouw writes in a review of DeLanda’s book that simulation has more explanatory power for him than it does even for the sciences.Footnote 75 Could it be that he believes the virtual character of computer simulation is an effective homolog for his process ontology? Is he reducing reality to mere computation, just as some Cold War cyberneticists did?

There is also the issue of neglecting the way simulations often entail very specific ways of seeing the world. DeLanda uses examples from the R&D history of simulation in his book. He discusses cellular automata and Conway’s Game of Life, two examples of nonlinear simulation covered earlier in this chapter. Simulations were used in the making of nuclear bombs. They are used by management scientists and financial mathematicians to extract as much capital as possible. Simulations also have an appealing way of excluding anything you would like to exclude: a new way to sanitize the messiness of history and to manufacture epistemic authority.

Yet DeLanda is advocating for a speculative disposition. Any attempt to use simulation would be but an anecdote, an experiment, a fictionalization that could at best glimpse some aspect of the character of reality. More than anything, it confronts us with the limits of our understanding. Isabelle Stengers writes, “Computer simulations not only propose an advent of the fictional use of mathematics, they subvert equally the hierarchy between the purified phenomenon, (responding to the ideal intelligibility invented by the experimental representation), and anecdotal complications.”Footnote 76 Simulations, in other words, offer us unlooked-for things that might confound our understanding of the world. Simulation can exceed the settled, restrictive epistemology that the arts generally attribute to science and technology. If cinema can be thought of as an apparatus tied to a specific technical disposition regarding time, perspective, and so forth, the model-building activity of simulation presents the opportunity to rebuild the apparatus every time you use it. It is never tied to a specific way of seeing and the ideology that way of seeing might entail.

This theoretical approach to fictional uses of simulation could be applied to nonlinear animation, games, or other forms of generative computational art. The contingency of nonlinearity represents the threshold of our understanding; past this point, reality is beyond our control and beyond our ability to predict. Thinking about engineering as part of film production, focusing on “knowing how,” allows us to consider these possibilities. Yet, at the same time, the history of nonlinear simulation seems configured toward developing ways to control and contain unpredictable contingency, and thus to tame or compromise it. The R&D histories of computer science and nonlinear simulation show that these technologies developed within institutional and economic contexts that directed them toward certain uses. Understanding nonlinear animation means considering both of these conflicting sides. There is much more going on here than a definition of simulation as mere artificial fakery or a symptom of postmodernity can account for.