The past chapters have covered the way R&D has become a vital part of studio economics and how R&D, computer graphics, and nonlinear animation share a complex interlinked institutional history. Chapter 2 also proposed a theoretical framework for understanding nonlinear simulation as a form of animation that hinges on engineering speculative models, as a kind of R&D experiment. All of this points toward a convergence between technology development and animation. The question remains whether any of this is evident in production practices and studio organization over time, or whether R&D and animation production work have remained separate domains. The screen credits of contemporary animated and VFX-laden features have whole sections dedicated to explicitly technical staff, and Chap. 2 established how studios began to employ workers with titles like “R&D” and “Principal Graphic Scientist” in the 1980s. But all this technical and R&D activity could, in principle, have remained insulated from the rest of production, with artists clearly separated from technicians and researchers. This chapter will go through several examples showing that production did in fact become imbricated with R&D starting in the 1990s, leading to even more unusual screen credits like “R&D Artist.”

Understanding the role of R&D in production is fundamental for understanding changes in production labor since the introduction of digital technology. Scholars like Tony Tarantini, Hyejin Yoon, Edward J. Malecki, and Hye Jean Chung have noted the role digital technology has played in changes in training, hierarchies, and the internationalization of labor as part of broader post-Fordist trends.Footnote 1 Technology was not just an external force acting on these media industries, though; it was being produced by studios for specific purposes. Thus, it is important to understand how technology development has become integrated into production practices as an internal force of change.

As studios began hiring researchers, sponsoring research, creating connections with institutions, and participating in scholarly conferences in the 1980s, it makes sense that R&D would start to play a role in production work itself. At first this would have only been the case in very specialized, rarefied effects sequences, like Loren Carpenter and William Reeves’s work creating bespoke graphics for the “genesis sequence” in Star Trek II: The Wrath of Khan. Over time it would become more and more common for R&D to play a direct role in production. Technologies like the nonlinear animation work Carpenter and Reeves were doing further require us to rethink the relationship between animation and automation. They were making software in order to make an image. All nonlinear animation “FX” work since has, to a greater or lesser extent, followed this same logic. An FX artist makes animations by making simulations that produce unpredictable movement. They do this by choosing the right software and plug-ins and making them work together, writing scripts in a given program’s language, and manipulating the parameters of a simulation. All this work sits in a liminal space somewhere between animation production and technical work. Nonlinear animation thus provides a particularly good example of how R&D and production have overlapped.

The Cold War logic of technology development has also clearly influenced the general organization of VFX and animation “workflows” since the 1990s. Workflows have their roots in the concept of project management, which was first developed as a way to keep US nuclear missile programs ahead of their Soviet rivals. This idea was then carried further by sectors like the automotive industry as a way of organizing the development of new products, an approach called “product development.” The design of VFX and animation workflows is also heavily influenced by software development principles, especially a school of software development founded in 2001 and referred to as “agile,” which emphasizes flexibility, reconfiguration, iteration, and customization. These organizational paradigms for developing technology all played an important part in shaping development-oriented animation and VFX production workflows. Today the term “development,” often shortened to “dev,” is not just used to describe technical work like software development, R&D, or “tech dev,” but also production work like “look development” or “look dev,” which can refer to an iterative process of design refinement or a specialized job where artists shape lighting and rendering style.Footnote 2

The implementation of these workflow paradigms required the extensive engineering of the production “pipelines” that connect different production processes to each other and enable the “development” of an animated sequence. These pipelines are constantly being rebuilt for different projects, enabling agile development, and further blurring the line between production work and technical work.

These principles of flexibility and agility did not emerge in a vacuum. They have strong correspondences to post-Fordist flexible accumulation, and they have led to a highly precarious labor system that sees workers moving from one six-month contract to the next. These factors need to be considered together. The shift from large-scale Cold War federal R&D programs to private industries, the rise of flexible approaches to project management, and the political-economic turn toward deregulation and entrepreneurialism all go hand in hand. This chapter will address each of these points in turn, starting with the rise of workflows, project management, and software development, moving on to the correlated emergence of software pipelines, then to nonlinear animation practices, before finally reflecting on how the gradual disappearance of the line between engineering and production has affected labor and the construction of worker subjectivities.

Project Management, Product Development, and Agile Development

Since 1990, VFX and animation production have increasingly been defined by the dual concepts of workflow and pipeline. In the quotidian parlance of these industries, workflow and pipeline refer to all the work that needs to be done in order to achieve a final product. Although people in the industry often conflate these two terms, each has an important and distinct technical definition. The Visual Effects Society handbook defines a workflow as “a specific set of procedures and deliverables that defines a goal.”Footnote 3 Workflow describes each stage of a production, all the jobs that need to be done to ship the final product. Pipelines are the technical infrastructure of data exchange that makes workflows possible.

As many scholars have observed, major film studios, such as those in the Hollywood or Weimar film industries, operated like factories.Footnote 4 David Bordwell, Janet Staiger, and Kristin Thompson argue that factory-style management is a key component for understanding the classic Hollywood studio system, even its aesthetics.Footnote 5 The concept of workflow in animation and VFX is similarly borrowed from industrial management theory. But there are some important differences between the assembly-line style of twentieth-century studio film production and these more recent approaches to production. While the former focuses on building a regulated and reliable system for outputting one product after another, the latter treats each film as a discrete project. In a sense animation and VFX studios re-tool the factory for every film.

To understand this distinction in its simplest terms, consider two of the pioneers of industrial management: Frederick Taylor and Henry Gantt. While Taylor focused on the regularity and efficiency of factory production lines, Gantt focused on organizing the steps needed to complete a task. Gantt’s approach is illustrated quite clearly by the Gantt chart, a visual organizational tool still used today. In a Gantt chart a project manager maps out multiple parallel jobs along a grid with a time-based axis, carefully timing each job in order to avoid slowing subsequent jobs that will rely upon its completion. Animation and VFX production represent a trend away from the Taylorist approach and toward Gantt, a move toward treating production like building a skyscraper or a steamship and away from turning out a uniform product in vast quantities. Gantt’s approach is an early example of what would become project management.

The Project Management Institute defines a project as being “temporary … with defined scope and resources.”Footnote 6 Thus, project management does not apply to constant day-to-day operations. Project management is also intimately linked with R&D. It emerged as a term in the early Cold War, alongside cognate concepts like operations research and systems engineering, as one of three “approaches to big technology.”Footnote 7 Brigadier General Bernard Schriever came up with the now influential concept of project management out of necessity in the context of the nuclear arms race. He was responsible for the US military’s new Intercontinental Ballistic Missile program, which was under extreme pressure to stay ahead of Soviet aerospace and nuclear advances. Thus, Schriever began thinking about how to facilitate technological development as fast as possible. Schriever’s ideas became so influential that they caught the attention of Secretary of Defense Robert McNamara, who spread the principles of project management throughout NATO militaries. In the hands of Schriever, project management was not just a way of organizing labor for a specific project, it was a meta-technology: a technology for creating the conditions for technological advance. This goes a long way toward explaining why project management has become influential in animation and VFX: as the preceding chapters have established, studios in these industries have increasingly begun to support R&D and produce valuable technological properties.

In the 1980s project management was developed further by private industries outside of the military-industrial complex in fields such as automobile manufacturing and pharmaceuticals. From this work emerged the concept of “product development.” Product development is a version of project management that focuses specifically on creating a marketable product. One of the earliest and most influential examples of this was the Toyota Production System (TPS). TPS is an umbrella term for Toyota’s approach to management that includes such concepts as just-in-time logistics and total quality management. TPS was a key idea in product development because it considered the entire process of getting a product to market: from initial concept, research, and engineering, to manufacture and distribution. These dual concepts of R&D-focused project management and product-focused product development both provided the conceptual groundwork for changes in animation workflows in the 1990s and VFX workflows soon after.

The work of Steve Jobs is perhaps the most iconic example of technology-focused product development. Product development defined Jobs’ glorified return to Apple in 1997, which saw the company’s value grow almost a hundredfold thanks to products like the iPhone. Before his return to Apple, Jobs was the majority owner of Pixar, where he also practiced his now famous approach to product development. When he bought the company, he had intended to develop a product that would make 3D animation broadly accessible, like desktop publishing.Footnote 8 So focused was he on product development that he insisted on spending scarce resources on the distinctive sculpted granite design of the Pixar Image Computer P-II.

Long-time Pixar president Ed Catmull was also heavily influenced by Toyota’s TPS philosophy.Footnote 9 Some studios might hesitate to refer to their movies as products in public, but Catmull wears his product development mindset on his sleeve. In an article for the Harvard Business Review he writes,

People tend to think of creativity as a mysterious solo act, and they typically reduce products to a single idea: This is a movie about toys, or dinosaurs, or love, they’ll say. However, in filmmaking and many other kinds of complex product development, creativity involves a large number of people from different disciplines working effectively together to solve a great many problems. The initial idea for the movie—what people in the movie business call “the high concept”—is merely one step in a long, arduous process that takes four to five years.Footnote 10

In some respects, the product development mindset has become ubiquitous in large-scale film production. The high-concept film, which integrates planning for ancillary markets, revenue streams, and corporate synergy, is a commonplace of conglomerated Hollywood. Yet Catmull is identifying another aspect of the product development mindset here. He is trying to communicate how Toyota-like his approach is. He is thinking about the process, from the original idea, all the way through every step of production. He is talking about refining the film, about developing it.

A handbook on contemporary animation and VFX workflows and pipelines reads, “the main difference between factory goods and art is that art goes through a review and refining process.”Footnote 11 The point they are making is that animation and VFX production are a development process, just as Catmull says. Indeed, some management researchers have singled out these industries as examples for studying what they refer to as the “theory of managing creativity-intensive processes” (TMCP). Researcher Stefan Seidel, for example, has studied the VFX industry as a model for TMCP because of how “process aware” its production processes are.Footnote 12

Project management and product development concepts from industries like consumer electronics and automotive have clearly had an influence on the animation and VFX industries. And this influence gives us a hint as to how principles for developing technologies and products have shaped these media industries. But these fields pale in comparison to the greatest single influence: software development. Software development might seem like a self-evident concept. Isn’t all programming software development? In fact, the concept has only been around since the 1980s, and it represents perhaps the most broadly influential application of project management to date. During the 1960s, 70s, and 80s, the software industry was undergoing what is now referred to as the “software crisis,” where projects had an alarming tendency to go over budget and to under-deliver. As software engineering histories tell the story now, the problem was the lack of a conventional process for how to build something. What steps do you take? What are the best practices? Project management offered a way to make software-building more systematized and rigorous, like engineering. This was also the period when the term “software engineering” became popular as a way of describing an approach to programming that was rigorous and accountable. The dominant software development approach that emerged from this era is now referred to as “waterfall.” Waterfall consists of discrete stages, each of which must be completed in turn: requirement analysis, design, development, testing, and release.

The waterfall development model puts emphasis on establishing the requirements of the client before getting into the detailed design stage. This ensures that a team does not spend countless hours and multitudinous resources building a product that does not do what the client needs. While this had clear benefits, it was not long before this gradual, careful approach to software development was at odds with the demands of private industry. In the 1990s, criticism began to mount, and the key complaint was that the world simply moved too quickly for this approach. While waterfall was effective at producing a refined piece of software that did exactly what was needed, over its long development process “what was needed” could change. Thus, aerospace engineer Jon Kern and sixteen other engineers and programmers met in 2001 to write a manifesto for a new approach to development: The Agile Manifesto.Footnote 13 Agile software development is focused on flexibility and responding to change. The idea is to get a product into a user’s hands as quickly as possible, then to respond to the ongoing needs of the user through successive iterations. This is an approach to engineering steeped in the Silicon Valley ideology of entrepreneurial disruption. Rather than publishing a whitepaper at a stuffy conference, these engineers wrote a “manifesto” and published it on a website. The principles of responsive, reconfigurable flexibility that The Agile Manifesto espouses define most contemporary private software development.

One can see the influence of this way of thinking in recent changes in how software products are sold. Not so long ago, when you purchased a copy of Microsoft Office or the latest game, you brought it home, installed it on your computer, and that was the end of it. In the late 1990s, ubiquitous connectivity meant that software companies could push revisions, in the form of “updates,” over the internet. This was a key tool in solving problems like the Y2K bug, but it also opened the door to an agile approach to product release, where the version available on day one was not necessarily the final product. Today, more and more software products offered by companies like Adobe and Microsoft are shifting to a model where customers pay a monthly subscription for an ever-evolving product, known as “software as a service.” This has economic advantages to be sure. Requiring constant connection to a server is a great way to combat piracy, and this model forces customers to buy the latest editions, rather than sticking with what they already have. One might expect that agile development would be incompatible with animation production, but in fact it has been transforming it for quite some time.

Pixar has made agile principles a key element of its public-facing management philosophy since the earliest days of agile development discourse, avant la lettre. One of the studio’s favorite promotional anecdotes about the production of Toy Story 2 communicates their uncompromising commitment to product development and their flexible and responsive workflows. Due to their distribution agreement with Disney, Pixar had to make a sequel to Toy Story (1995). This led to the studio running multiple projects at the same time. Because of this increased level of activity, Toy Story 2 (1999) reached a high level of completion before key decision maker John Lasseter had fully scrutinized it.Footnote 14 When he finally did, he decided it needed re-working, and the studio made extensive revisions despite being far along in production. Rather than planning a single vision of the film at the beginning and seeing it through to the end, Pixar made a product part of the way, tested it, found it wanting, and went back to the drawing board. They were willing to iterate and revise. This story likely communicates their self-image more than actual practices, but it is still revealing. The point is to demonstrate the importance of building flexible structures that allow for iteration and revision, to build a workflow where it is possible to make changes at a late stage. These values have spread far and wide in the animation industry, and, significantly, in the VFX industry. This software development logic that Pixar championed can be observed spreading throughout the VFX industry in the 2000s in the design of its production workflows.

Organizing Production Workflows

VFX workflows both show how ingrained the logic of “dev” has become in VFX production since the early 2000s and, through their intimate link to pipelines, demonstrate how building connective infrastructure has become a fundamental part of production over the same period. The nature of VFX workflows has unquestionably been influenced by Pixar’s early example, but there were a variety of factors working in concert. As more people with software engineering and computer science backgrounds entered VFX studios, they brought these ideas with them. Even more importantly, the flexibility of these agile development principles responded to challenges VFX studios were facing as vendors who must competitively bid for studio contracts. Agility offered a way of living with the unpredictable demands of film studio clients in what was becoming a ruthlessly competitive industry. Thus, VFX studios’ use of agile principles is also a product of neoliberal economic conditions.

Since at least the early 2000s the VFX industry has been defined by an ultra-competitive bidding process.Footnote 15 This process starts with the film studio assembling a short list of VFX studios based on existing relationships, reputation, experience, and the VFX studio’s show reel.Footnote 16 This is a process that one VFX producer’s handbook in 2010 likened to casting actors: certain vendors are suited to certain roles, and a studio-side VFX supervisor can judge their fit based on their past work.Footnote 17 Once the studio has established a short list of prospective VFX studios, it will ask for competitive bids from the vendors. Tax incentives have more recently become an important factor in bidding. A well-organized film production will have a plan for what local tax breaks they are hoping to benefit from, and the VFX vendor will have to be able to commit to employing a certain number of workers in a certain city.Footnote 18 All of this adds to the complexity of the interaction between the film studio and its VFX vendors, and as a result to the need for flexible development workflows.

The combination of competitive bidding with the implicit need to respond to changing demands has been a key point of contention within the VFX industry. When Rhythm and Hues famously went bankrupt after the overages of Life of Pi (2012), many workers and VFX studios rallied around this complaint. More recently, the constant revision of the vendor’s bid has become baked into the production process. Now VFX studios have a department that calculates and updates their costs with every new unforeseen development and challenge. The industry has, in other words, resigned itself to the reality of constantly changing demands and has developed more agile procedures to deal with it.Footnote 19

Contracts between studios and VFX vendors will specify budgets and also delivery “turnover” dates, the specific dates when the VFX vendor will turn over their finished work.Footnote 20 In the 2010s, though, it became more common for studios to use early work for promotional purposes.Footnote 21 These sequences may not be the same as the final product that appears in theaters. Close analysis of trailers and features reveals how different they can be. For example, if you compare the first trailer for Guardians of the Galaxy (2014), a film whose visual effects were produced in large part by the British studio Moving Picture Company (MPC), with the final version shown in theaters, you can see a great many differences.Footnote 22 The fact that parts of the film are completed and then revised demonstrates agile development thinking.

Once the studio and VFX vendor establish what they need for each shot, the VFX vendor can begin building the organizational infrastructure it will need for the job. While much of the in-house infrastructure, such as office space and workstations, will likely be the same from project to project, the vendor will need to arrange many things before a project can start. For starters, it may sub-contract certain jobs to other VFX studios. At the very least it will need to hire workers on project-specific six-month contracts. Since the 2000s the norm has been for the number of workers to generally follow a bell curve, with few workers staying on from the very beginning to the very end.Footnote 23 As one VFX producer’s manual writes, a VFX unit “may spring into life almost any time during production or postproduction. Its life may be as short as a mayfly … or it may last several months.”Footnote 24

Even certain hardware infrastructure that was formerly in-house became more flexible and agile-friendly in the 2010s. While VFX and animation studios used to have vast cutting-edge server farms, now they can use cloud-based services like Amazon Web Services (AWS). In 2011 VFX studio Zero VFX developed a cloud-based rendering tool tailor-made for the industry called Zync. In 2014 Google purchased Zync and integrated it into its Google Cloud Platform. This service is noteworthy because it even takes over some of the software needs of VFX studios. You simply send them a V-Ray or RenderMan project and they do the rest. The homepage for Zync reads: “Two things continue to be true in visual effects and rendering projects: schedules fluctuate, and the effort to get to final remains impossible to predict.”Footnote 25 Thus, even the basic infrastructure of VFX studios is becoming profoundly reconfigurable and reprogrammable as a way of responding to uncertainty.

It is the job of several workers to manage all the unexpected changes coming from the film studio and to facilitate the flexible flow of content from shoots into the VFX workflow. Prime amongst these is the VFX supervisor. In the late 1990s and early 2000s the role of the VFX supervisor started to become more deeply integrated into film production. Going far beyond simply sourcing the “plates” (live-action footage) they would need from film shoots, supervisors started to become more involved in planning shots and solving on-site technical problems.Footnote 26 Indeed, by 2010 it was not uncommon for VFX supervisors to work as second unit directors.Footnote 27

The vast majority of VFX shots contain at least some content shot on a set or location, and since the 2000s the variety and amount of data gathered seem to have steadily increased. The VFX supervisor and coordinators oversee this work, done by “data collectors” and “data wranglers.”Footnote 28 Plates will contain some information that needs to be kept, for example, an actress’s performance, and some that needs to be removed and replaced by a composited effect, for example, wires from a special effects sequence. In addition to this, data collectors will gather information about the shoot like the lenses used, frame counts, file format info, and pictures of sets and locations.Footnote 29 It has also become the norm since 1999 to record ambient lighting using some sort of HDRI system.Footnote 30 Additional data that a VFX vendor might gather today includes performance capture data, light detection and ranging (LIDAR) volumetric scans of sets, and even volumetric scans of performers and props.Footnote 31

A production photo of any contemporary Hollywood blockbuster will reveal just how prolific these forms of data collection have become. It is not uncommon to see a performer in a green leotard covered in motion capture tracking points, holding a green mandrill (stand-in object) in front of a green screen set. Though many of these processes have become routine, the types of assets and content a VFX vendor must manage are going to vary from project to project. They also are not going to arrive exactly when the VFX artists need them. This, again, requires an agile approach to project management. The vendor will have to be ready to receive these different types of materials, manage them, and integrate them into their production workflow. The VFX workflow is thus a complex, integrated, parallel process custom designed for each project to be able to respond to the changing demands of a client studio. It also involves a substantial amount of technological configuration for every project to respond to these contingencies. The software development and project management concepts used in designing this system model film production as a development task.

VFX workflows have developed in complexity and flexibility since the early 2000s and they have very recently reached a point of agile development that challenges some of our most basic assumptions about film production. Movie “development” has in fact already begun to resemble the “software as a service” models of Microsoft and Adobe. This may sound far-fetched at first. Even if workflows involve extensive iteration and change, in the end a movie is a movie. One would assume you cannot send a movie to audiences, find out if it suits their needs, then go back to the drawing board. Recent events have demonstrated otherwise. In 2019 Paramount Pictures released its first trailer for Sonic the Hedgehog (2020), a VFX-laden family film based on the character from the Sega video game series. The trailer received extensive negative reactions from the public, who hated the design of the titular character. This turned into a promotional nightmare for Paramount, as negative reactions on social media became such a phenomenon that they were picked up by the press.Footnote 32 Paramount responded by having its main VFX vendor MPC revise the character model and re-do the animation for the entire film. This pushed the release date past the precious holiday season, but it turned a promotional disaster into a success, as the revision, which responded to the public’s complaints, garnered further media attention. Paramount tested its product with the public and revised it.

During the same release season there was a second case that pushed the logic of agile development even further than Sonic the Hedgehog. The film adaptation of the popular musical Cats had a similar negative reaction from the public, though on an even greater scale. The production company Working Title had its VFX contractors, including MPC, tweak their animation in response, in a production scramble that, according to the film’s director, saw them finishing editing the day before the release after thirty-six hours of constant work.Footnote 33 These last-minute revisions introduced some new mistakes, though, including unfinished animations. The studio then re-uploaded a second, fixed version of the film a day later through digital distribution to theaters. The viewers who went to see Cats opening weekend thus saw a different film than those who saw it after. Clearly films have been released in different versions before. Even before special edition DVDs and director’s cuts, films were being redubbed for international audiences. But Working Title’s use of digital distribution and its rapid release of fixes bear a striking resemblance to the agile distribution of software, suggesting this logic is continuing to spread.

The concept of development was born in the Cold War R&D complex as a way of outpacing other nations’ technological advance. From there it was adapted to developing consumer products and software, and over time these organizational paradigms began to turn toward principles of flexibility and constant product revision. Animation and VFX have been using and contributing to these concepts since the 1990s, and VFX management has seen a particular intensification of these concepts in the 2000s as part of a post-Fordist response to the contract and bidding system. This discursive shift toward seeing media production and creative practice as “dev” has produced a situation where constant reconfiguration is the norm in VFX, and that reconfiguration entails constant engineering as an integrated part of production. This emphasis on constant redesign is also clearly at work in production pipelines.

Connecting Production Pipelines

The structure and organization of a workflow would be impossible without a technical infrastructure. This is where the production pipeline comes in. To borrow the phraseology of Bruno Latour, pipelines are workflows “made durable.” Pipelines are mutable; they change with every project, and building a pipeline has become a major task on any large project.Footnote 34 VFX and animation pipelines are the reprogrammable infrastructure that allows workflows to be flexible and enables collaboration between different departments and workers through the exchange of assets, while also facilitating creative control on large-scale projects. In the case of large animation studios, the control is generally in-house, while in the case of VFX some of the direction comes from a film studio. These workflows demand constant technological change. This is why VFX and animation studios are in a constant state of developing and reconfiguring their tools and infrastructures, and this is why people in the industry so often conflate workflows with pipelines; you really cannot have one without the other.

A pipeline facilitates the exchange of data by connecting the outputs from jobs to the inputs of other jobs. In other words, it allows workers to share assets between different departments. Once again, the industrial production line is a useful metaphor here: at its simplest, the pipeline is like a conveyor belt, moving the product from one department to another and spitting out a finished product at the end. However, since the 1990s VFX and animation production pipelines have become far from simple or linear. Instead, they are like a conveyor belt that has numerous convergences and bifurcations that engineers can divert and reprogram.Footnote 35 Every project also has particular challenges that require a specific combination of software, plug-ins, and workers, and the pipeline facilitates the integration of these parts. Programming and reprogramming such a flexible production infrastructure entails a great deal of technical work. This technical work becomes difficult to distinguish from production work, as the selection, customization, and interconnecting of different tools is both the work of artists and of technical staff.
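To make the idea of connecting outputs to inputs concrete, the sketch below models a pipeline as a small dependency graph of departments, each downstream job consuming the published assets of the jobs it lists as its inputs. It is a minimal illustration written in Python; the department names and the use of the standard-library graphlib module are my own illustrative assumptions, not a description of any studio’s actual pipeline code.

```python
# A minimal, hypothetical sketch of a pipeline as a dependency graph:
# each job consumes the published outputs of the jobs listed as its inputs.
from graphlib import TopologicalSorter

jobs = {
    "modeling": set(),
    "rigging": {"modeling"},
    "animation": {"rigging"},
    "fx": {"animation"},               # simulations run on cached animation
    "lighting": {"animation", "fx"},
    "compositing": {"lighting"},
}

# The pipeline resolves an order in which assets can flow downstream,
# even though in practice many of these stages overlap and iterate.
for stage in TopologicalSorter(jobs).static_order():
    print(f"publish assets from: {stage}")
```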

An important part of pre-production is figuring out what software is needed to make a given sequence. As Chap. 3 showed, this can entail developing new software from the ground up, but since the 2000s it more commonly entails choosing the right off-the-shelf software for the job. Once the pieces of software are chosen, it falls on the pipeline TDs (technical directors) to connect them to the pipeline and do any necessary customization.Footnote 36 Sometimes software companies design their products to work with other programs. For example, there are many programs that are designed to work with Autodesk’s Maya, because Maya is the central hub of most 3D animation work. Houdini, with its nodal workflow design, serves a similar hub function for nonlinear animation “FX.” Sometimes, though, a job will necessitate bringing together pieces of software that were not designed to be connected. In these cases, TDs and engineers may need to transcode file formats and protocols and deal with all the subtle problems that arise from using custom scripts and programs. The construction of this connective infrastructure of the pipeline is mostly the work of TDs on feature-film-sized projects. These are workers with extensive coding experience in different programming and scripting languages. Sometimes TDs are people who have worked in the industry long enough as artists to intimately understand the inner workings of popular software, but they can also be people with computer science backgrounds.Footnote 37 Thus, there is a certain ambiguity between technical and production experience.

When you consider the complex and interweaving nature of workflows, you can begin to imagine how difficult it is to build this connective infrastructure. A job may require inputs or assets from multiple other jobs, and the output of that work may in turn go out to multiple other workers. Since at least the late 2000s it has been standard practice to have several artists working on the same assets simultaneously.Footnote 38 The metaphor of making a large building like a skyscraper is particularly apt here. As an animation pipeline manual puts it, “it is not uncommon for a single creature in a VFX movie to comprise hundreds, if not thousands, of individual assets that must be assembled to generate a working render.”Footnote 39 One can imagine scores of workers assembling a life-sized model of King Kong or a tyrannosaurus with cranes and scaffolds, with a hand or foot being delivered on a flatbed truck, like some sort of monumental construction project. All of this intricate coordination is done in the name of producing a project as quickly and efficiently as possible, bringing a technological project (a film sequence) to market on time for a scheduled summer or holiday season release. While large projects require extensive planning of these complex pipelines, even small-scale contemporary jobs require artists and technicians to thoughtfully plan their production pipeline.Footnote 40

Allowing multiple parties from different studios to access or modify an asset like a character model in a hectic scramble is, of course, a recipe for conflict. Several technical features of the pipeline are directed toward managing these potential conflicts. The first and most important technology for organizing the inputs and outputs of different jobs is digital asset management (DAM) software. DAM software was first implemented in the 1990s in television for twenty-four-hour news stations that had large collections of footage that they needed to be able to access quickly. In the case of VFX, the key function of DAM software is keeping track of versions and editing permissions, allowing many people to work with the same assets. As one paper on the subject from 2010 states, “Traditional DAM platforms are not even a consideration when it comes to providing workflow solutions in the digital media industries” because of the volume of data and the complexity of workflows entailed.Footnote 41 Other techniques that facilitate simultaneous work include the use of “placeholder assets” and low-level-of-detail assets.Footnote 42 This asynchronous approach to production became so elaborate in the 2010s that it enabled some forms of “virtual production,” where filmmakers could see low-detail previews on set in real time.Footnote 43
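As a rough illustration of the version-tracking role such software plays, the sketch below treats every publish of an asset as an immutable version that downstream artists pin to, with a placeholder standing in before anything has been published. The record structure, field names, and paths are hypothetical rather than a real DAM schema.

```python
from dataclasses import dataclass, field

@dataclass
class Asset:
    """A hypothetical DAM record: published versions are never overwritten."""
    name: str
    versions: list = field(default_factory=list)  # paths of published files

    def publish(self, path):
        self.versions.append(path)
        return len(self.versions)                 # new version number

    def resolve(self, version=None):
        # Downstream work pins a version; before any publish exists,
        # a low-detail placeholder stands in so parallel work can continue.
        if not self.versions:
            return f"/placeholders/{self.name}_proxy.abc"
        return self.versions[(version or len(self.versions)) - 1]

creature = Asset("creature_body")
creature.publish("/show/assets/creature_body_v001.abc")
v2 = creature.publish("/show/assets/creature_body_v002.abc")
print(creature.resolve(v2))   # lighting pins v002; another artist may keep v001
```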

The complexity brought on by having many people work with the same content simultaneously is compounded further by the iterative development approach to creative control, which sees creation as a process of refinement. It is easy to imagine how late changes happen on VFX projects. The above-the-line studio workers likely have a clear vision of what they want, but often they may not be technically versed, or able to communicate that vision in VFX terms. Industry manuals and best-practices guides acknowledge that late revisions are an inevitable issue.Footnote 44 The idea of refining the product over time is also a core principle of the development mindset, as Pixar’s story about the development of Toy Story 2 in the late 1990s demonstrates. Imagine assembling an intricate sequence filled with hundreds of layered elements and having a director ask for one basic element to be changed. Pipeline design must be flexible enough to accommodate this approach to workflow. The integration of different elements is designed so that artists in one department can go back and change a single element without it adversely affecting all the cascading subsequent work that relies on it.Footnote 45

Some of the challenges that workflow and pipeline design deal with are not new to film production. A classic Hollywood studio film was a large project that required the labor of scores of specialists. It seems obvious that studio film productions throughout history must have employed some kind of workflow planning, even if it did not follow the concepts later formalized by project management or its offshoots. A review of major industry journals over the past hundred years using Mediahistoryproject.org’s “Lantern” reveals little evidence of theorizing these challenges. Yet there is broad scholarly consensus that studios followed a Fordist factory model of efficient, regulated output.Footnote 46 Thus, what really differentiates contemporary approaches to production is their zeal for the post-Fordist efficiencies and potential competitive advantages offered by development-minded project management approaches like product development and agile software development.

This quest for agility required that technological development be a part of production. Being able to re-program the pipeline for every different project allows for flexible workflows. VFX studios re-fit the factory for every job, even during the job. It is true that some things stay the same. VFX studios employ some full-time staff, and there are permanent buildings, workstations, networks, servers, and so forth. This is the stuff of first-order infrastructure. Even these forms of solidity, though, are evaporating through trends like increased sub-contracting and cloud-based rendering. As film production becomes more agile it is likely to resemble software development more and more.

All this flexible project management has of course had immense and mostly negative effects on labor practices in the film industry. As John Caldwell and others have noted, the expanded role of VFX and general post-production has destabilized many traditional production labor roles.Footnote 47 The intricate and flexible way VFX studios connect their workflows to film studios has accelerated the shift away from the once-dominant studio system toward a competitive bidding system, which has in turn eroded labor unions.Footnote 48 This is connected to what Toby Miller refers to as the “new international division of cultural labour,” where international cities like Vancouver, Toronto, and London compete with ever-increasing tax incentives to lure studios.Footnote 49 Hye Jean Chung notes how the “nonlinear” nature of VFX pipelines facilitates this internationalization trend.Footnote 50 As Michael Curtin and John Vanderhoef write, many simple VFX tasks like wire removal can be done by “a couple guys in a garage in Van Nuys or a small shop in Chennai.”Footnote 51 Although studios often cast agile workflows as a feature of their commitment to producing a refined final product, or technological advance for its own sake, these approaches to production organization are inseparable from race-to-the-bottom political-economic and employment practices.

Scripts, Plug-ins, and Programs for Nonlinear Animation

If workflows and pipelines demonstrate a pervasive trend toward using “development” principles that collapse production and R&D, the particularity of nonlinear animation production is an intensified case of these same trends. Nonlinear animation is constructed by studios as a special type of production that entails the deep integration of technical work. If you look at a flow chart of VFX or animation workflows that include things like modeling, rigging, lighting, and rendering, nonlinear animation has its own special branch, often referred to as “FX” or as “technical animation.”Footnote 52 These FX departments do not make animations; they make simulations that make animations. Getting a certain phenomenon to look a certain way, the gathering of a character’s clothing, for example, or the splash of a turbulent sea, can require buying new software, writing new scripts, developing new plug-ins, or even writing new simulation software from scratch. All this work is done to build a technical apparatus for automated animation.

Nonlinear animations consist of technological and organizational configurations designed to manage unpredictability, just like workflows and pipelines. As Chap. 2 established, the genealogy of nonlinear animation is rooted in attempts to predict and manage unpredictable systems like the weather or financial markets. A closer look at this form of animation reveals how FX artists and TDs build technical apparatuses to enable flexible and reprogrammable control, and how the work of animating and engineering has been collapsed into a single undifferentiated “dev” task. Indeed, the jobs of “Senior FX artist” and FX TD are practically interchangeable.Footnote 53 This suggests that while there are still official divisions between technical and artistic work, in practice they are one and the same.

As with other VFX and animation tasks, the first step in nonlinear animation preproduction is planning exactly what software, people, and infrastructure a studio will need to achieve the desired look for a shot. In the late 1990s and early 2000s high-quality nonlinear animations were relatively expensive to produce and required more basic technological development. For situations where there was only one brief shot, it may have been easier to simply fake it using composited libraries of footage.Footnote 54 As software got better over the course of the 2000s and 2010s, a spectrum generally emerged: lower-budget projects could be handled with a combination of off-the-shelf software and minimal customization, while more spectacular or photorealistic high-budget projects involved high levels of customization and the building of new software.

Throughout most of the 2010s every nonlinear effect in a feature film or TV show would have been made up of several different effects combined. For example, animating a stormy ocean required animating the larger-scale flow of waves, the smaller-scale turbulence and splashes, the foam breaking off the waves, wind effects, and so forth. All of these are specific simulations in their own right. FX artists refer to this combination of effects as the master FX recipe.Footnote 55 SideFX’s Houdini was, and is, the most popular core software for building an FX recipe because of its nodal pipeline design.Footnote 56 Starting in 2009, SideFX also began making its own collection of nonlinear animation effects with fluid, particle, rigid-body dynamics, fur, cloth, fire, and smoke solvers that work natively in Houdini. A low-budget FX job in the 2010s might only call for a one-stop-shop suite like this, and indeed such solutions have since become quite acceptable to use on most projects. Using off-the-shelf suites like the one sold by SideFX dramatically cuts costs. Buying new software is expensive, not just because it needs to be built into the pipeline, but also because workers will need to be trained on it.Footnote 57
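The sketch below illustrates, in the abstract, what a master FX recipe amounts to: a list of layered simulation passes, some of which feed on the caches produced by others. The solver names, parameters, and cache paths are illustrative stand-ins of my own rather than Houdini’s actual node names.

```python
def simulate(layer, source=None):
    # Stand-in for dispatching the layer to its solver; returns a cache path.
    return f"/caches/{layer['pass']}_v001.bgeo"

ocean_recipe = [
    {"pass": "swell",      "solver": "ocean_spectrum", "resolution": 512},
    {"pass": "splashes",   "solver": "flip_fluid",     "particle_separation": 0.05},
    {"pass": "whitewater", "solver": "whitewater",     "source": "splashes"},
    {"pass": "mist",       "solver": "particles",      "source": "splashes"},
]

caches = {}
for layer in ocean_recipe:
    upstream = caches.get(layer.get("source"))   # one pass may feed on another
    caches[layer["pass"]] = simulate(layer, upstream)
print(caches)
```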

The next step up in complexity and cost for building an FX recipe would entail sourcing different third-party plug-ins to achieve a more customized or photorealistic look. Plug-ins range from being relatively simple tweaks to being sophisticated nonlinear physics simulations. They might add a certain kind of spray to ocean waves, for example. Anyone who has used an internet browser or word processor should have some basic understanding of what a plug-in is, but it is worth taking a moment to consider the definition. A plug-in is a kind of modification that adds functionality to a piece of software. The difference between software and plug-ins is that a software program can run on its own, without being built into something else. Without the framework to accommodate plug-ins, the modification of software would be difficult, and in some cases illegal. Software like Houdini is designed to be as flexible as possible because there are so many different possible modifications for different jobs.Footnote 58 The more readily these programs can accommodate plug-ins, the less labor needs to go into building the pipeline and therefore less money needs to be spent. Many software companies make their products plug-in friendly because plug-ins allow third parties to expand the functionality of the software and thus drive more consumption of the core product. This is, again, an example of how agile workflows are facilitated by flexible and reconfigurable technical infrastructures.
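The pattern at stake here can be sketched in a few lines: the host application exposes an extension point, and a third-party plug-in registers a new effect against it without the host being rewritten. The registry and the solver below are hypothetical constructions for illustration, not any vendor’s actual plug-in API.

```python
SOLVER_REGISTRY = {}

def register_solver(name):
    """Decorator a plug-in uses to expose a new effect to the host program."""
    def wrap(cls):
        SOLVER_REGISTRY[name] = cls
        return cls
    return wrap

# --- code that would ship inside a third-party plug-in ---------------------
@register_solver("ocean_spray")
class OceanSpraySolver:
    def __init__(self, intensity=1.0):
        self.intensity = intensity

    def step(self, frame):
        return f"spray particles for frame {frame} (intensity {self.intensity})"

# --- the host looks the new effect up by name; nothing else changes --------
solver = SOLVER_REGISTRY["ocean_spray"](intensity=2.5)
print(solver.step(1001))
```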

Programs like Houdini can also connect to other independent simulation programs, such as Next Limit Technologies’ RealFlow. When different programs do not work well together, the FX TDs must build their own custom pipeline infrastructure. TDs refer to this as writing “glue code.” The more customized the job is, the more elaborate and customized the pipeline infrastructure will be. Thus, the logic of plug-ins is intimately linked to that of pipelines.
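“Glue code” is often nothing more glamorous than a conversion step that lets one program’s output be read as another program’s input. The sketch below, whose file layout, units, and axis conventions are hypothetical, converts a particle cache from a centimetre, Z-up convention to a metre, Y-up one.

```python
import csv

def convert_particles(src_path, dst_path, scale=0.01):
    """Re-write a particle cache: centimetres to metres, swap the up axis."""
    with open(src_path, newline="") as src, open(dst_path, "w", newline="") as dst:
        reader = csv.DictReader(src)                     # expects columns x, y, z
        writer = csv.DictWriter(dst, fieldnames=["x", "y", "z"])
        writer.writeheader()
        for row in reader:
            writer.writerow({
                "x": float(row["x"]) * scale,
                "y": float(row["z"]) * scale,            # source data is Z-up
                "z": float(row["y"]) * scale,
            })

# convert_particles("fluid_export.csv", "converted_for_import.csv")
```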

The next step in technical complexity beyond employing plug-ins is writing scripts. This is the sort of thing done by the more experienced FX artists and FX TDs. Much like plug-ins, scripts can only run within a program. By contrast, a program runs on its own. In other words, programming is writing instructions for the computer, while scripting is writing instructions for a specific program. Being able to write scripts requires understanding the language a program uses. For example, Autodesk Maya’s script editor uses its MEL scripting language, but in 2007 Autodesk added support for Python, which is far more widely known. Artists and TDs might use scripts to automate something to improve work efficiency, like combining several repetitive jobs into a batch to eliminate the need to do them one by one. This sort of efficiency work is all done in the name of minimizing the number of clicks an artist must make to do their work. Thus, work is done faster, or with fewer people, and profits are maximized.Footnote 59 But scripts can also be used to manipulate the automation of nonlinear animations, accessing a level of customization not available through the graphical user interface.
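As a hedged illustration of this kind of efficiency scripting, the fragment below uses Maya’s Python module (maya.cmds) to apply one change to every mesh in a scene at once rather than clicking through objects one by one. It only runs inside Maya, and the particular attribute chosen is merely an example of a repetitive task being batched.

```python
import maya.cmds as cmds  # available only inside Autodesk Maya

def batch_set_smooth_level(level=2):
    """Set the same smooth-preview level on every mesh in the scene."""
    for shape in cmds.ls(type="mesh", long=True):
        # Guard against nodes that lack the attribute before setting it.
        if cmds.attributeQuery("smoothLevel", node=shape, exists=True):
            cmds.setAttr(shape + ".smoothLevel", level)

batch_set_smooth_level(3)
```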

The distinction between programming and scripting is important to understand, because while script writing is a common practice, programming work is generally only done at the largest studios. As one TD and former FX artist told me, “Modification of scripts or creation of plug-in is pretty usual. Software change requires foresight about what your need will be in the future.”Footnote 60 Script writing also demonstrates that the line between developing tools and using tools is blurry. At a certain point, the quotidian work of script writing becomes so complex that it becomes an entire plug-in.Footnote 61 And in essence every customization of software is technology development. This blurriness is reflected in labor roles, as in the case of the interchangeability of TD and senior FX artist titles. One might expect there to be a strict division between technical and artistic roles, but this is clearly not the case. Making an image and developing a technology are indistinguishable.

These blurry lines notwithstanding, the scale of tool development clearly tracks closely with the size of VFX and animation studios and their projects. Developing software from the ground up requires immense foresight, planning, and resources.Footnote 62 The largest studios do the most fundamental technology development. As one FX TD explained, they do this because of the “immediacy and customizability” provided by in-house software.Footnote 63 Having the people who made the software down the hall makes service immediate and makes getting the exact image the director wants easier. Furthermore, as Chap. 3 noted, there are immense strategic and economic advantages to developing and owning proprietary technologies.

Sometimes software companies themselves offer custom services. There are also some VFX studios that specialize in just one type of effect and even one piece of simulation software. These studios defy categorization as either software developers or production studios. The best example of this is Fusion CI Studios, a Vancouver-based company founded in 2004 that specializes in RealFlow software. Fusion CI does extensive R&D work and developed its own fluid simulation tool, called Smorganic, which operates within RealFlow and specializes in animating the ultra-thin sheets fluids form when they splash. Fusion CI models itself as a “plug and play” company, which can be brought on for a specific job, bringing its own artists and technicians, and attaching itself to the greater VFX pipeline and workflow. This approach makes economic sense for studios that have important fluid simulation scenes to do, but do not have the operational scale to justify, or indeed fund, extensive R&D. Fusion CI’s hybrid role once again demonstrates how indistinguishable technology development and animation are in nonlinear animation, and how modular and reprogrammable production workflows can be.

As a way of demonstrating how even the most basic nonlinear animation blurs the line between animation production and technical work, I will describe a hypothetical case based on my own practice learning how to make fluid simulations and experimenting with different pieces of software. A case like this one demonstrates the conflation of image making and technological development just as well as a large-scale project, not because of the particular combination of custom software or coding it involves, but because of the way the user controls animated phenomena through the manipulation of parameters. The user (an FX artist or a rank amateur like myself) builds a flexible technological apparatus to manage unpredictability the same way a complex pipeline does. For this example, I will use the 3D animation software Maya, a particle-rendering plug-in for Maya called Krakatoa, the compositing program Nuke, and RealFlow, a fluid simulation program that outputs to Maya using a plug-in. This example may not represent the most cutting-edge work done at large VFX or animation studios, but it does offer a basic and general account of what using this type of software is like.

Our hypothetical nonlinear animation job starts in RealFlow, where the artist makes a particle-based simulation of a fluid. The first steps will likely involve putting in any boundaries, containers, or objects that the fluid might splash off of. Next, the artist inserts the fluid, either as something already present or as something flowing out of what is called an emitter, like a pipe or an overflowing bathtub. The FX artist can alter the size, direction, and amount of flow from an emitter by changing different values either in a script or more likely in a tool-specific user-interface window. At this point the artist can insert different forces into the fluid over a timeline, which will cause perturbations, vortices, and movement. They can also potentially add random noise, using a stochastic algorithm to make the movement more interesting and naturalistic. They might also adjust the force of gravity. At this stage, the artist can also change the properties of the fluid, such as the vorticity (how many swirls the fluid forms) or the viscosity (how thick the fluid is). All of this is done by changing the value of a given modifier. It is important to emphasize here that these are all pre-programmed conditions. The artist cannot directly shape the fluid, but instead manipulates parameters. With any adjustment they will have to run a low-level-of-detail simulation to see what the outcomes of these conditions will be.
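To underline the point that the artist manipulates conditions rather than outcomes, the sketch below expresses a scene as nothing but parameters and runs a toy low-detail preview over them. The parameter names echo the ones discussed above (emitter flow, gravity, noise, vorticity, viscosity) but are my own illustrative stand-ins, not RealFlow’s actual attribute names, and the “solver” is a deliberately trivial placeholder.

```python
import random

scene = {
    "emitter": {"size": 0.4, "direction": (0, -1, 0), "flow_rate": 2.5},
    "forces":  [{"type": "gravity", "strength": 9.8},
                {"type": "noise", "strength": 0.3, "seed": 7}],
    "fluid":   {"vorticity": 1.2, "viscosity": 0.01},
}

def preview(scene, frames=24, detail=0.25):
    """A toy stand-in for a low-detail simulation pass over the parameters."""
    noise = scene["forces"][1]
    random.seed(noise["seed"])
    splash_energy = 0.0
    for _ in range(frames):
        # The outcome emerges from the chosen conditions plus stochastic noise;
        # the artist never shapes the fluid directly.
        splash_energy += scene["emitter"]["flow_rate"] * scene["fluid"]["vorticity"] * detail
        splash_energy += random.uniform(0, noise["strength"])
    return splash_energy

print(f"estimated splash energy: {preview(scene):.1f}")
```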

At this point the artist has made a flowing volume of particles. The next job is to draw a mesh onto the particles. Particles are like a volume without a surface, and adding a mesh gives the water a surface. With the polygonal surface of the fluid drawn, the FX artist can tweak more values, such as the thickness of splashes. It is also common practice for the animator to make a second particle simulation that will stay as particles without a mesh. These little points of water will act as mist droplets.

Next, the artist outputs the simulations to Maya. Here, lighting and camera position can be set, as they would be with any animation, although another department may do this work. The artist will also give shading and reflective properties to the mesh surfaces, as well as surface textures and coloring. In the case of water, the surface will obviously be transparent. The artist can also change the look of the secondary particle simulation using the Krakatoa plug-in, giving the particle points shade or color. Finally, the two simulations will be put together and composited into a scene with other elements using Nuke. This is a relatively simple example of how an FX artist would go about making a simulation. One person with a few thousand dollars’ worth of technology could do it. In 2020 the open-source animation suite Blender fully integrated an FX framework called Mantaflow that combines all of this functionality, so a simpler version could even be done with only one piece of free software and a consumer-grade computer.
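For readers curious about the free route just mentioned, the fragment below sketches an equivalent starting point in Blender’s Python API (bpy), creating a liquid domain and an inflow object for the Mantaflow solver. It only runs inside Blender, and the attribute names follow my reading of the 2.8x-era API, which may differ between versions.

```python
import bpy  # available only inside Blender

# Domain: the box within which the Mantaflow liquid will be simulated.
bpy.ops.mesh.primitive_cube_add(size=4)
domain = bpy.context.active_object
dmod = domain.modifiers.new(name="Fluid", type='FLUID')
dmod.fluid_type = 'DOMAIN'
dmod.domain_settings.domain_type = 'LIQUID'
dmod.domain_settings.resolution_max = 64     # low detail for quick previews

# Inflow: a small sphere that emits liquid into the domain, like an emitter.
bpy.ops.mesh.primitive_uv_sphere_add(radius=0.3, location=(0, 0, 1))
flow = bpy.context.active_object
fmod = flow.modifiers.new(name="Fluid", type='FLUID')
fmod.fluid_type = 'FLOW'
fmod.flow_settings.flow_type = 'LIQUID'
fmod.flow_settings.flow_behavior = 'INFLOW'
# Baking the simulation is then triggered from the domain's fluid settings.
```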

Although the work described here sounds, and indeed looks, not unlike the work of any digital animation artist, there are some important distinctions to be made. For one, the artist cannot directly control the outcome of their simulation. The best they can do is use trial and error and make choices based on their own experience with the behavior of a given simulation. Further, the artist is using nonlinearity, and even adding additional randomness, as an important part of achieving the right look. The FX artist seeks to foster unpredictable complexity as a resource while also shaping it to conform with direction. The FX artist is thus building a technical apparatus (the simulation) to control some unpredictable system. The principles are the same as those shaping workflows and pipelines in general.

The nature of this work raises a curious theoretical question that brings us back to a topic addressed in the second chapter. Every time an FX team uses a different plug-in or modifies a simulation, they are re-inventing their representational apparatus. They are adopting a different way of seeing the ontology of the fluid (or hair or smoke, etc.) by engineering a different “solution.” Imagine if filmmakers re-invented the camera every time they made a film. As Chap. 2 argued, there is an exciting potential in representing the world through such contingent, speculative means. Thus, although the turn toward conceptualizing production as “development” generally takes the shape of treating production work as technical problem solving, this does not tell the whole story. In order to understand the work of this new breed of creative industries worker, we should borrow a page from the philosophy of engineering and recognize that the epistemic value of media is not only in the “knowing that” but also in the “knowing how.”

Recasting Technical Labor

Historically it has been common for creative production work and technical work in film industries to be constructed as separate categories. The title of an organization like the Academy of Motion Picture Arts and Sciences suggests that the two sit close together, but in practice the academy tends to compartmentalize technical work. The annual Scientific and Technical Awards are a mere sideshow compared to the main Academy Awards. The VFX and animation industries have been slowly renegotiating this division, though. When organizations like the Visual Effects Society describe the work done by their members, they liken them to Renaissance artists, invoking a period when artists employed sophisticated techniques informed by their knowledge of light and physiology to produce “realistic” images.Footnote 64 Animation studios have a similar rhetoric, styling the extensive engineering work they do as a form of creativity. A traveling exhibit put together by DreamWorks and the Australian Centre for the Moving Image features numerous displays of how their artists solve “creative and technical challenges,” including one (sponsored by the computer hardware company HP) that allows patrons to manipulate a fluid simulation through an interactive display.Footnote 65 Pixar also likes to emphasize the way Walt Disney fused the “magic” of animation with technical innovation, and they construct themselves as continuing this tradition of technological creativity.Footnote 66

Digital animation studios such as these have always put extra emphasis on the creativity of their sometimes very technical work, because from the outset they have had to make the case that seemingly rigid and lifeless computer graphics can be used to make cartoons. Pixar’s use of their shorts is an excellent example of this. Luxo Jr. (1986), their first short after they split from Lucasfilm, features rigid-looking desk lamps brought to life through careful manual manipulation of their gestures, in the Disney animation tradition. Christopher Holliday argues that the Luxo character is a “synonym” for the “animatedness” of digital animation, and thus the desk lamps have become an enduring part of the studio’s brand, featuring prominently in their logo and at the entrance to their studio.Footnote 67 One might therefore expect that nonlinear forms of animation, which are made through engineering an automated simulation rather than manual key-frame techniques, would be something Pixar might downplay. But rather than avoiding animation work that seems too technical, they promote the creative quality of this technical work. Their 2016 short Piper shows the way they treat this highly technical form of animation as yet another form of animated creativity.

Piper tells the story of a sandpiper learning to hunt for food in the ever-changing landscape of an ocean shoreline. Following the paradigm of Pixar shorts, the animators communicate an incredible amount of storytelling through the subtle character animation of gestures and facial expressions. The birds, though relatively naturalistic, convey a range of emotions that are universally intelligible to humans. These expressions are the result of painstaking manual labor done by people who fit our traditional definition of what a key-frame animator is. Yet Piper also abounds with nonlinear animation. The feathers, a key expressive part of the birds, automatically ruffle in the wind and react to movement. The feathers are also bound to the deformable movement of the skin of the birds, which is connected in turn to a simulation of musculature.Footnote 68 Thus, while the bird’s core model is manipulated manually, the overall animation of the bird consists of at least as much simulation as manual animation.

The most impactful aspect of Piper is arguably the way it renders the material experience of being small. The tiny waves seem huge, the grains of sand are more like pebbles, and blades of grass are the size of trees. The animated material quality of all these things is the result of an imaginative use of simulation, from the flow of the grass, to the crash of the waves, to the way the sand moves as the bird tumbles across tiny dunes. These simulations are not self-evident, and they are not easily achieved. They require imagination: picturing oneself in the world at a different scale. This vision would have informed the building of the FX recipe and pipeline infrastructure. TDs would have customized and altered certain tools, and FX artists would have carefully manipulated different parameters, written scripts, and created many different iterations, all in the name of arriving at this final product. None of this is evident when you watch the short, but what is evident is the artful way the artists have shaped the material world. Piper is legibly a techno-artistic feat.

Piper demonstrates keenly how the manual and the automatic have been renegotiated to style technological work as creative work. Given the discursive importance of Pixar shorts, this one offers a glimpse of how the studio is reworking these ideas. The technical work of simulation building is being subsumed into the image Pixar has worked so hard to cultivate all these years as a fount of creativity in the tradition of Disney animation. Much in the way Pixar originally used their shorts to convey that 3D computer graphics are genuine animation, they now convey that simulation is animation. The job of making a crashing wave look just right has been elevated to the same level as the job of animating the expressive gestures of an animated character.

This new image of creative work in animation raises some questions about the construction of labor roles. An important subject in the field of production studies concerns the subjectivity of the self-identified creative worker, and the role the discourse of creativity plays in organizing labor. Vicki Mayer notes how the attributes of creativity and professionalism are used to create hierarchies in media industries, above-the-line and below. Above-the-line are the professionals who manage workers and the creatives who have control over the content being produced: labor constructed as intellectual or creative. Below-the-line are the technical and service workers.Footnote 69 Similarly, John Caldwell is interested in understanding the socio-cultural factors that make possible the current state of the industry, where workers log long hours for little, or sometimes no, pay. He finds that there is an “invisible economy” of “symbolic payroll,” where workers are motivated by discourses like creativity instead of material compensation or job security.Footnote 70 The idea of creative work is what makes precarious, “deprivation” employment practices possible in industries like VFX and animation.

If the idea of creativity is so important for organizing labor, and if, as Mayer argues, it follows a division between technical trades and creative or management roles, what happens when both creativity and technical work are cast as development processes? Is the permeability between FX artists and TDs evidence that these labor divisions have been disrupted? A key finding of Mayer’s is that below-the-line workers see themselves as making creative contributions to the production of media, but from the outside they are invisible and excluded. Mayer writes, “all of us increasingly define ourselves through our productive work while at the same time industries devalue our agency as producers.”Footnote 71 It is exactly this dynamic that makes it possible for the industry to benefit from the motivating discourses Mayer and Caldwell describe while spreading labor across many contracted companies scattered throughout the world. Caldwell categorizes production and post-production work as below-the-line, yet he also demonstrates how the workers in these categories are strongly motivated by the discourse of creativity and the symbolic payroll. He observes that low-level VFX workers work so hard in large part because they want to imagine themselves as artists who are a part of the movies they love.Footnote 72 Even if you only did some match-moving work on Jar Jar Binks in a scene that ended up being cut, you still worked on a Star Wars movie.

Nonlinear animation and R&D laborers are in no danger of being recognized publicly by the industry as valued creative workers, even as technical work becomes indistinct from creative work. VFX studios, supervisors, and organizations repeat the same refrain with surprising consistency in public communications: our job is to make the director’s vision come to life.Footnote 73 The VES Handbook of Visual Effects states that VFX supervisors (the highest-ranking VFX workers) take “artistic desire and turn it into a technical plan.”Footnote 74 The role of VFX supervisor is truly commensurate with that of director of photography or art director, yet VFX supervisors continue to lack recognition in the most visible places, like the Academy Awards. The only pushback against this has come from labor organizing initiatives.

The integration of more engineering and R&D work into animation and VFX production has in fact ensnared more workers in the symbolic payroll. Academic nonlinear animation researchers revel in their association with the film industry. At the very least, association with Hollywood seems to be a good way of promoting your work. Evidence of this can be found in profiles on researchers’ personal websites and blogs, on official university webpages, and, of course, in SIGGRAPH presentations. Take, for example, a scholarly publication by Jerry Tessendorf (a researcher profiled in Chap. 3) and several other scholars at Rhythm and Hues, which was presented at SIGGRAPH and can be accessed both through Tessendorf’s personal website and through his university page.Footnote 75 The paper concerns a new technique for animating realistic clouds. This research was conducted at Rhythm and Hues for a specific project: the film reboot of the 1980s television show The A-Team (2010). The title of this peer-reviewed research paper is I Love It When a Cloud Comes Together, a play on the show’s famous catchphrase, “I love it when a plan comes together.” The researchers seem to be playfully suggesting an analogy between their work and the work of the A-Team: a scrappy squad of underappreciated misfits who always get the job done. It seems quite clear that researchers enjoy being a part of making spectacular movies. It no doubt differentiates them from their peers in other fields. How many mathematicians have Academy Awards? While these valued scientists at the forefront of their field probably are not exactly exploited by this symbolic payroll, the phenomenon suffuses networks of graduate students and more precarious academic laborers. To reiterate Mayer’s words, “all of us increasingly define ourselves through our productive work while at the same time industries devalue our agency as producers.”Footnote 76

There are several convergent and related causes for what could broadly be described as the development turn in VFX and animation production. First is the spread of the logic of R&D from the institutions of the military-industrial-academic complex to the film industry. The fact that VFX and animation studios have invested so much in R&D, and that R&D has become an important strategic and economic factor, has had a long-term effect on the role of technology in production. The military’s R&D complex is also where the concept of project management was formalized and spread in the first place. Second, the spread of nonlinear animation tools, which complicate the relationship between automation and animation, has had a practical effect on the nature of production work. Tools such as these have made writing scripts, installing plug-ins, connecting software pipelines, and sometimes even writing new programs an everyday part of animation work.

On their own, these two important conditions explain much of these trends in production, but they do not necessarily explain why trends like flexibility, configurability, and customization have become so important. Why did agile project management become so much more popular than waterfall? Why have VFX and animation studios borrowed these principles? They were responding to economic pressures rooted in neoliberal and post-Fordist principles, which pursue capital productivity and efficiency by introducing competitive market forces into every facet of operation, turning every film production into a nesting doll of contracted and sub-contracted vendors that in turn employ workers on six-month contracts. A turn toward conflating cultural work with computer engineering could also be seen as a by-product of the “information society” discourse that proliferated in this neoliberal context, because that discourse sees culture as nothing more than information.Footnote 77 All of these conditions are intimately linked. The rise of R&D in the film industry was spurred on by the shift from a Cold War federal funding model to a tax-incentivized private model. Thus policy, economics, discourse, and technology all feed into each other, with no single factor offering a sufficient explanation on its own.

The concept of R&D took experimentation, exploration, and discovery and modeled them as a process that could be managed and instrumentalized without compromising their productive unpredictability. Nonlinear simulation sought to model unpredictable processes so that they could be analyzed and reproduced. Nonlinear animation uses these principles to animate the unexpected, random, and complex nature of natural motion while also being able to manipulate it artistically. There is an epistemic paradigm specific to this period in history, an episteme, which joins these ideas. Chapter 5 will pursue this concept further, adding greater nuance to certain assumptions about post-Fordist management techniques, using the example of Pixar.