1 Introduction

There is a trade-off between efficiency and resilience. Efficiency requires optimal adaptation to an existing environment, while resilience is an ability to adapt to large or sudden changes in the environment.Footnote 1 Emphasis on short-term gains has long tipped the balance in favor of efficiency.Footnote 2 The relentless pursuit of efficiency removes hurdles to the speed and reach of transactions, hurdles that also serve as buffers against shocks. Buffers provide resilience in the face of ecological, geopolitical, and financial crises.

As a computer scientist, I look at how algorithms provide a way to test assumptions about resilience, even as the field of computing itself shares the bias toward efficiency. Three recent crises—the 2021 winter storm in Texas, the COVID-19 pandemic, and the Boeing 737 Max software failure—highlight the cost of valuing efficiency over resilience and provide lessons for bringing society into balance.

The essence of digital humanism is the firm belief that technology development must be centered on human interests. This article argues that technology often focuses on efficiency and does so at the expense of resilience.

2 Economists and Engineers Focus on Efficiency

Economics has long been obsessed with efficiency. Economic efficiency means that goods and production are distributed or allocated to their most valuable uses and waste is eliminated or minimized. Free-market advocates argue further that, through individual self-interest and freedom of production and consumption, economic efficiency is achieved and the best interests of society, as a whole, are fulfilled.Footnote 3 Yet this conflates efficiency with the best societal outcome.

The intense focus on efficiency at the expense of resilience plagues not only business and economics but also technology. Society has educated generations of computer scientists that analyzing algorithms, the step-by-step instructions at the heart of computer software, boils down to measuring their computational efficiency. The Art of Computer Programming (Knuth, 1997), one of the founding texts of computer science, is dedicated to the analysis of algorithms, which is the process of figuring out the amount of time, storage, or other resources needed to execute them. In other words, efficiency is the sole concern in the design of algorithms, according to this guide.

Yet what about resilience? Designing resilient algorithms requires computer scientists to consider, in advance, what can go wrong and to build effective countermeasures into their algorithms. Without designing for resilience, the result is algorithms that are efficient but brittle. Of course, fault tolerance has been part of the canon of computing-system building for decades. Jim Gray’s 1998 Turing Award citation refers to his invention of transactions as a mechanism to provide crash resilience to databases. Leslie Lamport’s 2013 Turing Award citation refers to his work on fault tolerance in distributed systems. Nevertheless, computer science has yet to fully internalize the idea that resilience, which includes reliability, robustness, and more, must be pushed down to the algorithmic level. A case in point is search-result ranking. Google’s original ranking algorithm was PageRank (Page et al., 1999), which works by counting the number and quality of links to a page to determine how important the website is. Yet PageRank is not resilient to manipulation, hence the rise of search-engine optimization.Footnote 4
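To make the ranking mechanism concrete, here is a minimal sketch of PageRank as power iteration over a toy link graph. The graph, the damping factor, and all names are illustrative assumptions for this article, not Google's production code.

```python
# A minimal PageRank sketch (power iteration with damping).
# The toy link graph and the damping factor are illustrative only.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:  # dangling page: spread its rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

toy_web = {"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": ["C"]}
print(pagerank(toy_web))
```

Nothing in this computation distinguishes genuine endorsements from manufactured ones: adding a farm of pages that all link to one target inflates that target's rank, which is precisely the kind of manipulation that search-engine optimization exploits.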

3 A Storm, Long Trains, a Plague, and Some Bad Software

Brittle systems are more likely than resilient systems to break down when crises strike. For instance, cold temperatures and blackouts during Winter Storm UriFootnote 5 killed almost 200 peopleFootnote 6 in February 2021 in Texas. The storm damaged the power grid and water systems, which lacked the weatherproofing features common to utility infrastructure in much of the rest of the country. A recent investigationFootnote 7 by ProPublica revealed that trains are getting longer to increase their efficiency. Yet while railroads are getting richer, these “monster trains” are jumping off tracks across America and regulators are doing little to curb the risk.

The harsh economic consequences of failing to prepare for a pandemic, despite many early warnings,Footnote 8 provoked questions about whether the obsessive pursuit of efficiency, which has dominated standard business orthodoxy for decades, has made the global economic system more vulnerable to disruptive changes, such as the COVID-19 pandemic.Footnote 9 Thomas Friedman wrote in a May 2020 New York Times column: “Over the past 20 years, we’ve been steadily removing man-made and natural buffers, redundancies, regulations and norms that provide resilience and protection when big systems—be they ecological, geopolitical or financial—get stressed … . We’ve been recklessly removing these buffers out of an obsession with short-term efficiency and growth, or without thinking at all.”Footnote 10

A stark example of a system designed for efficiency and not resilience is the flight-control algorithm for the Boeing 737 Max. Boeing retrofitted its 737, a passenger aircraft first produced more than half a century ago, with engines that are more efficient. This retrofitting caused some flight instability, which the flight-control algorithm was designed to overcome.Footnote 11

The algorithm, however, relied on data from a single sensor, and when the sensor failed, the algorithm incorrectly determined that the plane was stalling.Footnote 12 In response, the algorithm caused the plane to dive as an emergency measure to recover from a stall that was not happening. The result was two horrific crashes and hundreds of the aircraft being grounded for nearly 2 years.Footnote 13 In retrospect, the engineers overly optimized for fuel economy and time to market at the expense of safety.Footnote 14
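To see concretely what resilience at the algorithmic level would have meant here, the following sketch contrasts trusting a single angle-of-attack reading with voting over redundant sensors. The three-sensor setup, the function names, and the numbers are hypothetical illustrations, not Boeing's actual flight-control code.

```python
import statistics

def angle_of_attack_single(reading):
    # Brittle design: one faulty sensor drives the control decision.
    return reading

def angle_of_attack_voted(readings):
    # More resilient design: take the median of redundant sensors, so a
    # single wildly wrong reading cannot dominate the result.
    return statistics.median(readings)

readings = [4.2, 4.5, 74.5]  # one sensor stuck at an absurd value
print(angle_of_attack_single(readings[2]))  # 74.5 -> spurious stall response
print(angle_of_attack_voted(readings))      # 4.5  -> plausible value
```

In this illustrative setup, a single failed sensor can no longer push the controller into reacting to a stall that is not happening; the protection costs extra sensors and extra code, which is exactly the kind of "inefficiency" that resilience requires.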

4 Just-in-Time Manufacturing

Just-in-time manufacturing is a logistical methodology aimed primarily at reducing times within the production system as well as response times from suppliers and to customers (Cheng & Podolsky, 1996). The idea is to reduce inventory costs by reducing inventory, since warehousing inventory incurs the costs of warehouse maintenance, idle inventory, and stocking and de-stocking. Just-in-time manufacturing tries to match production to demand by supplying only goods that have been ordered and focuses on efficiency, productivity, and reduction of “wastes” for the producer and supplier of goods. Parts should arrive, for example, “just in time” for their use on the manufacturing line.

Just-in-time manufacturing, however, assumes best-case logistics, since a delay in arrival of goods may ricochet across the whole supply chain. Indeed, when the Ever Given container ship got stuck in the Suez Canal in March 2021, it had a worldwide impact, as every day the ship blocked the canal, dozens of ships carrying billions of dollars’ worth of cargo had to wait to enter the waterway.Footnote 15

5 The Price of Anarchy

If brittle systems are prone to disasters, why are they so prevalent? One explanation is that, short of disasters, systems that emphasize efficiency can achieve a particular kind of stability. A fundamental theorem in economics, known as the “First Welfare Theorem,”Footnote 16 states that under certain assumptions a market will tend toward a competitive balance point, known as a Pareto-optimal equilibrium,Footnote 17 at which economic efficiency is achieved.

How well, however, does such an equilibrium serve the best interests of society? A team of computer scientists (Koutsoupias & Papadimitriou, 1999) studied how beneficial or detrimental equilibria can be from a computational perspective. The researchers studied systems in which uncooperative agents share a common resource, the mathematical equivalent of roadways or fisheries. They showed that such systems can have multiple equilibria, and different equilibria can vary in terms of the total social utility, e.g., traffic congestion or fishery health.

The question is how large the gap is between the best and worst equilibria. To that end, they investigated the ratio between the social utility of the worst possible equilibrium and that of the social optimum (the best achievable outcome), a ratio dubbed the “price of anarchy”Footnote 18 because it measures how far from optimal such uncooperative systems can be. They showed that this ratio can be very high. In other words, economic efficiency does not guarantee that the best interests of society are fulfilled.
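As a textbook illustration (not taken from the cited paper), consider the classic two-road routing example: one unit of traffic travels from A to B, one road always takes 1 hour, and the other takes x hours when a fraction x of the traffic uses it. The sketch below computes the selfish equilibrium, the social optimum, and their ratio.

```python
# Price-of-anarchy sketch for the classic two-road routing example.

def total_cost(x):
    """Average travel time when a fraction x of traffic uses the variable road."""
    return x * x + (1 - x) * 1.0

# Selfish equilibrium: the variable road is never worse than the fixed one,
# so every driver takes it (x = 1) and the average travel time is 1.
equilibrium_cost = total_cost(1.0)

# Social optimum: minimize the average travel time over x in [0, 1]
# (a coarse grid search suffices for this one-dimensional toy example).
optimum_cost = min(total_cost(i / 1000) for i in range(1001))

print(equilibrium_cost, optimum_cost, equilibrium_cost / optimum_cost)
# -> 1.0 0.75 1.333...: a price of anarchy of 4/3 even in this tiny network.
```

Even this two-road toy network shows a one-third gap between the selfish outcome and the social optimum; in general networks the gap can be far larger.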

Another team of researchers asked how long it takes until economic agents converge to an equilibrium (Daskalakis et al., 2009). By studying the computational complexityFootnote 19 of computing such equilibria, the researchers showed that there are systems that take an exceedingly long time to converge to an equilibrium.

The implication is that economic systems are very unlikely ever to be in an equilibrium, because the underlying variables—such as prices, supply, and demand—are very likely to change while the systems are making their slow way toward convergence. In other words, economic equilibrium,Footnote 20 a central concept in economic theory, is a mythical rather than a real phenomenon. This is not necessarily an argument against free markets, but it does require a pragmatic view of them.

6 Free Trade

In fact, the very idea of free trade is motivated by the quest for efficiency. In 1817, David Ricardo argued that if two countries capable of producing two commodities engage in free trade, then each country would increase its overall consumption by exporting the goods for which it has a comparative advantage while importing the other goods (King, 2013). The benefits of free trade are undeniable. Over the past 30 years, worldwide free trade (“globalization”) has liftedFootnote 21 over one billion people out of extreme poverty. At the same time, the quest for efficiency has led to a situation in which Taiwan, today one of the world’s riskiest geopolitical flashpoints, dominatesFootnote 22 worldwide semiconductor production, which ledFootnote 23 the USA to launch a major investment in domestic semiconductor capacity.

7 Why Sex Is Best

It is interesting to consider how nature deals with the trade-off between efficiency and resilience. This issue was addressed in a computer science paper titled “Sex as an Algorithm” (Livnat & Papadimitriou, 2016). Computer scientists have studied different search algorithms that are inspired by natural evolution. On the one hand, there is simulated annealing,Footnote 24 which consists mainly of small steps that improve a goal function, while also allowing some steps that are less than optimal but could lead to better solutions further down the line. On the other hand, there are genetic algorithms,Footnote 25 which mimic natural selection by creating “offspring” of previous solutions and then adding some random mutations. Computer scientists know that, in general, simulated annealing is faster than genetic algorithms (Franconi & Jennison, 1997).
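For readers unfamiliar with simulated annealing, here is a minimal sketch of the idea described above: mostly improving steps, with occasional worse steps accepted while the "temperature" is still high. The toy objective, step size, and cooling schedule are illustrative assumptions, not parameters from the cited papers. A genetic algorithm would instead maintain a population of candidate solutions and recombine ("mate") them before mutating.

```python
import math
import random

def simulated_annealing(objective, start, step=0.1, temperature=1.0,
                        cooling=0.995, iterations=10_000):
    """Minimize `objective`, occasionally accepting uphill moves."""
    current = start
    current_value = objective(current)
    for _ in range(iterations):
        candidate = current + random.uniform(-step, step)
        candidate_value = objective(candidate)
        delta = candidate_value - current_value
        # Always accept improvements; accept worse moves with a probability
        # that shrinks as the temperature cools.
        if delta < 0 or random.random() < math.exp(-delta / temperature):
            current, current_value = candidate, candidate_value
        temperature *= cooling
    return current, current_value

# Toy objective with several local minima.
print(simulated_annealing(lambda x: x * x + 3 * math.sin(5 * x), start=4.0))
```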

Why, then, has nature chosen sexual reproduction as the almost exclusive reproduction mechanism in animals? The answer is that sex as an algorithm offers advantages other than good performance (Livnat & Papadimitriou, 2016). In particular, natural selection favors genes that work well with a greater diversity of other genes, and this makes the species more adaptable to disruptive environmental changes—that is to say, more resilient. Thus, in the interest of long-term survival, nature prioritized resilience over efficiency. Darwin supposedly said, “It’s not the strongest of the species that survives, nor the most intelligent. It is the one that is most adaptable to change.”

8 The Crisis of Democracy

US society is in the throes of deep polarization that not only leads to political paralysis but also threatens the very foundations of democracy. The phrase “The Disunited States of America”Footnote 26 (tracing back to Harry Turtledove’s novel with this title) is often mentioned. “The U.S. is heading into its greatest political and constitutional crisis since the Civil War,” wroteFootnote 27 Robert Kagan in the Washington Post, raising the specter of mass violence. How did we get here? What went wrong?

The last 40 years have launched a tsunami of technology on the world. The IBM Personal Computer—Model 5150, commonly known as the IBM PC—was released on August 12, 1981, and quickly became a smashing success. For its January 3, 1983, issue, Time magazine replaced its customary person-of-the-year cover with a graphical depiction of the IBM PC, “Machine of the Year.” A computer on every work desk became reality for knowledge workers within a few years. These knowledge workers soon also had a computer at home. With the introduction of the World Wide Web in 1989, many millions could access the Web. The commercialization of the Internet in 1995, and the introduction of the iPhone in 2007, extended access to billions.

The socioeconomic-political context of this technology tsunami is significant. There was a resurgence of neoliberalism marked by the election of Margaret Thatcher as Prime Minister of the UK in 1979 and by Ronald Reagan as President of the USA in 1980. Neoliberalism is free-market capitalism generally associated with policies of economic liberalization, privatization, deregulation, globalization, free trade, monetarism, austerity, and reductions in government spending. Neoliberalism increases the role of the private sector in the economy and society and diminishes the role of government. These trends have exerted significant competitive pressure on the economies of the developed world. To stay competitive, the manufacturing sector automated extensively, with the nascent distributed-computing technology playing a significant role. The implications are still with us.

A 2014 paper by MIT economist David Autor argued that information technology was destroying wide swaths of routine office and manufacturing jobs, creating in their place high-skill and low-skill jobs (Autor, 2014). This labor polarization appeared to have brought about a shrinking middle class.Footnote 28 Autor’s data for the USA and 16 European countries showed shrinkage in the middle and growth at the high and low ends of the labor-skill spectrum. This polarization greatly increased income and wealth disparities.

As information technology flooded Internet users with more information than they could digest, tech companies supplied mass customization that let users concentrate on information confirming their preconceived opinions, resulting in deeper societal polarization (Vardi, 2018). This further exacerbated the “filter bubbles” created earlier in the broadcast media after the US Federal Communications Commission’s 1987 abolition of the “Fairness Doctrine,” which had required holders of broadcast licenses both to present controversial issues of public importance and to do so in a manner that fairly reflected differing viewpoints.

9 Lessons from the Internet

Our digital infrastructure, which has become a key component of the economic system in developed countries, is one of the few components that did not buckle under the stress of COVID-19. Indeed, in March 2020, many sectors of our economy switched in haste to working from home (WFH). This work from home, teach from home, and learn from home was enabled (to an imperfect degree, in many cases) by the Internet. From its very roots in the ARPANET of the 1960s, resilience, enabled by seemingly inefficient redundancy, was a prime design goal for the Internet (Yoo, 2018). Resilience via redundancy is one of the great principles of computer science (von Neumann, 2017), which deserves more attention!
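As a generic sketch of the redundancy principle (not a description of the Internet's actual routing protocols), the snippet below masks the failure of any single replica by falling back to the others; the replica names and the flaky_fetch helper are hypothetical.

```python
import random

def fetch_with_redundancy(replicas, fetch):
    """Try each replica in turn; a single failure is masked by the others."""
    for replica in replicas:
        try:
            return fetch(replica)
        except ConnectionError:
            continue
    raise RuntimeError("all replicas failed")

def flaky_fetch(replica):
    # Simulate a replica that is unreachable half of the time.
    if random.random() < 0.5:
        raise ConnectionError(f"{replica} unreachable")
    return f"data from {replica}"

print(fetch_with_redundancy(["mirror-1", "mirror-2", "mirror-3"], flaky_fetch))
```

Maintaining three mirrors where one would do is "wasteful" by the efficiency yardstick, yet the request fails only when all of them fail at once.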

10 Conclusions

The bottom line is that resilience is a fundamental but underappreciated societal need. Yet both computing and economics have underemphasized resilience. In general, markets and people are quite bad at preparing for very low-probability or very long-term events (Taleb, 2010). For instance, people have to be forced to buy car insurance, because buying insurance is not efficient. After all, taken together, the insurance business is profitable for the insurers, not for the insured. The purpose of insurance is increased resilience. This example shows that ensuring resilience requires societal action and cannot be left solely to markets. The economic impact of the pandemic shows the cost of society’s failure to act.Footnote 29 Furthermore, COVID-19 may be just the warm-up actFootnote 30 for the much bigger impending climate crisis, so focusing on resilience is becoming more and more important.

There seems to be a broad recognition that the incalculable suffering and trauma of COVID-19 offer societiesFootnote 31 ways to change for the better.Footnote 32 Similar lessons can be drawn from Winter Storm Uri and the Boeing 737 Max. Focusing on resilience is a way for societies to change for the better. In the meantime, the steady flow of news events—like a pipeline company that appears to have underinvested in securityFootnote 33—continues to underscore the cost of prizing efficiency over resilience. The big question is how the AC (“after COVID”) world will differ from the BC (“before COVID”) world. Kurt Vonnegut supposedly said, “We’ll go down in history as the first society that wouldn’t save itself because it wasn’t cost effective.” Resilience must be a key societal focus in the AC world.

Discussion Questions for Students and Their Teachers

Please consider your own research area. What are the dimensions of efficiency and resilience you can identify? Is there a trade-off between these two dimensions? Where is the current emphasis? How can you increase the emphasis on resilience?

Learning Resources for Students

  1. Recipe for Disaster: The Formula That Killed Wall Street.Footnote 34

     Analyze the 2008–9 Financial Crisis through the efficiency/resilience lens.

  2. On the Inefficiency of Being Efficient (Goldberg, 1975).

     Summarize the main point of the paper through the efficiency/resilience lens.

  3. Antifragile – Things That Gain from Disorder (Taleb, 2012).

     What is the argument here against efficiency?