Introduction

The most effective forms of warfare have therefore always included recognition of the inherent unpredictability that accompanies the use of force and built into military organization a tolerance to uncertainty and even a capacity to profit from it.

-Antoine J. Bousquet, The Scientific Way of Warfare

Design’s capacity to deal with complexity and conflicting concerns is perhaps its most fascinating feature … this ability to address complexity is inherently intertwined with design’s resilience to reductive dichotomies. More specifically, it comes out of a hunch that a key reason we enjoy dichotomies so much in design is because they allow us to address conflict, collision, and contradiction, opening up new perspectives and potentials as a result.

-Johan Redström, Making Design Theory

All goods and services are designed. The urge to design – to consider a situation, imagine a better situation, and act to create that improved situation – goes back to our prehuman ancestors. Making tools helped us become what we are – design helped to make us human.

-Ken Friedman and Erik Stolterman, “Series Foreword”

The world is “wicked once over” in that it is both dangerous and disorderly.

Defense professionals readily acknowledge the former because they regularly confront hazardous circumstances, and the use of organized violence for political effect is their reason for being. The latter refers to “wicked problems” that resist comprehensive or conclusive solutions because the context is confusingly dynamic and subject to competing interpretations (Rittel and Webber 1973, pp. 155–169). Those intimately familiar with complex security challenges increasingly recognize this second form of wickedness, as well as the corollary that traditional security concepts are only sufficient for “tame problems” (issues that are clearly defined and amenable to linear logic and reverse engineering).

The realization that competition in this century will require more sophisticated approaches has led to generic calls for “intellectual overmatch,” including the ability to imagine new possibilities, to arrive at these new ideas in new ways, and to apply them in new contexts (Joint Chiefs of Staff, US Department of Defense 2020, p. 2).

There is much debate over how to accomplish these goals and very little consensus. Over the last several decades, however, rigorous explorations into these challenges have coalesced around new ways to think and act. This growing body of scholarship and practices is broadly known as military design (Wrigley et al. 2021), and the practitioners are often termed “military designers.” The chapters in this portion of the Springer Handbook of Military Sciences represent the first collaborative and scholarly attempt to summarize the emergence of this international movement in expanded form. The authors are either professional academics or warfighters, and, in many cases, they are both expert theorists and experienced practitioners. The result is a holistic overview of military design meant to spur their colleagues in the security enterprise on to further reading, reflection, and practice.

Wayfinding

Though each chapter can be read independently, this introduction offers a broad overview as a starting point for anyone new to this field or interested in the “prehistory” of design, a topic rarely addressed in popular writings. This sets up the following chapter’s more detailed history of how military design emerged from design writ large. That narrative focuses more on the modern period of history (i.e., the historiographical era following the European Middle Ages) and how, near the end of the twentieth century, some security professionals “designed design” to create a distinct movement.

In tandem, the first two chapters frame and contextualize all subsequent material in this section of the Springer project. For those less interested in the historical perspective, the remaining chapters serve as stand-alone guides to their specific topics. They address various forms of design praxis, in that the authors consider the form and function of the design applications, methods, and contexts, as well as the theoretical foundations these diverse designs originate from.

Altogether, these chapters survey the vast landscape of military design. The chosen vantage point, especially in the first two chapters, is intentionally broad, sacrificing details for a wider perspective. Still, there are no claims to being exhaustive or conclusive, either within a given subject or across this entire body of scholarship. To assert otherwise would not only be disingenuous, but also contradictory to the ethos of design. Put another way, these ideas about military design are – like the conditions they are meant to operate in – messy, evolving, and not without controversy.

Ultimately, this project is not about mapping terrain in order for readers to navigate known areas, nor is it about learning how to design, which some suggest ought to be framed as a “social technology” that can only be fully understood in practice (Liedtka 2000, pp. 7–8). These chapters, instead, are background information to equip individuals for the journey as they wander through the fog to sense the unknown or as they look toward the horizon to explore the edge of our current thinking. That expedition starts below, perhaps surprisingly, with reviewing the very nature of the human species.

“Realizing” Advantage

While military design may be a recent expression, the spirit behind it is woven into all of military history and even the entire story of humanity. Our species is equipped with various capacities that allow us to navigate and nudge our environments. In addition to our physical and social skills, mental abilities emerged during what some call the cognitive revolution (Harari 2018, pp. 3, 31–37). With this change, Homo sapiens gained the skill of abstraction, representing ideas symbolically in language and mentally projecting themselves outside their immediate, perceptible circumstances. In short, imagination became a strategic advantage. Those individuals or groups who harnessed this intellectual gift better than others gained a competitive edge because strategy – what prominent theorist Lawrence Freedman calls “the art of creating power” (Freedman 2013, p. xii) – comes from “realizing” advantages in both senses of the verb: to be mentally aware of possibilities and to create, or actualize, the intended effect.

The duality of “realization” draws attention to the fact that the world to which we apply our imagination contains some elements that are internal and subjective and others that are external and objective. At the risk of overgeneralizing, art explores the first set, while science is oriented toward the second. This contrast between what the novelist and scientist C. P. Snow referred to as the “two cultures” is broadly accepted in the West as an accurate division of knowledge. Indeed, each is often defined by its hostility to the other. Art explores the human condition and is qualitative and synthetic. Science studies the natural world and is quantitative and analytical. In addition to differing areas of focus, each tradition is further defined by how its practitioners make sense of the world: the questions it asks of its particular field of study, what constitutes valid answers, and what activities flow from those approaches (Burrell and Morgan 1979; Kuhn 1996). There is, however, a third way of understanding and acting upon the world that can be equally disciplined and distinct and that is linked to our innate gift of creative abstraction – design.

Design operates in both the subjective and objective realms in order to conceptualize changes to the world intended to produce some advantage (Dorst 2015, p. vii). This is so integral to our existence that “designing constitutes being human” (Krippendorff 2006, p. 74) and design has been called a “fundamental form of human intelligence” (Cross 2006, pp. vi, 1). Moreover, this “first tradition” preceded art and science as one of the defining practices of our species (Dorst 2015, pp. 1–15). Humans, for example, conceived, constructed, and creatively employed artifacts, such as woven branches or stone tools, before anything about the process was “scientific” or “artistic.” Timely pragmatism – realizing an advantage, in other words – trumped systematic observation or controlled experimentation, as well as creative expression for its own sake. Yet, despite its prehistoric origins, design has not achieved the same distinctiveness as a field of knowledge that art and science enjoy. This is partially because the term is used so widely and in so many ways, but also because the tradition is not as distinct from art and science as art and science are imagined to be from each other (Cross 2006, pp. 2–11).

To continue this broad introduction to design, it is useful to briefly compare and contrast design with the other two traditions. Like scientific endeavors, design focuses on answering questions through methodical processes. Unlike science, however, it does not require comprehensive explanations or repeatable solutions. Indeed, the very questions are different: science explores what is and design imagines what could be.

Differing orientations toward time further distinguish the two: science can wait patiently for its truths, but the relative immediacy of a design challenge requires some response, even if speculative and temporary. Also, scientific knowledge is generalizable, but design situates information – including the outputs of science – into specific contexts and sometimes does so in ways that are intuitive and therefore less explicable or open to standardization. This receptiveness to nonlinear, subjective, conjectural thinking makes design, to some degree, artful. Furthermore, design may employ artistic techniques as tools to make sense of ill-defined problems, which is what “designerly ways of knowing” evolved to address (Cross 2006, p. iv). It does so to suggest a practical advantage that does not currently exist, which is a goal art does not share. In other words, both appreciate originality, but for distinct reasons. Novelty is the context in which design operates as well as a necessary means to gain an advantage, either because each emerging scenario requires unique solutions or because surprise has instrumental value in a competitive environment. Originality is not, as it is for art, intrinsically valuable.

Finally, to use an even greater degree of abstraction, consider this final attempt to distinguish the three: whereas art is expressive and science is explanatory, design is expedient. In other words, it is oriented toward timely pragmatism and unapologetically embraces intuitive shortcuts (or what psychologists call heuristics) to imagine solutions that are not necessarily optimal but still sufficient. Designers, for instance, often employ nonlinear, conjectural reasoning in which prototypical ideas are quickly sketched (figuratively or literally) in iterative cycles of increasing fidelity in order to “capture, analyze, explore, and transmit” possible designs (Carlgren et al. 2016, p. 51; Dorst 2015, pp. vii–viii). “Expedient” can also indicate a degree of suitability for the moment (i.e., situational fitness), which foregrounds design’s emphasis on agile adaptation to the particular versus a comprehensive search for universally valid solutions. This is somewhat natural, since design expects, and moreover nurtures, dynamic emergence – even in the design process itself. Furthermore, the sense of expedience as opportunistic or exploitative invokes the moral ambiguity of “designing” and related words that share this embedded duality, such as planning, plotting, or scheming (Flusser and Cullars 1995, pp. 50–51). Perhaps an even more revealing synonym for design is “crafty,” which can mean both manipulating material as well as cunning machination. This further strengthens the point that design, which is an evolved capacity for agile action inspired by abstraction and imagination, offers a strategic advantage in our complex and competitive world.

To have this strategic effect, design is necessarily integrative, an adjective that is even more insightful than expedience. In one sense, this is about humble attempts to consider the totality of the circumstance that an advantage is meant to operate in and upon. Designers will inevitably employ some analytical distinctions, but for their results to operate effectively “in the wild,” they must always return to the perspective of “an integrated, complex whole that is not cleaved into clear, distinct, and separate taxonomies” (Nelson and Stolterman 2014, pp. 18–19). This orientation toward integration extends beyond how the situation is viewed and onto “designerly ways of knowing, thinking, and acting” (Cross 2006, p. 100). Design techniques, some of which are mentioned above, can be framed as integrating diverse perspectives, integrating solutions from dissimilar areas or past precedents, integrating the search for solutions with the understanding of the situation, or integrating disparate thinking styles in a single individual or in design teams. There is also integration in the sense of the executive skill that carefully assembles a portfolio of ideas, procedures, and perspectives for a specific situation.

In design, pragmatic results supersede strict adherence to a formal process. Thus, a multidisciplinary approach is natural; an ill-defined, open-ended problem necessitates an ill-defined, open-ended strategy. Indeed, design could be understood as not just multi- or interdisciplinary but “undisciplined,” because the artificial constraints inherent in academic paradigms have less authority when the focus is on finding what works. So, to craft advantages, designers adopt and adapt whatever information or methods may be useful, including those from art and science. Again, those “two cultures” emerged well after humans honed the innate “designfulness” we still employ daily, though somewhat unconsciously. The pervasiveness of design – our ever-present predisposition to “consider a situation, imagine a better situation, and act to create that improved situation” (Dorst 2015, p. vii) – may also contribute to its relatively hidden status.

The lack of widespread appreciation for design as a distinct and separate tradition does not mean this third culture has been static. Just as art and science have subdivided multiple times over, design has done the same, evolving well beyond the type of design that enabled humanity to advance from those humble origins. In time, multiple movements emerged, each with a varying affinity or aversion for movements in art or science. From the earliest distinct forms of design (e.g., architecture), design now includes industrial design, graphic design, organizational design, experience design, transportation design, process design, systems design, urban design, and many more. A full exploration of these activities, including their relationships to the core of design and to each other, is beyond the scope of this work. Instead, the following section sketches a rough historical context from which military design emerged and within which it continues to evolve today.

Making History

Over time, our species learned to exploit the opportunities and mitigate the dangers of a variety of environments. Toolmaking was central to this development, creating a positive feedback loop that functioned as both a cause and effect of human evolution. Of course, evolution has been, and always will be, interconnected with the physical, mental, social, and ecological dimensions of life.

As humans spread across the world from our origins in Africa, their settlements also grew denser, particularly after the “Agricultural Revolution” began in the area in and around modern Iraq (Christian and McNeill 2004, pp. 140–145; Harari 2018, pp. 77–78). Some places, by virtue of the environment, the number of people living there, and how those people were organized, were able to support the deliberate production of food. This was a function of imagination that involved deliberate exchange of materials between groups, domestication of selected flora and fauna, and thinking on a much longer time horizon for planning than required for a mode of existence based on hunting and gathering. Some theories assert an even greater role for the human capacity for abstraction and suggest, at least for some locations, ideological and cultural factors may have created the original impulse for agriculture (Christian and McNeill 2004, pp. 207–243; Harari 2018, pp. 89–90, 100).

Agriculture was neither a quick nor complete transformation, but for those areas that settled into the sedentary construct, some common innovations emerged.

Specialization is one example. In an earlier time, everyone did everything that anyone could do. The rise of civilizations, however, created a need and a capacity for individuals to focus on particular tasks that would, in sum, support the functioning of the entire group. This included efforts to produce food; to construct artificial environments to grow, water, or store food; to protect people and resources from internal or external threats; and to enforce various roles and their relationships to each other through bureaucratic, legal, and religious structures (Christian and McNeill 2004, pp. 245–252, 274–282; Harari 2018, pp. 99–103). In each of these cases, people realized advantages by imagining them in the mind and then implementing specific designs in the world. Traces of modern design are discernable in the specific cases in which individuals were directed to execute someone else’s idea, but there is little evidence this was a common distinction or that they consciously developed the expediency and integration associated with design. Regardless of the status of “design” at the time, material intervention, powered by imagination, continued to play a role as civilizations expanded east to Asia and west to the Mediterranean Sea.

Despite some Eurocentric tendencies in world history that suggest otherwise, all areas continued to experience technological, social, political, and economic innovations. Still, the intellectual history of the Anglosphere is influenced more by developments in the areas southeast of Europe, including their evolving attitudes toward design. One notable inflection point occurred in the city-state of Athens around the turn of the fourth century BCE. In the centuries leading up to this moment, oral traditions captured a holistic conception of wisdom and work (i.e., thinking and doing). Homer’s written version of these stories, The Iliad and The Odyssey, honored the practical intelligence of craftsmen. In Homeric epics, which played a key role in aristocratic education, it is possible to identify design in the thinking that informed material production. Moreover, because “designerly ways of knowing and acting” enjoyed an honorable reputation, metaphors of design were expected to inform everything from statecraft to morality. This integrated view, however, suffered a near-fatal blow from a young Athenian who went on to become the most influential philosopher in Western culture: Plato (Trew 2015, pp. 11–12, 21).

“The safest general characterization of the European philosophical tradition is that it consists of a series of footnotes to Plato” (Whitehead 1979, p. 39). A core tenet of his philosophy was the claim that “changeless, eternal” forms existed and could only be discovered through rationality. Most people, however, were incapable of seeing past the “shadows” of reality; only philosophers could discern and explain abstract truths (Stumpf 1995, pp. 58–59). In contrast, the “useful arts” of design were accessible, contextual, and based on tacit knowledge. Furthermore, Plato introduced a rigid hierarchy into his distinction of thinking versus doing, with the head literally and metaphorically above the hands, which were downgraded to mere instruments to manipulate the environment. This bias persisted as the locus of Western civilization shifted west to Rome and then north to Europe and then west again across the Atlantic. Indeed, at the risk of smoothing over a large span of history and counter-ideas, Plato planted seeds across the globe that would fully sprout centuries later in the Modern Era.

The transition from the Middle Ages to modernity is defined by a variety of interdependent changes occurring at different rates in various places and with different results. In general, however, there was a rediscovery of classical Greek and Roman philosophies, the emergence of the international system which sanctified the sovereignty of the state, and global exploration by the most powerful of those states.

Scientific advancements – which for some are the essential feature of this transition – included metallurgy, steam-powered machines, Newtonian physics, and the empirical process for discovering (really, designing) new knowledge itself, the “scientific method.” The focus on rationalism continued to build momentum in the subsequent “Age of Reason” and became “an unstoppable secular trend…expected to squeeze out emotions and romance, thereby removing intrusive sources of error and uncertainty” (Freedman 2013, p. 609; Bousquet 2009, p. 41). What was accurate and useful for mechanical systems was extended beyond the physical world: The concept of efficiency became a measure of human activity and some even claimed the entire universe could be understood as “clockwork” (Alexander 2008, pp. 1–3; Bousquet and Curtis 2011, p. 45).

“Manufacturing” Design

Setting aside trends in intellectual history and the various periodization schemes based on those trends, the dominant mode of production across all civilizations remained relatively unchanged for millennia after Plato. Despite his disparaging remarks, most work was accomplished by animal power, including that of Homo sapiens. Despite the increasing scope and depth of specialized labor, design was not a distinct activity; the conception of an artificial intervention was tightly coupled with its actual construction. Artisanal production was necessarily small scale and tailored to an individual’s gifts and resources as well as the means and desires of the user (which may well have been the artisans themselves). There was little impulse – and few practical instruments available – to rigorously codify knowledge, portions of which were largely tacit and intuitive anyway. Transmitting the craft tradition to future generations was done through lengthy apprenticeships. Something entirely different, however, emerged with the Industrial Revolution that started in England in the eighteenth century.

Amidst the acceleration of technological innovations and the accompanying social, economic, and political changes, design finally emerged as a definable field of human activity. While historians of technology justly remind us that craft production continued (just as “obsolete” technologies persist well past their spotlight in conventional histories), work that was amenable to industrialization radically changed in character (Edgerton 2011, pp. ix–xvi, 52, 74). Rudimentary machines and the large capital investment to create them meant production became centralized in factories within which the manufacturing process was dissolved into repeatable, standardized steps. Society pivoted to respond to and support this shift, which increasingly delivered palpable changes to the human experience in multiple waves of industrialization that spread into other fields and other areas of the globe.

These trends further strengthened Plato’s separation of thinkers and doers. Traditionally, even well after he wrote The Republic, designing remained an integral piece of a seamless process of realizing advantage. Thus, in architecture – perhaps the first distinguishable subdiscipline of design, born of civilization’s demand for and means of creating artificial environments – the architect was the master builder. Plato’s division really took hold in the Industrial Revolution, which accelerated design in at least two new directions. One developed into the field of industrial design that engineered the increasingly sophisticated machines (indeed, design is related etymologically to “machine”) (Flusser and Cullars 1995, pp. 50–51). The other, which was not formally identified as design until centuries later, was the management of the humans who operated among the mechanized coworkers. Again, the latter field imported ideas from the former, with references to people as “cogs in the (organizational) machine” and efforts to find efficiencies through analytical reductivism (i.e., reducing all tasks to elementary parts, presuming the whole is no more than the sum of its parts) (Bousquet 2009). In fact, at the turn of the twentieth century, “scientific management” became a “secular religion” and its prophet Frederick W. Taylor preached the gospel of the “one best way” to approach any problem (McChrystal et al. 2015, p. 45). In the spirit of Plato, this wisdom was the exclusive domain of a few: “I have you for your strength and mechanical ability,” Taylor is often quoted saying, “We have other men paid for thinking” (as cited in Kelly 2018, p. 13).

Simultaneously, many of those involved with the design of material objects, including the built environment, were equally enamored with objectivity, rationality, and scientific approaches. One contemporary designer noted the shift, claiming “Our epoch is hostile to every subjective speculation…The new spirit, which already governs almost all modern life, is opposed to animal spontaneity, to nature’s domination, to artistic flummery. In order to construct a new object, we need a method, that is to say, an objective system” (emphasis added) (Vermaas and Vial 2019, p. 352).

Throughout the first half of the twentieth century, literal and metaphorical “industrialization” continued across multiple domains of human experience, beyond the obvious technological advancements. Educational theories, corporate structures, urban development, and even fiscal policy (in the aftermath of the Great Depression caused by unrestrained market forces and mass panics) all reflected an emphasis on rational, objective control by managers (Freedman 2013, pp. 460–461, 494–502).

Managers became the manifestation of Plato’s declaration that people must be led by the intellectual elite. To support this new breed of professional, management thus became an academic discipline. Developments in social science and operations research inspired its mental models, goal-oriented and sequential planning became its mantra, and hierarchical organization became its means. Managers designed highly orchestrated courses of action to achieve preordained objectives and then directed others to execute their plans with no feedback mechanism to test assumptions or account for changing conditions (Liedtka 2000, p. 8).

A figure who aptly represents the spirit of this age, and who will come up again in the next chapter, is Robert McNamara. Previously a professor at Harvard Business School (an institution highly influenced by Taylorism), he later joined Ford Motor Company as it began modernization efforts in the 1940s. According to Freedman, McNamara and like-minded individuals “epitomized rationalism in decision-making, deploring reliance on intuition and tradition, and were unbothered by their lack of industrial experience.” And they were rewarded for their stance. As a signal of the growing power and prestige now afforded to managers, the former accounting professor became Ford’s president for a short period in 1960 before accepting a position in President Kennedy’s cabinet (Freedman 2013, pp. 461–462, 501).

It was not just scientifically minded managers who enjoyed heightened stature in their field. Many professional designers continued to gain prominence by advancing the earlier trend toward a science of design. The same decade that McNamara was elevated into national politics, the design methods movement appeared and iconoclast Buckminster Fuller declared it the “design science decade” (Cross 2006, pp. 96–99).

The same trends imported into the business world – rationality, order, precision, and predictability – were applied to the design of industrial products, architecture, and urban areas. Herbert Simon’s 1969 work, The Sciences of the Artificial, sought to establish “a body of intellectually tough, analytic, partly formalizable, partly empirical, teachable doctrine about the design process” (Simon 1996, p. 113). Notably, the author’s background as economist, political scientist, and cognitive psychologist reveals what influences were considered relevant for designers.

His book is an insightful inflection point for capturing trends in the design discipline. Simon and others increasingly promoted a positivist design logic in which problems are broken down into precise elements so that novel solutions can be created in the reassembly of the isolated parts. Yet, while The Sciences of the Artificial is often seen as the culmination of scientific design thinking, it would also facilitate a counterculture design movement. Some began challenging the field’s obsession with rationality and rigid reductivism and critiquing the industrialized methodology as insufficient and even counterproductive. Alexander, who worked at the forefront of the design methods movement of the 1960s, reversed his support for the positivistic approach and suggested designers should “forget it…forget the whole thing” (Alexander 1971, pp. 3–7). Papanek later described this as a schism between designers who “are trying to make the design process more systematic, scientific, and predictable, as well as computer-compatible” and opponents who “follow feeling, sensation, revelation, and intuition…‘seat-of-the-pants’ design.” In his opinion, the rational approach leads “to reductionism and frequently results in sterility and the sort of high-tech functionalism that disregards human psychic needs at the expense of clarity” (Papanek 1988, p. 4).

Similar doubts of pure objectivity and mechanical causality emerged elsewhere. New theories of complexity highlighted the nonlinearity of many systems (Bousquet and Curtis 2011); public policy experts looked for alternative decision-making models less dependent upon the ideal of pure rationality (Mintzberg 1978); urban planners developed the concepts of “tame” problems and “wicked” dilemmas to explain why economic logic often failed in the face of local political and social contexts (Rittel and Webber 1973, pp. 159–165); sociologists wrote about the social construction of reality (Burrell and Morgan 1979); and postmodernists highlighted the contingency of truth (Lyotard 1979).

The history of science was even reconceived in Thomas Kuhn’s highly influential work, The Structure of Scientific Revolutions. He argued persuasively that science advanced, in part, by nonrational factors including sensitivity to context, resistance to change, and a “leap of faith” toward a new paradigm when unexplainable anomalies in the current one created a crisis of confidence (Kuhn 1996, pp. 122, 154–157).

Kuhn’s explanation of paradigms was highly influential outside of his field. Indeed, society writ large seemed conscious of its position at the revolutionary moment in the Kuhnian cycle. The techno-rationalistic paradigm it had inherited increasingly appeared inadequate to describe the way the world worked or to guide effective actions therein. Sanders describes the new paradigm as a shift from a deterministic universe of atomistic agents to a dynamic world of intersubjectivity; from rigid hierarchies to adaptive networks; from reductionism to synergism; and from rational and discrete planning to reflective practice and emergent opportunities. Keywords entering Western discourse included holism, mutual causality, indeterminism, adaptive self-organization, and postmodernism (Trew 2015, p. 28). This was not just an academic exercise, however. In some cases, there were political, environmental, and humanitarian crises that cast doubt on the value of robotically adopting technological and scientific concepts into other domains of the human experience (e.g., the Watergate scandal, oil spills, acid rain, and the Challenger shuttle accident) (Hughes 2005, pp. 84–96; Pursell 2007, p. 134).

Amidst these changes, management – which, to reiterate, can be considered a form of design – reevaluated its mechanical processes, strict hierarchical structures, and its past basis in Taylorism. A softer approach gained influence, emphasizing subjective elements such as organizational culture as well as individual passion and creativity (Freedman 2013, pp. 551–556). One business book, inspired by sociology, declared, “The numerative, rationalist approach to management is right enough to be dangerously wrong, and it has arguably already led us astray” (Peters and Waterman Jr 1982, p. 29). Even Peter Drucker, who was credited with creating the discipline and who gladly carried on Taylor’s legacy, acknowledged the limits of reason and endorsed nonrational factors such as intuition. He conceded that the separation of those who design plans and those who implement them was a “dubious and dangerous” construct and that adaptation to unpredictable contexts necessitated some degree of decentralization (Drucker 1986, p. 284). In 1973, the same year Horst Rittel and Melvin Webber introduced the concept of “wicked problems,” the oil crisis demonstrated the limits of corporate design to handle complex changes. Within a decade, Jack Welch led General Electric to abandon its highly formalized, highly analytical design process, rejecting the “endless quest by managers for a paint-by-numbers approach, that would automatically give them answers.” The chief executive also noted that “any cookbook approach is powerless to cope with the independent will, or with the unfolding situations of the real world” (Freedman 2013, pp. 503–504, 528). In response, Mintzberg noted that the most well-designed plans build in feedback so that “deliberate” intentions could be balanced with unforeseeable “emergent” factors (Mintzberg 1994, p. 111). Altogether, this “social turn” in the discipline reemphasized how to realize strategic advantage: Design is a learning contest reliant upon dialogue, experimentation, agility, and even a healthy dose of humility; design has never been a purely objective, scientific endeavor.

A better understanding of design essentially revitalized the subjective elements Plato denigrated centuries before. This, in turn, implied a need to explore more sophisticated descriptions of human cognition. Indeed, Simon himself contributed to the notion that rationality was, at best, “bounded” to specific cases outside of which we relied upon heuristics as expedients to decision-making (Cross 2006, p. 99; Dorst and Dijkhuis 1995, pp. 261–274; Simon 1982). The implication for design research was to explore, as the title of Bryan Lawson’s book clearly identifies, How Designers Think. By studying professional designers, scholars hoped to capture the essence of design, even as it continued to fracture into various subfields. Donald Schön, for example, highlighted the process of “reflective practice” including the essential role of deliberately and iteratively “framing” the problem itself: to “set its boundaries, select particular things and relations for attention, and impose on the situation a coherence that guides subsequent moves.” As this suggests, their work not only identified techniques (heretofore treated as nearly mystical and thus unstudiable), but also helped clarify the nature of design problems. “Artistic, intuitive” processes of “design thinking,” he continued, were the only appropriate response to situations of “uncertainty, instability, uniqueness, and value conflict” (Schön 1984, pp. 22, 49). Indeed, as these challenges plagued a number of fields, design thinking continued the trend – already operating in management theory – toward becoming “unbound” from its historical basis in the manipulation of material artifacts (Pendleton-Jullian and Brown 2018).

This did not mean that all versions of design pivoted to a new paradigm.

Industrial designers, who necessarily remained very “bound” in the physical world, continued to hone their engineering approach to realizing advantages. Indeed, in complicated systems – those amenable to reductivistic logic because their many parts involve known, stable causal relationships – it is useful and appropriate to codify “best practices” into doctrine or checklists (Ackoff 1981; Cross 2006, p. 27; Snowden 2007, pp. 58–60).

Procedural knowledge certainly facilitates learning more efficiently than the time-honored tradition of craft apprenticeship. Efficiency is, however, meaningless when there is no effectiveness. In other words, while there are certainly advantages to making otherwise unspoken processes explicit, there is a risk that tacit elements are lost or that doctrine is inherently obsolete because, as Ian Eishen offers, the “checklist is the best way we knew how to do something yesterday” (personal communication, December 12, 2021). The use of routinized wisdom should therefore carry a warning label that the information is perishable in the face of any complexity. Thus, when doctrine does not match the context, designers must take the difficult step of “dropping their tools,” as sociologist Karl Weick writes. For many, one particular school of design has come dangerously close to misrepresenting the artistry of design thinking as a “miracle cure” (Carlgren et al. 2016, p. 39) using simplistic, “Mad Libs”-style, fill-in-the-blank poetry.

Designing Mainstream Design

“Design Thinking,” herein distinguished from earlier, more generalized insights by its capitalization, emerged from noble intentions. Industrial design, though focused on complicated systems and engineered solutions, is still vulnerable to the “wickedness” of the complex: products are imagined by humans, intended to realize advantage for humans, and constructed by humans. And humans, by nature, introduce unpredictable emergence into everything. Thus, even the most elegant technical design fails if it does not account for the human element. So, just as management began to integrate sociological factors, professional designers did the same (lending further support to design as “integrative”). Methods, organized in categories titled “empathy” or even “ethnography,” formed what has become known as “human-centered design.” As summarized by Dr. Ben Zweibelson, this evolution “shifted some of the purely analytic-based optimization mindset of industrial design towards subjective aspects of the complex social-economic qualities of the human condition. Here, empathy, multiple perspectives, paradox, and complex dynamic systems would soften industrial design” (Zweibelson 2019, para. 14). Other researchers highlight additional characteristics, including creative, conjectural thinking; promoting diversity of ideas and individuals; rapid iterations to visualize abstract concepts and to prototype solutions; a tolerance for ambiguous, risky situations; and reframing not only the initial challenge, but also reimagining rapid failure as important to a designful learning process. Harkening back to the start of this chapter, some note that it is the integration of these elements that may be “the key to understanding Design Thinking” (Carlgren et al. 2016, pp. 50, 53; Micheli et al. 2018, pp. 10–13).

However defined, by the end of the century, Design Thinking was well established in both the commercial and academic ecosystems of Silicon Valley. This was largely due to emerging leaders in each of those fields. In 2004, Stanford University founded the Hasso Plattner Institute of Design (otherwise known as the “d.school”) to teach the design methods that had been successfully honed over the previous decades by the nearby design firm IDEO (Katz 2017, pp. 144–145). Other commercial and academic entities have since entered this exchange, but Stanford and IDEO remain the most influential. The result is that their particular style of design thinking has become commodified into Design Thinking.

This was not without costs, however. First, the distinction between design thinking and Design Thinking is rarely noted, leading to further confusion about design. Second, the history of design theory was omitted; as one analysis reported, “for most designers, design has no history” (Rodgers and Bremner 2017, para. 15).

Furthermore, to make it easier for nondesigners, complex methods were boiled down to formulas (numerous templates, including the “d.school bootleg,” are available online). This invites the same criticism levied against positivistic design. Peter Rowe, for example, writes in Design Thinking that “in the real world…we discover there is no such thing as the design process in the restricted sense of an ideal step-by-step technique” (Rowe 1998, p. 2). Moreover, as many organizations adopted a “linear, gated, by-the-book methodology,” they dulled the potential of design into a force that was capable of “delivering, at best, incremental change and innovation” (Nussbaum 2011, para. 4).

Naturally, some began wondering how design ever became so popular (Kimbell 2011, pp. 286, 293).

Even with (over)simplification, there is a lack of consensus, with “stark conceptual divides over the very definition” of Design Thinking (Carlgren et al. 2016, pp. 40–41). Yet, despite all the downsides and debate, this democratization of design has arguably still been an important advancement. It gives real energy to the claim of many design researchers (and this chapter) that design is a “fundamental form of human intelligence” honed by professionals, but accessible to all (Cross 2006, p. vi). The scaffolding it offers eases the application for novices (though, like actual scaffolds in building construction, it should support learning only until it can be removed) and is an initial step toward new language, models, and methods to fully express design’s transformative potential. Finally, while the approach began with an orientation toward novel products and other commercial innovations, the empathic, human-centered approach is always relevant given that design is always done for, with, and through humans. It is, therefore, a positive trend to see Design Thinking moving across to new fields, including democratic governance, ecological resilience, personal wellness, and education, and up into higher echelons of organizations (Brown 2019, pp. 7, 37, 149; Kimbell 2011, p. 287; Micheli et al. 2018, pp. 1–2, 18; Nelson and Stolterman 2014, pp. ix–x). The integration of design into executive levels of business now links it explicitly with the scholarship on management, with some advocating that “design thinking needs to pervade everything business students do” (Dunne and Martin 2006, p. 522). Moreover, the trend has contributed to a greater appreciation of design and “designerly ways of thinking” as both “central to modern ways of working” and as our entire “way of life” (Krippendorff 2000, p. 3; Lloyd 2019, p. 177).

To repeat an earlier criticism, what Design Thinking does not do is situate the discipline within history. Yet, design – as the mental conception preceding the material intervention with the intent to create an advantage – has always been central to human existence. Granted, it lacked distinction until civilization and specialization intensified, giving rise to architecture as the first discernable design field. Eventually, the Industrial Revolution created the impulse to privilege rationally ordered manufacturing over artisanal production, and two other distinct fields of design were born: industrial design and management. Numerous other design fields also emerged, each with divergent sets of skills based on its medium, intellectual traditions, and a number of other idiosyncratic factors.

Admittedly, the diversity of current design activities stretches the conception of a singular, coherent discipline. Still, there is a shared essence, or what Wittgenstein called a “family resemblance” (Lawson and Dorst 2009, p. 24): They share tendencies toward integration and expedience to realize advantage in complex contexts that are not amenable to optimization or purely objective analysis (even if humans were capable of performing such feats). This includes an area of design – largely omitted from the scholarly or popular discourse on design – that not only participates in these crafty ways, but perhaps best exemplifies them: the use of design in warfare.

Designing Advantage

Design and security are fundamentally intertwined, and the following chapter will expand on their connection, restoring the linkages deliberately omitted from the historical sketch above. For now, however, it is useful to briefly reflect on the generic language used in this chapter to introduce design, for it gestures to how the security context fits naturally into the (pre)history of the discipline. First, design is related to realizing advantage in dynamic environments. The links to security are obvious. The threat or use of force for political effect creates (or reacts to) conditions that are not only volatile, but decidedly consequential as well. The setting is arguably more complex and more vital than the commercial realm that design is more closely associated with, which is why military designers may gain insights hidden from those designing products, experiences, or managerial processes. Furthermore, this reconnects “design” to feigning, cunning, and manipulation. These are all synonyms (and implications) civilian designers tend to ignore (Wendt 2017, pp. 4–9) and connotations that bring generic design even closer to this handbook’s military focus.

Ultimately, among all its subfields, military design uniquely links the discipline to its historical function of leveraging imagination for strategic effect. This relationship may be clouded by the timing in which the phrase “military design” first appeared. As the next chapter will detail, the term gained usage in the late twentieth century, implying it was simply another offshoot of the design discipline as it was splitting into many different directions. In actuality, the way design thinking has been used for warfare is more independent, having developed with – and influenced – all human history. In other words, even though it is often ignored by civilian designers, the use of design in military contexts is closer to the origins of design and how it realizes advantage in wicked contexts through integration and expedience. Indeed, military designers test their craft in the most wicked of human endeavors: the use of organized violence for political effects, which is eternally cursed – as Clausewitz hauntingly reminds us – by the paradoxical “trinity of friction, chance, and uncertainty” (Gray 1999, p. 41).

Cross-References