1 Introduction

Deterrence revolves around the mind; in essence, it is a psychological game that is played to influence the decision-making process of another actor. Patrick Morgan defined the nexus of deterrence and psychology succinctly as follows: ‘Deterrence is undoubtedly a psychological phenomenon, for it involves convincing an opponent not to attack by threatening it with harm in retaliation. To “convince” is to penetrate and manipulate the thought processes of the opposing leaders so that they draw the “proper” conclusion about the utility of attacking’.Footnote 1 Indeed, deterrence, as Freedman and Mazarr recount respectively in Chaps. 1 and 2 of this volume, aims to dissuade an opponent from taking undesirable actions. Clear communication of demands (a red line, for instance), coupled with a credible threat to inflict pain if necessary, and a demonstration of resolve are essential elements for creating effective deterrence. Success crucially also depends on whether the opponent receives the intended signal, interprets it as intended, and perceives the message as congruent with reality, i.e., believes that the sender can make good on the threat. Success furthermore assumes that the demands communicated are acceptable. If those prerequisites exist, theory suggests a rational actor will back down, after weighing the benefits of the envisioned actions against the potential costs that may result when the threat is executed.

This chapter offers a synthesis of insights that have appeared since the 1980s that fundamentally challenge that assumption of rationality. This contribution about the workings of the human mind concerns the various filters and cognitive shortcuts that colour the incoming stream of information and the processes to digest it and come to a decision. Just as the human body and mind are closely tied to each other, emotion too is integrally connected to one’s way of thinking and behaviour. The logic of affect, or emotional choice theory, states that decision-making is based on the dynamic interplay between one’s emotions, norms, and identity.Footnote 2 The emotional part of the mind I will leave to Zilincik and Duyvesteyn in Chap. 24 of the present book as well as to many other researchers.

This chapter benefits from ongoing research in peace psychology, which sits at one end of the war-coercion-deterrence spectrum and is organized as a division within the American Psychological Association (APA): Division 48, the Society for the Study of Peace, Conflict, and Violence. Peace psychology focuses on the application of psychology to issues of peace, conflict, and violence, often in the context of international politics. Second, the chapter builds on insights from political psychology, which is not (yet) an independent and distinctive subfield of psychology, yet in recent decades it has contributed significantly to research in international studies, international relations and foreign policy.Footnote 3 This contribution is welcome because understanding political problems requires an integrated approach, one that also draws on disciplines such as political science, sociology, history and economics. Moreover, the annual meeting of the International Political Psychology Association in 1982 formed the cradle of the seminal work Psychology and Deterrence.Footnote 4

While this chapter focuses on the deterring person and the individual receiver, the deterred, it is also of value for understanding decision-making processes at higher aggregate levels (groups, military organizations and governments).Footnote 5 In her standard work Foreign Policy Analysis, Valerie Hudson deprives the reader right from the start of some illusions. The field of international relations is basically about ‘… understanding how humans perceive and react to the world around them, and how humans shape and are shaped by the world around them …’.Footnote 6 The first two of the nine major levels of analysis in foreign policy analysis that she constructed concern the individual: cognitive processes and leader personality and orientation, all of which are basic aspects of this contribution.

After a short preface about the historical developments in deterrence research, the stage is set for the deterring mind, and the corresponding eye of the beholder: six of the most common heuristics related to deterrence are discussed. Framing as a major bias is then introduced, followed by prospect theory. Finally, some of the consequences of using heuristics, defensively as well as offensively, are discussed.Footnote 7 The final conclusion can be summarized as follows: when deterring, know yourself well, very well, as well as your opponent, and work in a team.

2 Rationality and the Evolution of Deterrence Theory

Jeffrey Knopf usefully introduced the idea of four waves in the deterrence literature. The first wave took shape right after World War II, with the invention of the atomic bomb and the formation of the new power blocs.Footnote 8 The second wave emerged in the 1950s and 1960s. Fuelled by research at the RAND think tank and the steady involvement of its researchers in policy and strategy development (Wohlstetter, Schelling, Kahn, Kaufman), Western military and policy makers during the Cold War embraced the rational actor model (RAM) to plan strategies of nuclear deterrence. This classic deterrence theory, which in practice culminated in mutual assured destruction (MAD), is a stark tit-for-tat game without an option to transform a competitive relationship into a cooperative one. With regard to the USSR, this strategy was developed under the assumption that Soviet leaders, too, would think and act as reasonable, rational humans.

From the 1970s on, new insights from the psychological, economic, and decision-making literature made the shortcomings of the RAM more prominent. Graham Allison’s study of the Cuban missile crisis, which became a bestseller, showed how organizational and political interests, processes and routines, and groupthink all brought the US and the Soviet Union to the brink of a nuclear exchange. Robert Jervis in turn noted that a state can rationally choose to fight a war it thinks it will probably lose if the gains of winning and/or the costs of alternative policies are great enough. Statesmen may also adopt deterrence policies that are not in the national interest because they are acting on the basis of their domestic or personal interests, so the assumption of an external threat focus is invalid. Third, states may create a confrontation or go to war not in the hope of making positive gains, but in order to avoid the losses that are foreseen unless they do so. Moreover, deterrence may fail because of misperception: one side may launch a first strike not because it is aggressive or believes that war is preferable to peace, but because it is sure that the other side is about to attack and believes that striking first is better than striking second. Jervis also argued that, because the world is very complex and people’s information-processing capabilities are sharply limited, we must all employ a number of short-cuts to rationality (the topic that will be discussed more fully below). For instance, people tend to act in accordance with theories they already subscribe to rather than with fresh data. Second, beliefs tend to be strongly influenced by historical analogies to recent important cases that the person or his country has experienced first-hand. The role of accidents and confusion tends to be underestimated too, and other states and alliances tend to be seen as much more centralized than they actually are.
And rather than integrating many values, people often decide on the basis of a single important value dimension.Footnote 9 In the 1990s, prospect theory claimed that decisions are influenced by how an issue is interpreted or framed: as gaining or losing.Footnote 10 Such insights suggest that real-world decision-making deviates significantly from the RAM. Rationality is where you stand (for).

In this third wave, the foundations of nuclear deterrence strategies remained intact. In recent decades, those foundations have started to crumble.Footnote 11 There are asymmetric threats (by, e.g., rogue states, terrorists, refugees), paired with a new dimension, cyber, and a new constellation (at least in theory): hybrid deterrence or hybrid war. Military force and (nuclear) deterrence are no longer the sole factors. Because of this, a ‘tailored deterrence’ has been adopted: a multidisciplinary contingency strategy for each state separately. Decision-making theory has developed as well. Physiological research, on the brain and in endocrinology for example, has made progress. From psychology it became clear that decision-making is driven neither by rationality alone nor by emotion and intuition alone: the two are intertwined and multi-layered.Footnote 12 Clearly, research and experience have moved us well beyond the assumptions embedded in first-wave deterrence theory, and this certainly pertains to the rather simplified rationality assumption.

3 Rationality and the Eye of the Beholder

When trying to influence the mind of the other, it is essential to know how that mind is composed, how it will think and how it can possibly be affected. As Jervis pointed out, it is hard to find cases of even mild international conflict in which both sides fully grasp the other’s views. Yet all too often statesmen assume that their opposite numbers see the world as they see it, fail to devote sufficient resources to determining whether this is actually true, and have much more confidence in their beliefs about the other’s perceptions than the evidence warrants.Footnote 13 Aware of this trap, General De Kruif, former commander of Regional Command South (RC-S) of the International Security Assistance Force (ISAF) in Afghanistan, stated that he therefore preferred anthropologists over country experts to explain Afghan tribal culture.Footnote 14 Knowing how the Afghans think and what they mean by their expressions was of great help in his regular discussions with politicians and governors. De Kruif was involved in planning and commanding operations and worked at the locations concerned. For politicians and policymakers working in their own familiar environments, with their confidence-inspiring processes, cultures and thinking habits, it is hard to picture the party or individuals on the other side (of the world).

Even in Europe, the ways in which people in different countries think and act, and even the type of communication and language politicians use, can differ markedly. Miscommunication is a constant possibility, and the greater the cultural differences between countries, the greater the probability of misunderstanding and distortion with respect to communication and action in the context of deterrence. This can be a problem from the outset: in terms of urgency and political nuances, for example, is the sender’s message attuned to the culture of the party being addressed? Moreover, many interpretations are made by all of the parties involved already in the run-up to a conflict. These interpretations are also based on assumptions about culture, strategies, policies and the personalities of the leaders. The work of, for example, Hofstede on cultural differences between countries is essential reading here.Footnote 15 Therefore, seen from a rational actor model perspective, building a deterrence strategy in a dynamic world and achieving objectives in relation to the parties involved is a blurred and befogged game. Rationality begins to fade, or rather, as Paul explains,Footnote 16 the common assumption is one of “instrumental rationality”, meaning a rationality that looks solely at the weighing of costs and benefits. Instead, or in addition, we need to acknowledge the workings of “value rationality”: values which may differ for each individual, group, or (failed) state, and which are based on ideology, religion or psychology.

The terrorist attacks of 9/11, for instance, demonstrated the impact of religious fervour. As Payne observed, religion may undermine deterrence effectiveness. For instance, the superiority of an opposing force may be an insufficient deterrent against a religiously motivated actor, and blind obedience to what is seen as divine will may actually compel battle with even a superior opponent. The faithful may be spurred on by a belief that providence will make them invulnerable and victorious, and fighting against the odds may be considered a necessary demonstration of faith. Somewhat similarly, again following Payne, ideology may cause a state to fall victim to “mirror-imaging” and assume that an adversary’s behaviour is as predictable as its own, because the adversary is motivated by the same or similar logic, values, and objectives. Mistaken threat perceptions can arise from ideological influences, leading states to perceive dangers where none exist or to ignore threats that are objectively real. Significant ideological differences between two states can result in miscommunication, both in words and in actions (force deployments and exercises, for example) intended to convey intent. Some ideologies may also encompass or produce absolute goals, the attainment of which may be worth virtually any price to an adversary, something that would undermine deterrent strategies based on the opponent weighing the costs and benefits of a course of action.Footnote 17 Any deterrence strategy should therefore be based on a proper understanding of the interplay of intangible factors such as religion, history, culture, and ideology and their impact on individual and collective cognitive processes.

4 Our Thinking Patterns: Heuristics and Biases

As Janice Gross Stein already noted, neuroscience is a very important factor when translating deterrence theory into practice.Footnote 18 Indeed, research in the fields of psychology and behavioural economics during the last three decades has shed light on the dynamics affecting rational choice processes, in particular in high-stakes contexts,Footnote 19 suggesting that heuristics, biases, stereotypes, mental models and psychological fallacies in general are omnipresent. Drawing on the seminal work of Tversky and Kahneman,Footnote 20 which should be compulsory literature for every decision maker, the following section offers a brief sketch of heuristics (rules of thumb, to put it simply) and biases (systematic errors). In our normal mode we make System 1 decisions; that is, we make decisions quickly without deliberation. To do so, relying on experience, an expert uses his skilled intuition. An amateur, not knowing the answer to a complex issue, reframes the problem in simpler terms, falling back on a heuristic. There are various kinds of heuristics (see below). These are based on intuition, built on recognition. However, because the complex issue is redefined as a simple question and the stored recognition differs (completely) from the actual context, heuristics are by definition biased. System 2, on the other hand, is the analytical process. This takes time and energy, but the outcome is more objectively argued, even though the process does not guarantee that the outcome is right, or at least more right than the System 1 solution. These thinking modes apply to deterrence as well. Deterrence is about interpretation: building the situation assessment on the basis of objective and/or subjective stimuli. Because both time and energy are scarce in times of crisis, System 1 is tempting. First, let us explore heuristics and biases.

Heuristics, those rules of thumb, are strategies derived from experience with similar problems, using readily accessible, though loosely applicable, information to control problem solving in human beings, machines, and abstract issues.Footnote 21 People use heuristics and biases to survive in our complex and challenging world. This automatic pilot enables a person to focus on activities that require brainpower. We use heuristics in our work as well as in our daily social lives, and we start creating them from babyhood onwards. These rules work well under most circumstances, but in certain cases they lead to systematic errors or cognitive biases. A heuristic is used more or less unconsciously. We have to use brainpower to identify a heuristic and discover its rationale, roots and specific construction.

Biases, on the other hand, are inclinations or prejudices for or against something or somebody. Once we have adopted one (unconsciously), there is almost no clear rational track leading from our thinking and acting back to its causes or underlying persuasions. To add to the complexity, most biases are emotionally loaded. Below, six of the most common heuristics are introduced (the anchoring, confirmation, availability, representativeness, affect and fluency heuristics), followed by a discussion of biases that affect decision-making processes.

4.1 Anchoring

The anchoring heuristic is the common human tendency to rely too heavily on the first piece of information offered (the “anchor”) when making a decision. We give disproportionate weight to the first information that we receive, especially when we have no clue (in chaotic and dynamic times or about an unknown area). Sometimes the anchor is in our memory, something that was once stored and is now possibly outdated or inaccurate in the current context. The given information, serving as an anchor (at least for the target of deterrence), can be deliberately inserted by the deterring party, which leads to framing (see below). In times of stress and chaos, the effect is difficult to avoid. It is like the instruction “Don’t think about a pink elephant”.

4.2 Confirmation

The confirmation heuristic is the psychological tendency to seek confirming evidence. It involves seeking information that supports one’s existing point of view and neglecting or ignoring signs that point to contrary evidence. It is about assimilating new information into one’s pre-existing beliefs, resulting in seeing only what one expects to be present. Ambiguous or even contradictory information is ignored, misperceived or reinterpreted so that it does minimal damage to one’s own mental model. It is sometimes hard to abandon one’s mental model of the situation and exchange it for a worse or vaguer one. In a blurred context, swapping the reliable straw that a person clings to for another straw requires mental energy and courage. The confirmation heuristic is vulnerable to biases and can evolve into tunnel vision.

An example of the confirmation heuristic is the US attitude towards Japan before the start of the Pacific War in December 1941. At the time, the US military was somewhat dismissive of the professionalism of Japanese fighter pilots and the aircraft they flew. In addition, the US Navy expected a traditional naval war between surface ships and gave little to no attention to air raids. In the period prior to 7 December, the War Department and Navy headquarters in Washington, DC, received many weak signals about Japanese plans for a massive surprise air assault on the air and naval assets at Pearl Harbor.Footnote 22 These signals, however, were not in line with existing ideas about the potency of Japanese air power and did not confirm US naval strategy.

A detail worth noting in this case is that the Japanese attacked Pearl Harbor from the north, while the nearest Japanese naval base lay to the south, at Truk Lagoon. That was why the Americans conducted aerial reconnaissance only to the south. The Japanese knew this thanks to a collaborator at their consulate, who drove to a hill every morning to observe the direction in which the reconnaissance units flew; it was never to the north.Footnote 23

4.3 Availability

This heuristic operates on the basis of a mental shortcut that occurs when people judge the probability of events according to the ease with which examples come to mind. Recognition tends to colour situational awareness and decision-making by making information that is already stored easier to recall. Because of this, one might conclude that travelling by plane is far more dangerous than driving a car; after all, almost every plane crash is newsworthy. Deterring with an action already used by some party in the past is more powerful than deterring with an action never taken before, even though the latter might have more impact when executed. Even so, the impression the events made determines how they are stored in our minds. That is why our memories are more strongly excited by a hijacked plane and the threat of the plane being flown into a building than by a hijacked cruise ship and the threat of the ship being sunk to the bottom of the ocean. The impact of deterrence depends on memory. Because of this, the press, television broadcasts, newspapers and social media are instruments for mass influencers.

A special form of influencing is priming, which makes use of strong points of reference stored in the brain. The point of reference makes it easier for the brain to think of associated topics. For example, exposing someone to the word “yellow” will make him more likely to think of “banana” instead of “apple” when asked to name a fruit. The associations are automatic routines in the brain, individually and culturally embedded, and open to conditioning (like a Pavlovian reaction).

The image of the drowned Syrian three-year-old boy on a Turkish beach on 2 September 2015 is a strong primer. Everyone felt a strong sense of pity for him and his family. The image forced us to think about the real problem, whether it was the war in the Middle East, the refugees or migrants, the people smugglers, or the attitude of the countries involved. These rational thoughts were nevertheless based on emotion. Priming by striking an emotional chord is a strong weapon. Think about the images of the Boeing flying into the Twin Towers, or the screaming, naked Vietnamese girl after a US napalm attack.

In several countries, billboards along roads display graphic, real-life images of the results of drunk or distracted driving. The purpose is to deter by confrontation.

4.4 Representativeness

This heuristic resembles the previous one. Where the availability heuristic recalls memories, the representativeness heuristic compares a situation with the mental models in our minds. These representations are stored in our minds, based on our experiences, and are used to make our daily lives easier because they do not require energy. Stereotyping and profiling are forms of this heuristic. We all have our first impressions and immediate opinions regarding Americans, Chinese, criminals, terrorists, military personnel, crime fighters or fire fighters. Reasoning by historical analogy is an example of this heuristic as well, and the heuristic is the foundation of the proverb that the military fights the current war with the doctrine, attitude and mindset of the last war. In their own contexts, would not politicians act in the same heuristic way?

For military personnel, well-known skills and drills are examples of this life-saving heuristic. In this military context, Eikmeier even refers to acronyms as powerful heuristics. Acronyms used as recognition heuristics have two functions: storage and recall.Footnote 24 In the fields of aviation, hospitals, and the military, checklists are used to ensure that protocols are carefully followed and tracked. Without doubt, checklists improve safety and enhance the quality of the processes involved. There is usually a huge world behind each item of a checklist, and the person putting a checkmark is himself a subject matter expert. In this sense, one can see a checklist as a formalized memory aid, like a rule of thumb: checklists are widely considered important tools to condense large quantities of knowledge in a concise fashion, reduce the frequency of errors of omission, create reliable and reproducible evaluations, and improve quality standards and the use of best practices.Footnote 25 Returning to the subject of deterrence, checklists, stereotypes and analogies can aid in recognizing a threat as real, but false analogies and hostile stereotyping can result in unnecessary escalation.

4.5 Affect

The affect heuristic describes the psychological process by which we are more positively inclined towards what we like. Current emotion, possibly intentionally generated, influences decisions. In other words, it is a type of heuristic in which the emotional response, referred to as “affect” in psychological terms, plays the leading role. Deterring always has an emotional dimension. Connected with this dimension are the level of interaction and the nature of the interpersonal relationship. For instance, it pertains to the distance between two persons, seated or standing, how they shake hands or react to other physical contact, poker faces and eye contact. The first impression is important, something referred to as the halo or horn effect. Music and scents play on two other senses and are used in shops, bars or restaurants to seduce.

The meetings between Vladimir Putin and Donald Trump, the two most powerful leaders in the world, are examples of talks where two completely different styles clash. Putin is a rather muscled but small person. His stone-faced behaviour betrays not a single emotion. His voice is muted and during discussions he waits for his moments to make eye contact. His background—and because of this partly his attitude and capabilities related to deterrence—is in the secret service; in working under the radar.

The US president is always physically present. When greeting, he thrusts out his big right hand, grabbing the right arm of the other with his left hand and pumping it enthusiastically for rather longer than is comfortable. With this move Trump pulls the other into his personal space, regardless of the values, norms and cultural descent of the other. He is aware of his height, build, weight and tanned skin and uses these to impress. His speech is always firm and loud. He is not accustomed to silence during talks (he repeats his short sentences), which is perhaps linked to a lack of listening.

Introvert meets extrovert: two protagonists of completely different worlds thinking about and acting on mutual deterrence.

4.6 Fluency

The fluency heuristic is closely related to the affect heuristic. It is a mental heuristic in which, if one object is processed more fluently, faster or more smoothly than another, the mind infers that this object has the higher value with respect to the question being considered. In other words, the more skilfully or elegantly an idea is communicated, the more likely it is to be considered seriously, whether or not it is logical.

Mohammed Saeed al-Sahhaf is known for his daily press briefings in Baghdad during the 2003 invasion of Iraq, the second Gulf War. He was the Iraqi Information Minister under President Saddam Hussein. His colourful and overly convincing appearances earned him the nickname “Baghdad Bob”.Footnote 26 He continually spoke in theatrical sentences and made over-the-top, even absurd, claims about enormous American losses in their cowardly missions and about Iraqi courage and heroic victories. Through his performances he made a living caricature of himself or, to put it more strongly, became the archetype of an obvious liar. The culmination in this regard was the news broadcast in which he denied that there were any American tanks in Baghdad, when in fact they were only a few hundred metres away from the press conference at which he was speaking. The combat sounds of the approaching American troops could already be heard in the background of the broadcast.

For some people, Al-Sahhaf triggered their affect and fluency heuristics. At a more abstract level, this power play in mass media is an example of deterrence, in this case luckily without teeth. He was the personified precursor of fake news.

5 Biases

Two types of biases are generally recognized. Framing is an example of a cognitive bias in which people react differently to the same information depending on how it is presented. A communicator frames by stressing certain elements and omitting less effective ones, and by linking certain causes and effects to make his point. It concerns the presentation of the stimulus: the description, the context it is placed in and the words and medium used. The content and packaging of the message are deliberately designed to convince the receiver to accept a specific perception. When framing, one can make use of all of the heuristics mentioned. These mental shortcuts are easy prey to begin with; framing techniques make them even more vulnerable to manipulation.

When the sender crosses a certain boundary or enters a grey zone, framing becomes fake news. In the case of deterring, fake news can be a serious weapon if the other party does not know whether the sender is serious or not. Because of this obscuring move, the receiver cannot look into the mind of the sender and loses some clues in relation to the bigger picture or situational awareness regarding the sender. In a hybrid war, for example, given the speed and quantity of social media, framing is an effective tool to plant suspicion and doubt in people’s minds. Framing is also the cornerstone of a successful stratagem. A magician makes use of the same psychological effect: he primes his audience by imperceptibly transmitting certain stimuli. Framing is basically a marketing technique. In the case of product selling, for instance, advertisers try with the help of music, photos, movies and text to create an ideal context for each of their target audiences. The way that a product, and even a problem or an issue, is framed can profoundly influence the choices that one makes.

Soon after the two planes flew into the skyscrapers in New York on 9/11, CNN showed live broadcasting with “America under Attack” constantly visible on the screen; quite a statement, even in the chaotic “fog of attack”. Such written words or short sentences have a direct impact on TV viewers. With speech it is more apparent, but every written word evokes emotion as well and resonates with our feelings or affect; compare, for example, “chaos” versus “disorder” or, in the case of the 9/11 example, “America under Attack” versus “Four planes hijacked: aiming for selected buildings”. One of the lesser-known changes of 9/11 was that the attacks prompted news networks to introduce the scrolling news ticker at the bottom of the screen, a powerful medium for a big audience. Since his election as president, the same has applied to Trump’s tweets. It is a weapon that the current US president uses very regularly for deterrence, presenting complex and interrelated challenges as straightforward, single-dimension issues. Intentionally or otherwise, he is a master of framing.

5.1 Prospect Theory

A second bias concerns how leaders deal with risk, and an overview of biases and heuristics as related to deterrence would not be complete without prospect theory. Introduced by Tversky and Kahneman, and explored further by Jack Levy, this is a behavioural economic theory that describes the way people choose between probabilistic alternatives that involve risk, where the probabilities of outcomes are known. The theory states that people make decisions based on the potential value of losses and gains rather than on the final outcome, and that people evaluate these losses and gains using certain heuristics. In contrast to rational choice, prospect theory finds that decision makers do not maximize in their choices, are apt to overweight losses with respect to comparable gains, and tend to be risk averse when confronted with choices between gains while risk acceptant when confronted with losses.Footnote 27
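These claims can be made concrete with the standard value function that Tversky and Kahneman later fitted to experimental data: subjective value is measured relative to a reference point, is concave for gains, convex for losses, and steeper for losses than for gains. The sketch below is a minimal illustration of that functional form, using the parameter estimates commonly attributed to their 1992 cumulative prospect theory paper (alpha = beta = 0.88, lambda = 2.25); it is an aid to the reader, not part of the chapter's argument, and the parameter values are an assumption of the illustration.

```python
# Minimal sketch of the prospect-theory value function (illustrative only).
# Parameters are the commonly cited Tversky & Kahneman (1992) estimates.
ALPHA = 0.88   # diminishing sensitivity for gains (concavity)
BETA = 0.88    # diminishing sensitivity for losses (convexity)
LAMBDA = 2.25  # loss aversion: losses loom larger than equivalent gains

def value(x: float) -> float:
    """Subjective value of outcome x relative to the reference point."""
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * ((-x) ** BETA)

# Loss aversion: a loss of 100 hurts more than a gain of 100 pleases.
assert abs(value(-100)) > value(100)

# Risk aversion in the domain of gains:
# a sure 50 is valued above a 50% chance of 100.
assert value(50) > 0.5 * value(100)

# Risk acceptance in the domain of losses:
# a 50% chance of losing 100 is preferred to a sure loss of 50.
assert 0.5 * value(-100) > value(-50)
```

The three assertions reproduce, in miniature, the three deviations from the rational actor model that the text names: overweighting of losses relative to gains, risk aversion among gains, and risk acceptance among losses. The reference point itself is set by how the situation is framed, which is why framing and prospect theory belong together in this section.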

Recently, McDermott wrote an interesting book about this theory, with historic examples of US foreign policy decisions made by the president.Footnote 28 These cases all involve time pressure, high stakes, conditions of uncertainty and secrecy. When balancing and acting between deterring and coercing, one of the parties might interpret the situation at hand as a negative, losing one. In this scenario, prospect theory predicts an increasing chance that the inferior party will take riskier and probably unforeseen actions. Emotions such as fear of losing the conflict, shame, honour or loss of face, and loss of credibility and status may play a role here. Fear leads to three coping mechanisms: freeze, flight or fight.Footnote 29

In a deterrence context, this can result in unforeseen actions that have a strong escalatory effect.Footnote 30 Risk-acceptant and non-maximizing behaviour, together with the effects of fear, is not automatically integrated into traditional models of deterrence that assume the rational actor perspective. In short, echoing Jack Levy, applied to deterrence dynamics, the result is that leaders are inclined to take more risks to maintain their positions, reputations, and so on, than to enhance them. Having suffered losses, leaders are averse to accommodating those losses and are instead willing to engage in excessive risk-taking to recover lost territory. This also explains why, in principle, it is easier to deter an adversary from taking an action than to compel him to terminate an action or undo what he has already done. Similarly, it is easier to deter an adversary from making gains than to deter him from recovering losses.Footnote 31

6 Conclusion: You Think, Therefore I Can Deter You?

Everybody is prone to these heuristics. The biggest advantage of heuristics is that these pre-programmed processes in the brain are fast and frugal. Heuristics are snapshots of the mind. But when there is no hurry, how can one hold one’s horses and take time to make an analytical sweep? Or, when under stress and in chaos, how can one, despite being alarmed, take a deep breath and count to three? Or, climbing out of the foxhole equipped with skills, drills, and automated thinking patterns, switch to a more rational state? A valuable observation in relation to decision-making is the following: ‘At every stage of the decision-making process, misperceptions, biases, and other tricks of the mind can influence the choices we make. Highly complex and important decisions are the most prone to distortion because they tend to involve the most assumptions, the most estimates, and the most inputs from the most people. The higher the stakes, the higher the risk of being caught in a psychological trap’.Footnote 32 Therefore, forewarned is forearmed.

In face-to-face negotiations, the whole spectrum of affective behaviour, language, and setting can be used to create a pressing or relaxing ambiance; from a poker face giving away no clue, to acted, fake emotion intended to provoke pity or, on the contrary, to deter seriously. A famous example of the latter is a case ascribed to Hitler. When a British emissary arrived in July 1938, Hitler was not yet in the required mood: “Gott im Himmel! Don’t let him in yet. I’m still in a good humour.” According to his assistants he proceeded “to work himself up until his face darkened, he was breathing heavily, and his eyes were glazed”.Footnote 33

Richard Nixon toyed with the idea of pretending he was losing his reason under the domestic and international pressure to end US military involvement in Vietnam and end the war. One day, on a walk along a beach in California, he told Bob Haldeman, his chief of staff: ‘I call it the Madman theory, Bob. I want the North Vietnamese to believe I’ve reached the point where I might do anything to stop the war. We’ll just slip the word to them that, “for God’s sake, you know Nixon is obsessed about Communism. We can’t restrain him when he’s angry — and he has his hand on the nuclear button” — and Ho Chi Minh himself will be in Paris in two days begging for peace’.Footnote 34

We should be aware of these heuristics and biases, and of the fact that we are by nature not rational beings. One should read about, listen to, watch, and learn about these psychological processes. System 2 is more analytical, but it too remains fallible, given the realities of how the human mind works. Working with and reflecting in a group is a second line of defence. Not only because of this pitfall, but out of striving for professionalism as a whole, former US president Barack Obama took steps to minimize the potential for “groupthink” to affect his political decision-making. As Coile observed, Obama deliberately sought strong personalities and strong opinions for his national security council. He insisted he wanted advisers who would push back and challenge his assumptions. ‘I think that’s how the best decisions are made’, he said. ‘One of the dangers in the White House, based on my reading of history, is that you get wrapped up in ‘groupthink’, and everybody agrees with everything, and there is no discussion and there are no dissenting views’.Footnote 35

On the offensive side of deterrence, one has to influence the (unconscious) mind of the opponent. This requires a good understanding of the individual personalities of the leaders, their potential biases, their historical frames of reference, the frames they have been using in the media to signify the nature of the crisis, and their relative power position vis-à-vis potential domestic political rivals. Similar information needs to be obtained concerning the group of officials included in the decision-making process. In short, awareness of heuristics and biases, one’s own and those within the leadership of the opponent, is essential for coping with the challenges of deterrence.