Introduction

Much research has focused on the impact of different stressors – such as sleep deprivation, fatigue, injury, demands on sustained attention, and multitasking – on the battlefield performance of military personnel. Less appreciated is that many of the same stressors affect performance and learning during routine activities (e.g., training). By design, after all, military training addresses “all aspects of performance under highly stressful and adverse conditions…and…Preparations in this area are key to developing mental toughness and resiliency” (Department of National Defence 2014). The present chapter therefore reviews the impact of a wide range of stressors on learning and performance in training, evaluating their effects on outcomes relevant to the military.

Before beginning, however, it is important to situate training-related stress and performance within the larger body of work in military behavioral sciences. In her overview of this large, interdisciplinary domain, Goldenberg (2021) noted that much of the work in military behavioral sciences falls within the social sciences, although the research also extends into allied disciplines such as physiology, human performance, human factors, and operations research (see Straver 2021). Within this larger sphere, research into training-related stress and performance falls largely within the disciplines of psychology and physiology, as researchers aim to understand the impact of a wide range of stressors – both actual and simulated – on psychological and physiological functioning. For this reason, most researchers focusing on training-related stress and performance subscribe to biopsychosocial models of stress, in an effort to quantify the effects of stress on outcome measures that signal biological as well as psychological functioning, including social aspects thereof.

Methodologically, the bulk of the work involves manipulating stress in the context of training and measuring its effects on a wide range of performance-related measures in human psychology and physiology. The manipulation of stress can take many forms, ranging from relatively acute stressors such as short-term sleep deprivation (i.e., sleep deprivation under 48 h) to more chronic stressors such as captivity survival training, which can last several weeks. In terms of stress manipulation, therefore, the study of training-related stress overlaps with research on human performance, given that the latter focuses on the impact of environmental and physical stressors on human behavior and performance, including altitude, isolation, heat, cold, humidity, and aridity (see Sullivan-Kwantes et al. 2021). Indeed, research on human performance is relevant because it encompasses factors that are frequently manipulated experimentally to induce the desired levels of stress in training. In turn, the measurement of the effects of training-related stress on performance can also take many forms, ranging from assessing changes in basic biological functioning such as endocrinology (i.e., the study of hormones) to higher-level psychological constructs such as interpersonal behavior in groups. This work is motivated by the basic tenet that optimal military performance includes personal as well as interpersonal aspects, and that understanding how training-related stress affects both is necessary for the development of countermeasures against its deleterious effects, including resilience.

The Stress-Performance Relationship: Uniformity and Diversity

This chapter uses a stimulus definition of stress, according to which a stressor is any physical (including physiological) or psychological condition that necessitates an adaptive response from the organism (Byron et al. 2010). Physical stressors include factors such as altitude, noise, and temperature; psychological stressors include factors such as isolation, competition, and evaluation. The stress-performance relationship is often assumed to be strictly negative, such that stress necessarily hampers performance. However, there are theoretical and empirical reasons to believe that the relationship can take many forms (see Farrell and Jarmasz 2020). First, distraction arousal theory proposes that stress has a detrimental effect on performance because stress depletes a limited pool of cognitive resources that would otherwise be allocated to task performance. Indeed, stress can reduce performance on certain effortful cognitive tasks, such as divergent thinking, where the underlying mechanism could be the reduced availability of cognitive resources. In contrast, Byron et al. (2010) hypothesized that stress might have a positive impact on performance by increasing arousal and motivation to derive solutions (i.e., increased drive) or, cognitively, by focusing attention on salient features of the problem space that can benefit the problem-solver. Indeed, negative emotion narrows the focus of attention – a likely scenario under conditions of stress. Finally, others have proposed a curvilinear (i.e., inverted-U) relationship between stress and performance. According to Gardner’s (1990) activation theory, an increase in stress can improve performance up to a point, beyond which performance decreases. The inflection point in the curvilinear relationship occurs when the coping capacity of the organism has been exceeded, following which no further benefits accrue from increasing the stimulus level.
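
To make the curvilinear account concrete, it can be written as a quadratic performance function of stimulus intensity – a minimal illustrative formalization, not a formula drawn from Gardner (1990):

$$P(s) = \beta_0 + \beta_1 s - \beta_2 s^2, \qquad \beta_1, \beta_2 > 0,$$

which peaks at $s^{*} = \beta_1 / (2\beta_2)$. On this reading, $s^{*}$ marks the stimulus level at which coping capacity is exceeded, and $\beta_2$ indexes how steeply performance falls beyond that point.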

The critical question in any curvilinear model is how an activating stimulus is experienced by the subject. Quantifying individual-differences factors that affect how people perceive and react to stressors (see Lazarus and Folkman 1984) allows researchers to measure how well a person can cope with a stimulus. To the extent that coping mechanisms have not been overwhelmed, an increase in stress should lead to an increase in performance; in turn, when coping mechanisms have been exceeded, any additional increase in stress should lead to a decrease in performance. People high in trait anxiety, for example, tend to perceive stressors as threatening and, as such, experience their psychological and physiological consequences to a greater degree, overwhelming their coping capacity and hampering their performance. In contrast, people low in trait anxiety tend to perceive stressors as challenges and feel motivated to overcome them, which can lead to improved performance. To model the stress-performance relationship accurately, therefore, one needs metrics for the levels of stress and performance as well as for the individual differences that influence vulnerability to stress.
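
As an illustrative sketch of how such metrics might be combined, the snippet below fits a quadratic stress-performance model with a trait-anxiety moderator to simulated data; the variable names, effect sizes, and data are hypothetical, chosen only to mirror the pattern described above.

```python
# Illustrative sketch (hypothetical data and names): estimating an inverted-U
# stress-performance curve with trait anxiety as a moderator.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 200
stress = rng.uniform(0, 10, n)   # manipulated stressor intensity
anxiety = rng.normal(0, 1, n)    # standardized trait-anxiety score
# Simulated ground truth: an inverted U whose downturn comes earlier
# for people higher in trait anxiety.
perf = 2.0 * stress - 0.2 * stress**2 - 0.5 * anxiety * stress + rng.normal(0, 1, n)

df = pd.DataFrame({"stress": stress, "anxiety": anxiety, "perf": perf})
# The quadratic term captures the inverted U; the interaction term captures
# individual differences in vulnerability to stress.
model = smf.ols("perf ~ stress + I(stress**2) + anxiety + stress:anxiety",
                data=df).fit()
print(model.params)
```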

The Scope of the Present Chapter

This chapter sheds light on the stress-performance relationship, drawing on research in the context of routine military operations such as training. Toward that end, the role of stress in training in three different contexts will be discussed: (1) captivity survival training, (2) insertion and extraction training involving high levels of physical and mental stress, and (3) stress in simulated training environments. These three training contexts were selected specifically because they represent different levels and types of stress, and as such can reveal the various forms the stress-performance and stress-learning relationships can take. Critically, an examination of this literature will reveal gaps in knowledge in this domain, and point to ways in which continued research into the effects of stressors on learning and performance is needed to advance the state of knowledge on the impact of training-related stress.

Captivity Survival Training: Stress Inoculation at Work

What is the optimal method for preparing military personnel for the exceptionally stressful conditions of combat operations? According to stress inoculation theory, resilience to stress can be enhanced through training: coping mechanisms that mitigate the effects of combat-related stress on performance can be taught using controlled exposure to realistic stressors (Stetz et al. 2007). Training must expose trainees to stress levels sufficient to activate their psychological and biological coping mechanisms without overwhelming their coping capacity. Consistent with activation theory (Gardner 1990), this approach involves locating the sweet spot in the curvilinear (i.e., inverted-U) relationship between stress and performance, such that individuals experience the maximal level of stress they can cope with. The assumption is that people who can perform well under simulated conditions in training have developed the resilience to perform on the battlefield.

This approach is fundamentally similar to the train-as-you-fight doctrine shared by many militaries internationally (e.g., the US Marine Corps). According to this doctrine, all training should be as realistic as possible. In practical terms this implies that “all training is to incorporate the highest degree of fidelity possible, and that no aspect of operations is to be ‘notionalized’ if a means to simulate it is available” (Canadian Forces 2005, p. 14). This doctrine has international resonance. For example, it guides training in the Royal Canadian Air Force, where high-fidelity simulation such as full-flight simulators is recommended for training personnel (Rawlings 2008), and it represents a core training principle for the U.S. Army (Department of the Army 2012) as well as the Australian military (Australian Defence Force 2008). Aside from the physical and mental aspects of training, there is also growing recognition of this doctrine’s relevance to broader aspects of warfare such as military ethics. For example, Messervey (2013) has noted that “For military personnel to be ethically prepared for combat and operations, it is essential that ethics training be consistent with the Directorate of Army Training’s ‘Train as we intend to fight’ approach. When soldiers are engaged in battle, they experience extreme levels of stress that make it difficult for them to think in a rational and effortful way” (p. 46). Thus, there is broad agreement among militaries worldwide that the ideal approach to training involves representing combat stress as realistically as possible, and preparing personnel for all aspects of such stress, ranging from the physical to the social-cultural ends of the spectrum.

Captivity survival training relies heavily on this approach. The intention is for trainees to acquire the tools necessary to survive capture by hostile forces. In the United States, the training is referred to as Survival, Evasion, Resistance, and Escape (SERE), typically a three-week course. Trainees are exposed to very high and uncontrollable levels of stress, including caloric and sleep deprivation, mock interrogations, and other physically and psychologically demanding experiences. Studies of SERE have shown that it causes significant perturbations in both psychological and physiological functions (Lieberman et al. 2005, 2015, 2016; Major et al. 2006; Morgan III et al. 2000a, 2000b, 2001a, 2001b; Taylor et al. 2007). SERE training has been shown to impair episodic and working memory and visuospatial and problem-solving abilities (Lieberman et al. 2005; Morgan III et al. 2000a, 2000b, 2001a, 2001b, 2006), and it was associated with significantly greater levels of self-reported dissociation, distortions in sensory perception, and an increase in endorsed posttraumatic stress disorder (PTSD)-like symptoms (Morgan III et al. 2001b), demonstrating that it is an effective stressor. The effects on cognition mirror those observed during U.S. Army Ranger training and the “Hell Week” of U.S. Navy SEAL training (Lieberman et al. 2005, 2009).

The stress of SERE training results in elevations in well-established biomarkers of stress, including cortisol, dehydroepiandrosterone (DHEA), neuropeptide Y (NPY), and epinephrine levels (Lieberman et al. 2016; Morgan III et al. 2000a, 2001a, 2000b). Because these biomarkers represent the activity of separate physiological systems (Morgan III et al. 2000a), stress-induced alterations in these systems are consistent with the conclusion that SERE represents a valid ethological model for the study of acute stress in human beings. Of note, SERE training affected the physiological biomarkers of stress to a degree equal to or greater than alterations measured in individuals undergoing major surgery or actual combat (Morgan III et al. 2000a, 2000b).

Interestingly, certain patterns of hormone secretion in response to stress have been associated with effective coping. An effective stress response may be characterized by rapid elevation of certain stress hormones, such as NPY, in response to acute stress, followed by a return to baseline when the acute stress ends (Morgan III et al. 2000a). For stress inoculation theory to work (i.e., for realistic exposure to stress to confer resilience against future stress), captivity survival training should induce a measurable increase in acute stress that returns to (near) pre-training levels shortly following the end of the course (Stetz et al. 2007).

The Canadian equivalent of SERE training is Conduct After Capture (CAC) training, which contains many of the core features of SERE training and has similar aims (i.e., stress inoculation through the experience of intense but transient uncontrollable stress). One major difference is that the Canadian course lasts only four days. A recent study examined whether the shorter training accomplishes the same goals as the longer training (Suurd Ralph et al. 2017). Data were collected on mood, fatigue, dissociation, PTSD symptoms, short-term and working memory, and salivary cortisol and DHEA at four time points: at baseline during the didactic phase of training, following a first interrogation, following a second more challenging interrogation, and at debriefing and recovery. As the researchers predicted, a curvilinear relationship between stress and time emerged across the four assessment points. Specifically, scores on all measures were degraded during training but recovered after completion of training, and almost all measures were most degraded following the second, more intense interrogation. The data involving both salivary cortisol and DHEA indicated that the training induced stress, and the corresponding impairments in mood, fatigue, dissociation, and PTSD symptomatology indicated impaired psychological functioning. After training, all stress biomarkers and measures of psychological functioning returned to baseline or near-baseline levels, indicating that the changes were transient. Interestingly, the researchers found that mood assessed prior to training predicted successful completion. This finding has important practical implications for increasing the success rate of training in similar environments because it suggests that positive mood might act as a buffer against the deleterious effects of stress, promoting better course performance and resilience. It also demonstrates that, by and large, trainees can recover fully from an intense bout of acute stress.

Captivity survival research has focused almost exclusively on trainees, but such training can also affect instructors. Performing mock interrogations under extreme conditions, for example, can be stressful, and whereas trainees typically undergo training once, instructors often lead multiple courses, sometimes in successive weeks. Vartanian et al. (2018) assessed stress levels among CAC instructors before and after the delivery of training in consecutive weeks, offering an ecologically valid opportunity to assess the carryover of stress from one week to the next. The researchers found that delivering CAC training was associated with impairments in mood, fatigue, and sleep, and a reduction in the testosterone-to-cortisol ratio, but that a three-day break between courses was sufficient to restore psychological and physiological function. This finding has important implications for trainer retention and recovery in contexts where limited access to qualified trainers necessitates that they administer multiple courses in succession.

Taken together, the results from the SERE and CAC studies show that captivity survival training causes significant increases in stress and transient perturbations in psychological and physiological function in both trainees and instructors, resulting in a training context that reflects what soldiers would experience in actual settings where the learning is to be applied. However, at present, additional research is required to examine whether such perturbations provide the skills and resilience needed to survive capture.

Arguably, one way to understand the impact of training-related stress on performance is to examine research findings from systems neuroscience – the discipline focused on understanding the functioning of large-scale networks in the brain. Understanding how the brain responds to stress can clarify the manner in which stress exerts downstream effects on behavior and performance (Hermans et al. 2014; Menon 2011). Large-scale networks in the brain are composed of groups of regions that exhibit correlated activation during extended periods of awake rest. Three networks appear to be particularly important in the brain’s response to stress. Immediately following the onset of acute stress, there is an increase in activity within the salience network – dedicated to orienting attention to environmental cues that are relevant for survival – as well as a strengthening of its connections with sensory cortices (e.g., vision). At the same time, there is a corresponding decrease in activity in the executive control network – dedicated to cognitive control in the service of higher-order cognition (e.g., planning, decision making). In turn, there is an increase in activity in the anterior default mode network – dedicated to emotion regulation and self-referential thinking – as well as a strengthening of its connection with the salience network in the service of memory consolidation. This state resembles a hypervigilant mode of thought during which the person is maximally tuned to environmental cues necessary for survival, and is regulating emotions and consolidating memories for subsequent recall/recognition. When in this state, one would not expect the person to learn content that requires higher-order cognition, but the person can nevertheless perform and encode into memory content that is relevant for survival. This dissociation can perhaps explain why extreme stress (e.g., captivity survival training) can hamper executive functions but nevertheless be useful for teaching trainees the basic skills that they need to ensure survival in theater in the future.

Extreme Stress in Military Training: Insertion and Extraction Training

While few military courses cause the extreme stress experienced in SERE or CAC training, military training often includes stressors associated with the battlefield. One course known for its arduous training is the Patrol Pathfinder (PPF) course, which trains specialist soldiers in land, air, and sea insertion and extraction. The course includes many challenging physical and mental stressors (e.g., physical strain, exertional fatigue and sleep deprivation, energy deficit, external load from equipment carried, and climatic conditions), and is designed to be especially arduous to reflect the environment in which PPF forces operate. Because of its content and structure, the PPF course offers an opportunity to study the relationship between stress and performance, including an examination of the mental and physical attributes related to course completion and the effects of stressors on the human body.

It is well known that the physiological and psychological demands of operational training can result in hormonal alterations. Indeed, blood cortisol levels in PPF candidates accurately profiled the stress levels of the course: levels were significantly higher during periods of physical and mental intensity, and returned to baseline when activities ceased (Boscarino et al. 2016). Testosterone, in turn, is known to decrease with intense stress, and blood levels of testosterone also reflected the intensity of the PPF course: testosterone levels decreased as the course intensity increased, a result consistent with the findings of Szivak et al. (2018) among sailors and Marines undergoing SERE training in the United States. Further, those who successfully completed the PPF course exhibited significantly higher blood lactate levels, red blood cell counts, and hematocrit levels, a profile consistent with the literature showing that regular, intense exercise increases red blood cell counts and hematocrit. Complementing this finding, blood cortisol levels also tracked self-reported psychological distress scores on the Kessler Psychological Distress Scale (K10) in candidates who failed to complete the course (Boscarino et al. 2016; Boscarino and Sullivan-Kwantes 2019). The strong correlation between distress scores and blood cortisol levels suggests that greater levels of experienced distress may contribute to training failure.

Recent evidence suggests a link between baseline stress hormone levels and performance in military training. Specifically, lower levels of blood cortisol were observed among PPF candidates who passed the course than among those who failed it (Shia et al. 2015). Similar findings have been reported in the United States in relation to captivity survival training (see Morgan III et al. 2009). Furthermore, candidates who passed the course also exhibited higher levels of DHEA-S at the start. DHEA-S is a hormone that has been shown to have neuroprotective properties in military men (Taylor 2013), properties that would appear especially valuable in high-stress environments where rapid information processing and decision making are essential for operational success. These results suggest that success in training can in part be predicted from a candidate’s initial vulnerability to stress and the ability to regulate it. Furthermore, the fact that similar patterns of results have emerged in Canada and the United States suggests that the findings are reliable and replicable despite variations in methodology and subjects.
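
As a sketch of how such prediction might be operationalized, the snippet below fits a logistic regression of course completion on baseline cortisol and DHEA-S using simulated data; all values, units, and effect sizes are hypothetical and serve only to illustrate the direction of the reported effects.

```python
# Minimal sketch (hypothetical data): predicting course completion from
# baseline biomarkers, mirroring the direction of the findings above.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 120
cortisol = rng.normal(12.0, 3.0, n)   # baseline blood cortisol (arbitrary units)
dhea_s = rng.normal(250.0, 60.0, n)   # baseline DHEA-S (arbitrary units)
# Simulated pattern: lower cortisol and higher DHEA-S raise the odds of passing.
log_odds = -0.4 * (cortisol - 12.0) + 0.01 * (dhea_s - 250.0)
passed = rng.random(n) < 1.0 / (1.0 + np.exp(-log_odds))

X = np.column_stack([cortisol, dhea_s])
clf = LogisticRegression().fit(X, passed)
print(clf.coef_)  # expected signs: negative for cortisol, positive for DHEA-S
```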

There also appears to be a correlation between DHEA-S levels and information-processing capacity, such as working memory. For example, in a group of active-duty Air Force members, Shia et al. (2015) found that higher DHEA-S levels and lower cortisol levels were related to faster working memory responses, suggesting that it might be possible to identify stress-resilient individuals. In the PPF course, working memory scores were significantly higher among those who passed the course than among those who failed it (Boscarino et al. 2016; see also Boscarino and Sullivan-Kwantes 2019). In sum, these findings suggest that, aside from one’s vulnerability to stress, cognitive capacity can also predict the successful completion of stressful training. Greater cognitive capacity has been shown to enable greater information-processing ability as well as greater motivation to engage in cognition – both of which could contribute to better performance (De Dreu et al. 2012).

Physical Stress and Performance

Training-related injuries due to physical stress are a common occurrence during military courses, accounting for more than 80% of soldiers’ noncombat-related injuries in the U.S. Army (Molloy et al. 2020). In the PPF course, training-related injuries accounted for over 50% of attrition. This is particularly important from a military readiness perspective because reductions in the number of trained personnel decrease a military’s operational capacity (Molloy et al. 2020). Further, research has shown that the best predictor of a future training-related injury is a previous one. Indeed, in the PPF course, pre-existing injuries were found to be a significant contributing factor to attrition.

Apart from physical stress, evidence suggests that one’s beliefs about one’s pre-existing injuries can also influence performance outcomes. For instance, 67% of soldiers who failed the PPF course were concerned that their pre-existing injuries, such as a knee or back injury, would negatively impact their ability to complete the course. In contrast, soldiers who passed the course did not feel that their pre-existing injury would interfere with their ability to complete the course. This finding is consistent with the larger literature according to which anxiety due to anticipatory stress can affect the functioning of the goal-directed attentional system (Eysenck et al. 2007), thereby impairing performance.

Physiological Response to Anticipatory Stress

A person’s inability to cope with stress can produce negative emotional states, such as anxiety. Negative emotions can in turn trigger a biological stress response, prompting the release of the hormone cortisol via activation of the sympathetic nervous system and the hypothalamic-pituitary-adrenal axis. One way to study how anticipatory stress can impact performance is to learn from athletes, who, like soldiers, must perform under stressful conditions. It has been shown that the anticipatory cortisol response, triggered by the anticipation of stress, can affect sports performance through its influence on cognitive processes (van Paridon et al. 2017), further highlighting the important interaction between emotion and cognition (Barrett and Armony 2006). When assessing the cortisol response, therefore, it is best to assess the magnitude of the response at the start of the competition, thereby differentiating it from exercise-induced changes in cortisol levels following competition (van Paridon et al. 2017).
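
The recommendation to sample cortisol at the start of competition can be illustrated with simple arithmetic: the anticipatory response is the pre-event rise over a resting baseline, kept separate from the additional exercise-induced rise measured afterward. The values below are hypothetical.

```python
# Simple arithmetic sketch (hypothetical values): separating the anticipatory
# cortisol response from exercise-induced change, per the sampling advice above.
baseline = 9.5      # resting sample on a non-competition day (nmol/L)
pre_event = 14.2    # sample taken at the start of competition (nmol/L)
post_event = 18.7   # sample taken after competition (nmol/L)

anticipatory_response = pre_event - baseline   # anticipation only
exercise_component = post_event - pre_event    # change attributable to exertion
print(f"anticipatory rise = {anticipatory_response:.1f} nmol/L; "
      f"exercise-related rise = {exercise_component:.1f} nmol/L")
```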

Some researchers have proposed an inverted-U relationship between cortisol and performance, where moderate levels positively influence performance and low and high levels negatively influence it (van Paridon et al. 2017). On Day 1 of the PPF course, candidates who passed the course already had blood cortisol levels above baseline, but lower than those of candidates who failed the course. In addition, those who passed the course had significantly higher working memory test scores at baseline and reported lower anxiety and distress scores on Day 1 and throughout the course. This pattern of findings suggests a close link between psychological and physiological measures of stress and course performance, and it points to stress regulation as a possible mechanism for buffering the deleterious effects of stress.

Research in this field also extends to the examination of personality, and has demonstrated that certain personality traits are associated with better or worse performance under stress. A high score on the factor openness to experience, for example, is correlated with high motivation to learn and to achieve a set goal (Major et al. 2006). Conversely, avoidance motivation (i.e., fear of failure leading to avoidance of goals, debilitating anxiety, withdrawing, and disliking school) was found to be associated positively with neuroticism and negatively with openness to experience (Komarraju and Karau 2005). When personality traits were assessed in PPF candidates at baseline using the Big Five model, those who passed the course scored higher on openness to experience and lower on neuroticism than those who failed the course. Further, soldiers who failed the PPF course reported significantly higher anxiety scores on Day 1 and throughout the course, further supporting the relationship between anxiety and cognition. With regard to performance, the same candidates consistently reported higher distress scores (K10) throughout the duration of the course (Boscarino and Sullivan-Kwantes 2019). These findings suggest that personality can moderate or mediate the relationship between stress and performance under demanding training conditions, and such studies further contribute to the understanding of the individual differences underlying the effects of stress on learning and performance.

Stress in Simulated Training Environments

The studies summarized above show that military training can impose significant stress even on experts, sometimes to the point of severely disrupting learning and performance. Opinions differ, however, as to how much stress different types of training can induce in trainees, and thus whether specific training methods induce enough stress to support specific training objectives. As discussed in Farrell and Jarmasz (2020), one common view holds that training using simulators (as opposed to field exercises or courses with scenario-based training using live actors) can provide trainees with basic skills but cannot adequately simulate operational stressors. The sections below shift the focus to how military training tasks performed in simulated environments can nevertheless result in significant stress in trainees, and discuss how the stress they experience can affect learning and performance.

Searching for Threats in a Simulated Visual Environment

Military personnel must contend with threats during operations. For example, during Canada’s combat mission in Afghanistan (2002–2010), Canadian Armed Forces personnel had to contend with the threat of improvised explosive devices (IEDs). This was not a problem unique to the Canadian Armed Forces: IEDs accounted for 45% of all American deaths in operational war zones in the period 2006–2020. Being constructed from locally available materials and concealed, IEDs are very difficult to detect, even with sophisticated sensor technologies (Jarmasz et al. 2010). As a result, soldiers deployed on operations where IEDs were frequently used had to be vigilant. Anecdotal reports suggest that soldiers with experience in such environments developed an intuitive awareness of IED threats that they could not easily explain, often accompanied by physiological sensations and informally known as a “spidey sense” (Zotov et al. 2011).

Researchers have also investigated this threat awareness to determine whether it is a trainable skill or a spurious phenomenon. For example, interviews with Canadian Armed Forces personnel who had deployed to Afghanistan suggested that IED detection depended on a body of knowledge combining an understanding of IED construction with the opportunities for carrying out IED attacks afforded by different types of terrain (see Jarmasz et al. 2010). Investigating actual IED threat detection behavior by personnel in theater would have been highly impractical and dangerous for obvious reasons; thus, researchers investigated the IED spidey sense and its physiological correlates by simulating IED search tasks using videos recorded in Afghanistan (Keillor et al. 2007; Zotov et al. 2011). The videos were obtained during convoy operations in Afghanistan, and while none depicted actual IED incidents, they were vetted by subject matter experts (SMEs) with operational experience and deemed to depict scenarios where IEDs could have been deployed against the convoy. Such study designs are useful in assessing the effects of simulated training-related stress on performance.

Keillor et al. (2007) investigated IED threat awareness by showing these videos to personnel with and without operational experience in Afghanistan. Participants were provided the same information on IED indicators that personnel were given in pre-deployment training, and instructed to press a button when they thought the video depicted a location where an IED could be concealed. Participants’ eye movement parameters (i.e., fixations and saccades) were recorded. Notably, the eye movements of participants without deployment experience seemed to follow patterns found in studies of eye movements during driving (i.e., fixations and saccades mainly along the horizon, looking ahead for road conditions), whereas those of experienced participants did not follow the horizon and instead seemed more consistent with the close inspection of specific objects in the scene. The experienced participants were debriefed after viewing the videos to explain their search behavior. Many experienced personnel reported feeling “as though they were there,” suggesting that the videos simulated the visual experience of searching for IED threats in an operational environment. These findings suggest that operational experience has the potential to alter one’s visual search behavior in relation to a simulated threat, such as IEDs.

Focusing on the physiological correlates of IED search behavior in a simulated environment, Zotov et al. (2011) compared eye movement parameters between participants with and without operational experience in Afghanistan. In addition to instructing participants to identify scenes depicting possible IEDs, the investigators also engaged the participants in visual search tasks deemed to be affectively neutral (i.e., identification of landmarks in video from urban environments), and measured heart rate variability (HRV) while participants performed both tasks. The HRV analysis revealed that the experienced participants had HRV parameters associated with higher stress when viewing the IED threat videos than did participants without deployment experience, and that they exhibited higher stress-related parameters when actively identifying IED threats in a scene. In contrast, the experienced participants’ HRV parameters were consistent with lower stress when watching the affectively neutral videos than the IED threat videos, suggesting that the elevated stress was context-specific and experience-driven.
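
For readers unfamiliar with HRV analysis, the sketch below computes two common time-domain indices from a series of inter-beat (RR) intervals; the interval values are hypothetical, and in studies of this kind lower variability is typically interpreted as higher physiological stress.

```python
# Illustrative sketch: two common time-domain HRV indices computed from
# hypothetical inter-beat (RR) intervals.
import numpy as np

rr_ms = np.array([812, 795, 780, 805, 790, 770, 760, 785, 800, 815], dtype=float)

sdnn = np.std(rr_ms, ddof=1)                   # overall variability (ms)
rmssd = np.sqrt(np.mean(np.diff(rr_ms) ** 2))  # beat-to-beat variability (ms)
print(f"SDNN = {sdnn:.1f} ms, RMSSD = {rmssd:.1f} ms")
```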

Analysis of the eye movement data from the IED threat videos showed a pattern consistent with findings by Keillor et al. (2007), namely, closer and more frequent inspection (i.e., more fixations, shorter saccades) of potential IED threats by participants with deployment experience. This pattern was not observed with the affectively neutral videos. Together, these findings suggest that experienced participants appraised the IED threat videos as depicting more stressful content than did participants without deployment experience. This is consistent with the notion that deployment experience can alter attention and inspection patterns, but that the effect is present only for contextually relevant and semantically salient content.
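
The eye movement pattern described above – more fixations and shorter saccades – can be summarized with simple metrics such as fixation count and mean saccade amplitude. The sketch below computes both from a hypothetical list of fixation centroids in screen coordinates.

```python
# Sketch (hypothetical gaze log): summarizing fixation count and mean
# saccade amplitude from fixation centroids in screen coordinates (pixels).
import numpy as np

fixations = np.array([[512, 400], [530, 410], [620, 380],
                      [615, 390], [700, 420]], dtype=float)

# Saccade amplitude: Euclidean distance between successive fixations.
saccade_amplitudes = np.linalg.norm(np.diff(fixations, axis=0), axis=1)
print(f"{len(fixations)} fixations, "
      f"mean saccade amplitude = {saccade_amplitudes.mean():.1f} px")
```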

As discussed above, it is not possible to assess these findings against objective norms of IED search performance. Nevertheless, the more systematic search patterns of experienced participants, combined with the knowledge elicitation from SMEs discussed above, suggest that these patterns reflect a body of knowledge that helps deployed personnel discriminate among various types of operational threats. These findings also show that the behavioral patterns exhibited by experienced personnel during IED search tasks in a simulated visual environment can be accompanied by a stress response, thereby providing a biological signal that can be used to orient attention to potentially threatening stimuli in the environment.

Importantly, insights into physiological correlates of threat can also lead to important applications and improvements for training. For example, based on the findings described above, Zotov et al. (2014) developed an instructional intervention to augment IED awareness skills during pre-deployment training for servicemembers deployed to Afghanistan. This training tool used videos of Afghanistan to illustrate patterns of IED indicators in a variety of scenarios, and showed that personnel with no prior deployment experience could nevertheless learn to assess IED attack indicators in a manner similar to SMEs assessing the same scenarios. Such studies show that knowledge gained about the physiology of simulated-threat detection involving experts can be used to adjust training approaches for non-experts, thereby contributing significantly to pre-deployment training.

Stress in Marksmanship Training and Performance

The effects of psychological stressors – such as competition, social pressure, time limits, high workload, and hostile fire – have been extensively investigated in recent years (e.g., Torre et al. 1991 for competition, frustration, and social pressure; Nieuwenhuys and Oudejans 2011 for time limits; Scribner et al. 2007 for workload). However, clear evidence that these stressors affect the performance of motor-cognitive skills, such as marksmanship, has been difficult to obtain in operational settings. Obviously, measuring the effects of stress on marksmanship during actual combat operations is not practical; nevertheless, indirect evidence from laboratory studies suggests a connection between psychological stressors and marksmanship performance. For example, Torre et al. (1991) showed that competition impairs shooting accuracy when participants are under time limits. Kerick and Allender (2004) manipulated target exposure time and found that the stress induced by high cognitive workload affected shooting performance when the target exposure was short. The type of target also mattered: hard-to-hit targets (e.g., distant targets, short exposure times, moving targets) induced stress and affected marksmanship performance (Torre et al. 1991).

The studies above suggest that the marksmanship task itself is inherently stressful, apart from the stressors particular to combat. Given that military personnel must maintain minimum marksmanship proficiency standards through regular training and testing, it is important to prepare soldiers for the stress of marksmanship during training. A number of stress management techniques have been adopted to cope with combat stress. Tactical breathing, which is grounded in the stress inoculation approach mentioned above, is one of the simplest and most effective techniques for reducing acute stress. Another method of stress management is biofeedback, which involves self-regulation of physiological states. Bouchard et al. (2012) tested the effectiveness of the Immersion and Practice of Arousal Control Training (ImPACT) technique – an approach that included biofeedback and tactical breathing – in reducing symptoms of stress. The study used virtual combat settings based on a video game scripted from the Afghanistan context. A graphical interface linked the physiological stress indicators (i.e., heart rate and skin conductance) with visual and audio feedback in the video game. In addition to heart rate and skin conductance, salivary cortisol readings were also taken to measure stress levels. The study showed the effectiveness of the ImPACT technique in reducing stress when tested against a no-stress control condition.
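
Conceptually, biofeedback systems of this kind close a loop between physiological sensing and on-screen feedback. The sketch below illustrates that loop with a simulated heart rate sensor; the target threshold, sensor, and display logic are hypothetical stand-ins rather than the ImPACT implementation.

```python
# Conceptual sketch of a biofeedback loop: a (simulated) heart rate reading
# drives simple display feedback once per second.
import random
import time

TARGET_HR = 80.0  # hypothetical calm-state target (beats per minute)

def simulated_heart_rate(start: float = 110.0):
    """Stand-in for a real sensor: readings drift downward as breathing slows."""
    hr = start
    while True:
        hr = max(70.0, hr - random.uniform(0.0, 2.0))
        yield hr

def update_display(hr: float) -> None:
    # A real system would tint the game scene or adjust an on-screen gauge.
    print(f"HR {hr:5.1f} bpm -> {'calm' if hr <= TARGET_HR else 'elevated'}")

sensor = simulated_heart_rate()
for _ in range(20):  # one feedback cycle per second
    update_display(next(sensor))
    time.sleep(1.0)
```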

A pilot study by Adams (2013) investigated the effect of tactical breathing and biofeedback on marksmanship performance under stress-inducing conditions in the Small Arms Trainer – a simulated range that replicates a typical live range. The stressors used were competition, weapon malfunction, and time pressure. Physiological monitoring equipment measured participants’ heart and respiration rates and skin conductance, and salivary cortisol samples were also collected. The State-Trait Anxiety Inventory (STAI; Spielberger et al. 1970) was administered prior to the trial and during the debriefing session. In post-experimental questionnaires, participants reported that shooting under severe time pressure and shooting in front of others were the two stressors that affected them most. The stress measurement methods used in the trials can be applied to future studies investigating different stressors in controlled experimental settings with military personnel.

A similar stress inoculation approach combining biofeedback and tactical breathing in marksmanship training was tested by Lotfabadi et al. (2020). The study involved novice civilian shooters in a controlled indoor shooting range using real weapons and ammunition (as opposed to the simulated shooting task in Adams 2013). Stress levels and shooting accuracy were compared between stress inoculation and control groups, with HRV analysis used to measure stress. All shooters experienced measurable elevations of stress during the task. While shooting performance did not differ between the two groups, the stress inoculation group maintained a low stress level throughout the task. Together with the studies summarized above, these findings suggest that stress inoculation-based approaches can be effective in managing stress in a range of marksmanship contexts (live-fire and simulated ranges, military and civilian shooters).

The studies described above are consistent with the findings reported in the broader marksmanship literature: Marksmanship is an inherently stressful task, even though simulated marksmanship tasks (for training or research purposes) do not induce as much stress as combat. Stressors in marksmanship are related to the parameters of the task itself (e.g., target size or distance) or to the training scenario in which it is embedded (e.g., time pressure, simulated combat threats). These stressors can serve as the basis for implementing stress inoculation-based interventions in marksmanship training to overcome the effects of stress and to better prepare personnel for combat situations.

Tactical Combat Casualty Care Training

“Combat casualty care” and “pre-hospital/combat casualty care” are generic terms for the skill set taught to combat medics (formally known as medical technicians) to provide pre-hospital trauma care on the battlefield (NATO Task Group HFM-242 2020). In the United States military, the guidelines for combat casualty care are governed by Tactical Combat Casualty Care (TCCC or TC3) and routinely updated by the Committee on Tactical Combat Casualty Care (CoTCCC). Initially, TCCC was taught to physicians attached to Navy SEAL units, but it has since been adopted by the other military branches and taught to combat first responders. Indeed, TCCC is now a mandatory course for deploying United States medical personnel (Bradley et al. 2017).

In the Canadian context, in turn, the Tactical Combat Casualty Care (TCCC) course is taught as part of each brigade’s pre-deployment training cycle. It has long been acknowledged that simulated stress plays an important role in casualty care training. For example, in TCCC training, trainees must perform complex clinical procedures, such as inserting nasopharyngeal airways and performing needle decompressions, because they must be able to recognize how patients’ bodies respond to these procedures while coping with the stress of being responsible for the life of another person in battlefield conditions (Kim et al. 2017). These concerns have motivated different choices around training modes for casualty care training, in particular the use of non-human live tissue training (LTT) models, which are considered to provide more physiological realism and to be more effective at simulating the stress of caring for a living patient than medical training mannequins (Savage et al. 2015). Nevertheless, practitioners in this training community see the importance of using medical training mannequins – hereafter referred to as high-fidelity patient simulators (HFPS) – for a number of reasons, including better repeatability and control of the training experience and animal welfare considerations.

While both LTT and HFPS are used in casualty care training, some have questioned whether HFPSs provide the psychological realism (including the stress aspects) needed to train combat medics (Kim et al. 2017; Savage et al. 2015). In response, research was designed to compare learning outcomes, stress measures, and self-reported confidence ratings for combat medics undergoing casualty care training using either LTT or HFPS (Peng et al. 2018; Savage et al. 2015; Vartanian et al. 2017). The study examined skill acquisition in an instructional operating room and skill proficiency in a field assessment (a simulated battlefield condition), using both LTT and HFPS training. Self-reported stress (using the STAI) and physiological measures of stress (salivary cortisol and blood catecholamines) were collected throughout the training and assessment phases of the study. In addition, cognitive performance was measured using working memory and short-term memory measures. Half the participants underwent field assessments using the same training mode they had used in the operating room, whereas the other half switched modes before undergoing field assessment (i.e., LTT in the operating room and HFPS in field assessment, or vice versa).

Results showed that participants acquired TCCC skills and performed them equally proficiently under field assessments with both training modes. However, participants who switched modes between training in the operating room and the field assessment performed worse on assessments than those who did not switch, indicating that neither mode provided an advantage in developing proficiency with the other. Importantly, participants in both training modes exhibited comparable stress levels on the self-report and physiological measures, and stress levels increased between the training and assessment phases of the study. No differences in stress between training modes were detected, except for increased blood catecholamines in participants who switched modes between training and assessment. Not only did many participants perform casualty care procedures well during the field assessments (as assessed by experienced instructors), but many also showed improved working and short-term memory performance during the assessments. These findings indicate that casualty care training can induce significant stress in trainees, even when using an artificial patient (HFPS). Further, the findings suggest that stress experienced during casualty care training is associated with improved rather than reduced cognitive performance – a theme that has emerged elsewhere in this chapter.

Summary

This chapter examined the impact of training-related stress on performance in many military training contexts, primarily in the United States and Canada (Table 1). As can be seen, in every case personnel were exposed to a variety of psychological and physiological stressors. The review of this diverse body of work demonstrated that stress interfered with good performance in some cases but improved it in others. Indeed, this research supports two competing views of stress in training. According to one view, training is an opportunity to improve self-regulation skills and develop resilience against the stressors of combat. According to another view, the stress of training supports state-dependent learning because it helps encode new knowledge in a way that is cued and retrieved by stress; on this view, the purpose of training is to simulate the stress of combat as realistically as possible, regardless of self-regulation capacities or the meaning of stressors for learners. These two views entail two different training strategies, both of which need further empirical investigation. However, the best way to account for the available evidence likely involves an integration of the two viewpoints. Specifically, a level of stress that mimics key aspects of the operational environment without overwhelming the coping capacities of personnel would appear ideally suited to promoting training-related learning. Needless to say, because of individual differences in coping mechanisms, this optimal level will vary across individuals – a key factor that needs to be taken into consideration when designing training for military personnel. In addition, the understanding of the effects of training-related stress on performance in the military can be improved by examining how the brain responds to stress, which can in turn point to specific types of content that might be more encodable than others when levels of stress are high.

Table 1 Varieties of training contexts and associated types of stress

Furthermore, the work in this area points to the moderating and mediating effects that social, contextual, and individual-differences factors exert on the relationship between stress and performance. Individual differences in one’s ability to regulate the impact of stress on performance, for example, need greater recognition in the military. In terms of simulating stressful conditions, this research shows how experience affects learning and suggests ways to optimize it. In this sense, the field needs a more refined understanding of what constitutes expertise in both domain-general and domain-specific terms, and of how to distill and teach its essential features during training. It is expected that the work reviewed here will serve to generate hypotheses that motivate further research on this important topic in military performance.