Introduction

Military conflicts have changed dramatically in recent decades, with asymmetric conflicts increasingly arising in urban areas (Morag 2023). Current examples include Donetsk, Luhansk (Hook and Marcantonio 2023), and Gaza (Levene 2024), where dense urban infrastructure in all dimensions (subsurface, surface, and supersurface; Hofer 2022) complicates the use of heavy equipment, aerial reconnaissance, and surveillance. Additionally, humanitarian law requires distinguishing between combatants and civilians (Durhin 2016), necessitating that soldiers enter buildings and infrastructure. Urbanization further increases the relevance of urban warfare, so that not only special forces but also non-specialized soldiers engage in so-called Close Quarters Battle (CQB), a development described as "special forcification" (Ben-Ari et al. 2010).

CQB describes tactical principles for fighting in confined spaces, in which room clearing, maximal use of existing cover, high speed, and surprise are employed to overwhelm perpetrators and secure infrastructure (Hyderkhan 2020). CQB demands an intuitive, automated understanding of spatial geometry, optimizing angles and cover while minimizing potential hazards. Because of the close proximity, high speed, and danger involved, CQB is psycho-physiologically demanding (Hyderkhan 2020).

Consequently, CQB training is highly time- and cost-intensive (King 2016). Nevertheless, acute operational situations may necessitate short-term CQB training, as seen in the Gaza operation, where even reservists had to fulfill CQB missions at short notice (Dress 2023). Despite the increasing relevance of CQB, there is currently limited research on training effects with high ecological validity in real-world environments. Moreover, it is essential to develop a CQB taxonomy to enhance assessment and training. Furthermore, investigating adaptations of the stress response and the stress-performance relation is imperative for improving training conditions.

Physiological Stress Response

Maintaining homeostasis amid stressors necessitates physiological and behavioral adaptations (Chrousos 1998). Stress, as defined by Selye (1956), denotes a non-specific physiological reaction to stressors. Particularly within high-risk occupations such as the military or law enforcement, performance under stress is a central research focus (Nindl et al. 2018; Beckner et al. 2022). The activation of the two physiological stress systems, namely the hypothalamic–pituitary–adrenal (HPA) axis and the sympathoadrenal system (Sved et al. 2002), can be determined through the salivary concentrations of cortisol (sCort) and alpha-amylase (sAA; Ali and Pruessner 2012), respectively.

Activation of the HPA axis (Kirschbaum and Hellhammer 2008) leads to the secretion of cortisol into the bloodstream (Katsu and Baker 2021), with the salivary concentration peaking approximately 22 min after the stressor (Engert et al. 2011). Cortisol elicits an ergotropic response, mobilizing energy to establish the physiological prerequisites for coping with the stressor (Munck et al. 1984). The catastrophe model of anxiety and performance (Hardy and Parfitt 1991) provides a theoretical framework for understanding the impact of stress on CQB performance within a sports psychology context. The model predicts optimal performance at moderate physiological arousal and high anxiety levels, and it has found support in various sports, including basketball (Hardy and Parfitt 1991), bowling (Hardy et al. 1994), climbing (Hardy and Hutchinson 2007), and e-sports (Schmidt et al. 2020). Schmidt et al. (2020) summarize: "Physiologically aroused and anxious subjects performed much better compared to subjects who were not physiologically aroused and not anxious independent of their respective flow levels" (p. 5). Strahler and Ziegert (2015) measured sCort and sAA during a simulated school shooting scenario involving police officers. They identified the highest cortisol concentration at the beginning of the scenario and the highest sAA concentration shortly after the scenario's conclusion, suggesting that the two stress systems are distinct and potentially compensatory.
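For orientation, the cusp formalism underlying the catastrophe model can be summarized in a minimal sketch; the factor labels follow the usual presentation of Hardy and Parfitt's (1991) application and are given here only for illustration, not as a reproduction of their exact parameterization:

$$V(x) = \tfrac{1}{4}x^{4} - \tfrac{1}{2}\,b\,x^{2} - a\,x, \qquad \frac{\partial V}{\partial x} = x^{3} - b\,x - a = 0,$$

where $x$ denotes performance, the normal factor $a$ corresponds to physiological arousal, and the splitting factor $b$ corresponds to cognitive anxiety. For low anxiety (small $b$), equilibrium performance changes smoothly with arousal; for high anxiety (large $b$), the equilibrium surface folds, so performance first increases with arousal and then drops abruptly once the fold point is passed, which is the "catastrophic" decline the model predicts.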

While cortisol serves as an indicator of the HPA axis, sAA is considered an indicator of the sympathoadrenal system, reflecting the noradrenergic reaction (Nater and Rohleder 2009). Compared with cortisol, sAA exhibits higher stress sensitivity (Gordis et al. 2006) and a faster response profile (Chrousos and Gold 1992), reaching its salivary concentration peak approximately 7 min after the stressor (Engert et al. 2011). As an indicator of psychosocial stress, sAA does not correlate strongly with cortisol (Ehlert et al. 2006; Nater et al. 2006). In the military context, sAA has been identified as a valid stress indicator during active shooter training (McAllister et al. 2020) and has been found to be elevated during force-on-force (FoF) scenarios compared to cardboard-target conditions (Taverniers and De Boeck 2014).

Another physiological parameter directly associated with the noradrenergic system and the stress response is heart rate (HR). The autonomic nervous system regulates HR through its sympathetic (increasing) and parasympathetic (decreasing) branches (Taelman et al. 2009). Studies demonstrate that various stressors, such as interrogations (Lieberman et al. 2005) and social stress (Kirschbaum et al. 2008), affect HR. In a virtual reality spaceflight emergency training, stress levels were measured and adaptively increased based on the participants' current load. A pre-post comparison showed that stress decreased in all groups (evidenced by decreased HR and increased heart rate variability), while task performance improved. These findings support the hypothesis that self-regulation abilities can be enhanced through stress inoculation training (Finseth et al. 2021). Because both sAA and HR are under noradrenergic influence, the two indicators are highly correlated (McGraw et al. 2013). In the military context, HR has proven to be a valid indicator of stress.

Research Objectives

As King (2022) outlined, military operations are increasingly urban-centric, with CQB emerging as a highly demanding skill. Continuous optimization of assessment and training is imperative to enhance operational readiness. Therefore, this study aims to construct a CQB taxonomy with assessable subskills to determine differentiated training effects and to examine expertise at the facet level. Further, this investigation examines the effect of a compact CQB training program on the performance of non-specialized and special forces. Additionally, we examine the general CQB-specific stress-performance relation and the training effect on the anticipatory and co-occurring stress response.

Therefore, it is hypothesized that CQB training enhances CQB performance (H1), with the training effect being stronger among non-specialized soldiers (H2). Furthermore, a reduction in the anticipatory (H3) and co-occurring stress response (H4; H5) is expected after training. Lastly, we expected a positive effect of the anticipatory stress response (sCort) on performance (H6a) and a negative effect of the co-occurring stress response (sAA, HR) on performance (H6b).

Method

This experimental investigation of CQB performance was conducted in two phases: first, from October 5th to October 13th, 2023, involving police special forces, and second, from November 3rd to November 5th, 2023, with non-specialized soldiers. Participation in the study was voluntary, and informed consent was obtained. Prior to commencement, ethical approval was obtained from the ethics committee in psychology at the Faculty of Humanities and Social Sciences at the Helmut-Schmidt-University/University of the German Armed Forces (approval number 2023_010). The study design was two-factorial, with the between-subjects factor "group" (non-specialized vs special forces) and a repeated-measures factor (pre- vs post-training).

Participants were also instructed to refrain from alcohol consumption and intense physical activity for 24 h before the study, in line with Strahler et al.'s (2017) recommendations for sAA and sCort assessments. Additionally, participants were instructed not to smoke and to avoid high-calorie food the day before the investigation. Cigarette consumption and sleep quality were assessed through questionnaires during pre- and post-assessment. Data collection involved an online survey comprising demographic and personality instruments. Because this article focuses on training effects, personality influences on CQB performance will be reported elsewhere.

The pre-test phase involved participants completing a pre-defined sequence of ten stations, illustrated in the supplements (Fig. 1A). After the psychometric assessment, there was a 30-min relaxation period (using sleep masks and ear protection). Following this, sCort and sAA samples were collected (Sarstedt Salivette, Nuembrecht, Germany). Subsequently, participants put on the heart rate chest strap (Polar H10), tactical gear (protective vest, helmet), helmet camera (GoPro Hero 10), and mobile eye-tracking system (Viewpoint System 19) and then received the task instruction. The task involved tactically clearing three rooms. A red light (LED light puck) indicated direct enemy contact and required shooting, whereas participants had to react to real persons at their own discretion. Participants could use their preferred weapon, with police officers using the Walther P99 FX (FX for color cartridges) and soldiers using the Glock 17 airsoft (Glock GmbH).

Following the CQB scenario, another sAA sample was taken (1 min after scenario), and participants entered sensory deprivation for 22 min. The final step involved collecting the second cortisol saliva sample.

The training phase for all participants consisted of 12 h of CQB training (two training days) conducted by the CQB training company, Project Gecko. The training included 4 h of theoretical content (room geometry, movement, gaze behavior, risk potentials, weapon handling, and prioritization) and 8 h of practical training (movement automation, spatial analysis, decision-making, cutting techniques, use of angles and corners). After completing the training, the CQB post-test took place the next day. The post-test mirrored the pre-test, with changes in spatial arrangement to avoid memory effects.

Sample

The sample comprised N = 34 participants (n = 18 special operations forces, SOF). The within-subject CQB pre- and post-test was completed by n = 23 individuals. The sample was recruited by soliciting participation from multiple police departments as well as from the Helmut-Schmidt-University/University of the Federal Armed Forces. The overall sample included n = 11 members of the Special Task Force (SEK) and n = 7 members of the Mobile Task Force (MEK) from various federal states, as well as n = 16 non-specialized soldiers. Most participants (n = 16) held a bachelor's degree as their highest level of education. The average CQB experience was M = 87 h (SD = 118; range = 500), and n = 4 participants were smokers (M = 7.25; range = 1–25 cigarettes per day). The minimum sample size to detect a strong training effect (d = 0.8) with a paired sample t-test was n = 15, given an alpha level of 0.05 and a power of 0.80. Demographic details are summarized in the supplements (Table 1). The data are accessible online at https://osf.io/n4ugp/?view_only=21b8773f058a42039ec0a0fd9cef203e.
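The stated minimum sample size can be reproduced with a standard a priori power calculation; the following is a minimal sketch in R using the base function power.t.test, assuming a two-sided test (with sd = 1, delta corresponds to Cohen's d):

```r
# A priori power analysis for a paired t-test (two-sided),
# large effect d = 0.8, alpha = .05, power = .80.
power.t.test(delta = 0.8, sd = 1, sig.level = 0.05,
             power = 0.80, type = "paired",
             alternative = "two.sided")
# Returns a required n just above 14 pairs, i.e., n = 15 after rounding up.
```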

Table 1 Sample details

Measures

Close Quarters Battle Test

The CQB pre- and post-test each comprised three CQB scenarios, with different spatial conditions between the tests to avoid memory effects. In the first scenario, participants had to clear a complex with numerous cabins (showers or toilets) without enemy encounters. In the second scenario, participants initially had to shoot at a red-marked target and were then surprised by a civilian carrying a weapon-like object, whom they had to subdue to the ground. The third scenario involved an armed individual (handgun) in an adjacent room who also needed to be subdued to the ground.

Standardized CQB Performance

CQB performance was assessed using a standardized observation instrument (Fig. 2A), which comprised nine items forming five scales. The items were rated on a scale from 1 (does not apply at all) to 7 (applies completely). The tactical behavior scale (three items) encompassed the efficiency of perceiving danger points, slicing behavior, distance management, and cover behavior. The weapon handling scale (one item) measured trigger behavior, safety, and congruence of sight and weapon direction. The gaze behavior scale (three items) included target perception speed, danger focus, and gaze movement economy. The response time scale (one item) measured the time between target perception and reaction (e.g., firing or a verbal reaction). The mistakes scale (one item) counted critical errors, such as engaging civilians. The CQB test performances were assessed by two independent experts using the standardized instrument. The inter-rater reliability (ICC; Gisev et al. 2013) was sufficient (ICC = 0.733–0.834, p < 0.001).
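As an illustration of how such agreement coefficients can be obtained, the following is a minimal sketch in R using the psych package; the data layout and values are hypothetical placeholders, not the original scoring sheets:

```r
library(psych)  # provides ICC()

# Hypothetical layout: one row per rated scenario, one column per rater;
# values are the 1-7 scores given by the two independent experts.
ratings <- data.frame(
  rater_1 = c(5, 6, 4, 7, 3, 5, 6, 2),
  rater_2 = c(5, 7, 4, 6, 3, 4, 6, 3)
)

# ICC() reports the common ICC variants (single and average measures,
# one- and two-way models) with confidence intervals and significance tests.
ICC(ratings)
```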

Physiological Stress Response—Cortisol, Alpha-amylase, and Heart Rate

To assess the anticipatory stress response, sCort and sAA samples were obtained just before each CQB test. Additionally, a sAA sample was collected 1 min after CQB, and a cortisol sample was collected 22 min after CQB to measure possible stress effects. A total of eight salivettes were collected per participant. The timing of sample collection adhered to the guidelines recommended by Engert et al. (2011). Following collection, the samples were promptly stored in a cooling compartment (−15 °C) until assayed. The subsequent biochemical laboratory analysis involved centrifugation of the thawed samples. Analyses were performed fully automatically (BEP 2000, Siemens, Germany) with commercial enzyme immunoassays (IBL; Tecan, Hamburg, Germany) to determine cortisol and alpha-amylase concentrations. The intra-assay variation (coefficient of variation) was below 5% across all samples. Heart rate was monitored using the Polar H10 chest strap.

Statistical Analyses

We used R (R Core Team 2022) for the statistical analyses. To assess whether the training positively impacted overall CQB performance and its facets (H1a–e), we conducted a repeated measures multivariate analysis of variance (rMANOVA) comparing CQB performance between groups and across time points (pre- and post-training). Additionally, we examined the interaction effect of time and group, assuming a stronger training effect on the CQB ability of non-specialized forces (H2).
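For illustration, a minimal sketch of the mixed pre-post design for the total CQB score using base R's aov(); variable names and data are hypothetical placeholders, and the multivariate step across the five facets is omitted for brevity:

```r
# Hypothetical long-format data: one row per participant and time point.
d <- data.frame(
  id    = factor(rep(1:6, each = 2)),
  group = factor(rep(c("special", "non_specialized"), each = 6)),
  time  = factor(rep(c("pre", "post"), times = 6), levels = c("pre", "post")),
  cqb_total = c(5.1, 6.2, 4.8, 6.0, 5.5, 6.4, 3.2, 4.9, 3.6, 5.1, 3.0, 4.7)
)

# Mixed ANOVA: between-subjects factor "group", within-subjects factor "time",
# including the group x time interaction relevant to H2.
fit <- aov(cqb_total ~ group * time + Error(id / time), data = d)
summary(fit)
```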

We investigated the training effect on sCort and sAA concentration using a two-way repeated measures analysis of variance, with the factors being CQB execution (pre-CQB and post-CQB salivettes) and the measurement time points (pre-training and post-training). We anticipated a higher cortisol concentration prior to CQB scenarios due to anticipatory stress reactions (H3a) and a reduced salivary cortisol concentration both before and after CQB scenarios as a result of training (H3b). Furthermore, we examined the interaction between time (pre- and post-CQB) and training (pre- and post-training) and expected a decrease in anticipatory stress reactions post-training (H3c). Additionally, we predicted a higher sAA concentration after CQB scenarios (H4a) and a reduced sAA secretion due to training (H4b). We also expected an interaction effect, resulting in a reduced rise of sAA levels due to CQB training (H4c). In addition, we used a paired sample t-test to examine a reduction in median heart rate within CQB execution after the training (H5).
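A minimal sketch of this fully within-subjects analysis for salivary cortisol (the sAA analysis is analogous) and of the heart-rate comparison is shown below; all variable names and values are hypothetical placeholders:

```r
# Hypothetical wide-format data: one row per participant, four cortisol
# salivettes (pre/post CQB x pre/post training), plus median HR per test day.
d <- data.frame(
  id = factor(1:5),
  cort_preCQB_preTrain   = c(8.2, 6.5, 9.1, 7.4, 8.8),
  cort_postCQB_preTrain  = c(7.9, 6.8, 8.6, 7.1, 9.0),
  cort_preCQB_postTrain  = c(6.4, 5.9, 7.2, 6.1, 7.0),
  cort_postCQB_postTrain = c(6.6, 6.1, 7.0, 6.3, 7.2),
  hr_preTrain  = c(128, 135, 122, 140, 131),
  hr_postTrain = c(119, 127, 118, 133, 125)
)

# Long format with the two within-subject factors:
# cqb (pre/post CQB salivette) and training (pre/post training).
long <- data.frame(
  id       = rep(d$id, times = 4),
  cqb      = factor(rep(c("pre", "post", "pre", "post"), each = 5),
                    levels = c("pre", "post")),
  training = factor(rep(c("pre", "pre", "post", "post"), each = 5),
                    levels = c("pre", "post")),
  cortisol = c(d$cort_preCQB_preTrain,  d$cort_postCQB_preTrain,
               d$cort_preCQB_postTrain, d$cort_postCQB_postTrain)
)

# Two-way repeated measures ANOVA (main effects and interaction; H3a-c, H4a-c).
fit <- aov(cortisol ~ cqb * training + Error(id / (cqb * training)), data = long)
summary(fit)

# Paired t-test on the median heart rate within CQB, pre vs post training (H5).
t.test(d$hr_preTrain, d$hr_postTrain, paired = TRUE)
```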

Finally, we examined the positive influence of the anticipatory stress response on performance (H6a) and the negative effect of the co-occurring stress response (sAA post-CQB, HR) on performance (H6b) in the pre- and post-training CQB tests through correlations. Non-parametric tests (Spearman's rho, Wilcoxon signed-rank test) were used when the assumptions for parametric testing were not met. We followed the effect size guidelines of Gignac and Szodorai (2016; r = 0.10/0.20/0.30 for small/moderate/large). The Mahalanobis distance was used for outlier identification (squared distance flagged at p < 0.001).
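A minimal sketch of these correlation, non-parametric, and outlier checks in base R follows; vector names and values are hypothetical placeholders:

```r
# Hypothetical per-participant values for the pre-test: anticipatory cortisol
# (pre-CQB), post-CQB alpha-amylase, and the CQB performance score.
cort_pre  <- c(7.1, 9.4, 6.2, 8.8, 10.1, 7.6, 9.0, 6.8)
saa_post  <- c(310, 420, 250, 390, 460, 280, 400, 330)
cqb_score <- c(4.2, 5.1, 3.9, 5.6, 5.9, 4.4, 5.3, 4.1)

# H6a: anticipatory stress vs performance (Spearman's rho when parametric
# assumptions are not met).
cor.test(cort_pre, cqb_score, method = "spearman")

# H6b: co-occurring noradrenergic stress vs performance.
cor.test(saa_post, cqb_score, method = "spearman")

# Non-parametric paired comparison (Wilcoxon signed-rank), e.g., for HR.
hr_pre  <- c(131, 140, 126, 137, 144, 129, 138, 133)
hr_post <- c(124, 134, 121, 133, 141, 127, 130, 124)
wilcox.test(hr_pre, hr_post, paired = TRUE)

# Multivariate outlier screening via squared Mahalanobis distance,
# flagged at p < .001 against a chi-squared distribution.
X  <- cbind(cort_pre, saa_post, cqb_score)
d2 <- mahalanobis(X, center = colMeans(X), cov = cov(X))
which(pchisq(d2, df = ncol(X), lower.tail = FALSE) < 0.001)
```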

Results

The outlier analysis yielded no significant d-squared values, indicating no outliers. CQB capability improved from before to after training, F(1,15) = 11.372, p = 0.004, η2P = 0.431 (H1). Examination of the simple main effects revealed that the intervention increased the CQB total score, F(1,15) = 99.501, p < 0.001, η2P = 0.869 (H1a), and the facet tactical behavior, F(1,15) = 51.177, p = 0.004, η2P = 0.773 (H1b). However, the facets weapon handling, gaze behavior, and response time (H1c–e) did not show significant improvement (Table 2). Special forces demonstrated higher CQB performance than non-specialized forces, F(1,15) = 82.240, p < 0.001, η2P = 0.846, with no group differences in the training effect, F(1,15) = 0.138, p = 0.715, η2P = 0.009, thus rejecting Hypothesis 2 (Fig. 1).

Table 2 Effects of group and training on the CQB performance facets
Fig. 1
figure 1

Total CQB performance training effects. Note. Comparison of the CQB performance before and after the training between special forces (SOF) and unspecialized soldiers; error bars indicate the standard error

The comparison between the salivary cortisol level before and after the CQB scenario showed no significant difference, F(1,32) = 1.217, p = 0.278, η2 = 0.017, thus rejecting H3a (Table 3). The investigation of the training effect revealed a reduced cortisol secretion after CQB training, F(1,32) = 6.647, p = 0.015, η2 = 0.048 (H3b). Additionally, a significant interaction effect between measurement (pre- and post-CQB) and time (pre- and post-training) was observed, indicating a reduction in anticipatory stress response due to training, F(1,31) = 1.018, p = 0.321, η2 = 0.008 (Fig. 2; H3c).

Table 3 Training effects on salivary cortisol concentration before and after CQB
Fig. 2
figure 2

Training effects on salivary cortisol concentration. Note. Comparison of the CQB training effect on pre- and post-CQB salivary cortisol concentration before and after the CQB training; error bars indicate the standard error

Examining the sAA levels revealed a higher concentration after the CQB execution compared to baseline, F(1,31) = 7.774, p = 0.009, η2 = 0.063 (Table 4; H4a). Further, the training had a reducing effect on sAA secretion, F(1,31) = 6.531, p = 0.016, η2 = 0.077 (H4b). However, no significant interaction effect was found between measurement (pre- and post-CQB scenario) and training (pre- and post-training), indicating no change in the pattern of the co-occurring stress response due to training, F(1,31) = 1.018, p = 0.321, η2 = 0.008 (Fig. 3; H4c). In addition, the median heart rate during the CQB scenarios decreased due to training, z = 3.003, p = 0.003, rrb = 0.628 (H5).

Table 4 Training effects on salivary alpha-amylase concentration before and after CQB
Fig. 3
figure 3

Training effects on salivary alpha-amylase concentration. Note. Comparison of the CQB training effect on pre- and post-CQB salivary alpha-amylase concentration before and after the CQB training; error bars indicate the standard error

Further, there was a positive correlation between the anticipatory stress response (sCort pre-CQB) and CQB pre-test performance (ρ = 0.512, p = 0.005; H6a), which was no longer significant in the CQB post-test (ρ = 0.365, p = 0.080). No association was found between the noradrenergic stress response (sAA post-CQB) and CQB performance in the pre-test (r = −0.121, p = 0.532) or post-test (ρ = −0.102, p = 0.642; H6b).

Discussion

This investigation aimed to enhance the understanding of the increasingly relevant field of CQB, with a particular focus on performance. The three main research objectives were as follows: examining the impact of an intensive CQB training on (1) the CQB facets and total performance and (2) the stress response, and (3) exploring the association between the stress response and CQB performance.

Due to the increasing relevance of CQB in the military and law enforcement, the role of individual skills in urban combat (Hills 2004), and the substantial training time required for this capability (King 2016), we investigated the impact of an intensive CQB training on police special forces and non-specialized soldiers. Given that longer CQB training programs, such as a 15-day course (Jensen et al. 2023), have already been examined, we aimed to inspect the effects of a shorter training. Overall, the compact training enhanced the CQB performance of all units.

Improvements were achieved in total performance and tactical behavior. The training, which focused on fundamental CQB principles such as the efficient opening of angles and room cutting, had the strongest effect on tactical behavior. While weapon handling and response time displayed a positive trend, they were not a core component of this training.

Notably, there was no observable change in gaze behavior, despite it being a fundamental aspect of the theoretical and practical training, which emphasized focusing on the most informative viewpoints and directing the gaze intentionally (using the apex as the point of interest). Gaze behavior, therefore, is a CQB facet that may not be effectively enhanced within a 12-h intensive training and requires more time and repetition. Research in elite sports has demonstrated that experts, owing to their superior ability to pick up cues, exhibit improved response time and visual search behavior characterized by fewer but longer fixations, leading to more efficient information uptake (Mann et al. 2007). This difference between experts and novices intensifies under stress, as the field of view narrows and the periphery becomes less visible, necessitating search patterns that extend into the periphery (Williams and Elliott 1999). This difference in gaze behavior was also supported in this study in the context of CQB, as the comparison between special forces and non-specialized soldiers revealed a significant difference, t(32) = −3.064, p = 0.004, d = −1.099. Gaze behavior could therefore be a valid criterion for assessing CQB automation and expertise, possibly serving as a metric for measuring training success.

Interestingly, despite the different baseline performance levels between the police special forces and the non-specialized soldiers, there were no differences in the training effect. The results suggest that a compact course in single-CQB is an effective intervention for all levels of experience.

In addition to the performance improvement, another effect of the CQB training was a reduction of the anticipatory (sCort) and accompanying (sAA, HR) stress reactions. The anticipatory cortisol peak was no longer present after the training. Furthermore, the post-CQB sAA concentration decreased after training, indicating a reduced stress response during CQB, further supported by the decreased median heart rate in CQB after the training. Overall, the training had a reducing effect on the physiological stress response in both non-specialized and special forces. It remains uncertain to what extent the reduced stress response can be attributed to habituation to the environment (e.g., location and trainer) or to increased self-efficacy and skill. In particular, the pre-test might have functioned as preparatory information, and Inzana et al. (1996) demonstrated that preparatory information prior to stressful events reduces anxiety and increases performance as well as confidence in one's abilities. Investigating whether the reduced stress response transfers to real CQB missions and how long it persists would be an essential research objective for deriving implications for military mission and training planning.

Finally, the relationship between the stress reaction and CQB performance was examined. Consistent with the catastrophe model of anxiety and performance (Hardy and Parfitt 1991), a higher cortisol concentration before the CQB pre-test was positively correlated with performance, aligning with the model's prediction of enhanced performance under moderate physiological arousal (Schmidt et al. 2020). A correlation between performance and the noradrenergic stress response, indicated by sAA, was not found, underscoring the distinctiveness of both systems (Granger et al. 2007). It remains an open question why cortisol, but not sAA, is associated with CQB performance. Cortisol is considered an indicator of low perceived control and predictability of a situation (Hellhammer et al. 2009). Analytical, anticipatory, and cautious behavior appears beneficial in CQB, so an increased cortisol concentration may enhance foresight and, therefore, CQB performance. This assumption is supported by the finding that CQB performance is negatively associated with extraversion (Ibrahim, submitted). The missing relationship between sAA and CQB performance could result from sAA being secreted more strongly in acute psychological stress reactions but less so in anticipatory ones, thus having less impact on CQB procedures.

These findings contribute to the current research by demonstrating that even a compact 12-h CQB training can significantly enhance tactical performance and reduce stress responses, aligning with earlier studies on stress inoculation and training impact. The results add nuance by showing that while tactical behavior improves, gaze behavior requires more time to develop, reinforcing the importance of extended practice. Additionally, the relationship between cortisol levels and performance underlines the role of physiological arousal in CQB tasks, offering a refined understanding of stress-performance dynamics in high-pressure situations.

Limitations

While this study provides important insights into the psycho-physiological aspects of CQB, it also has limitations that could be addressed in future research. The primary limitation is the small sample size. Although the observed effect sizes were large enough to detect training effects, future investigations should target a larger post-assessment sample to reliably evaluate smaller effects on physiological indicators. Another issue was the technical implementation of the study: HR measurement or eye-tracking failed in multiple instances, rendering the respective measurement unusable and producing missing data points. A potential adaptation for future studies could involve a less dense experimental schedule to allow equipment functionality to be verified more thoroughly. A further source of unsystematic variance was the influence of the time of day on cortisol concentration, which follows a circadian rhythm. The assessment period from approximately 10 am to 5 pm might have influenced stress reactions, since cortisol concentration typically peaks in the morning (Bailey and Heitkemper 1991). This may also explain the relatively large error bars depicted in Fig. 2.

Future studies could consider increasing stress levels, for instance, by conducting force-on-force (FoF) scenarios, which induce a significantly higher stress level than cardboard-target scenarios (Taverniers and De Boeck 2014). Lastly, it should be noted that only single-CQB scenarios were conducted here, whereas CQB is predominantly performed in squads or small teams. Thus, future investigations, particularly those focusing on small groups and team CQB, should be conducted to create conditions closer to real operational scenarios.

Conclusion

Overall, this study demonstrates that even a 12-h compact training positively impacts CQB performance. In particular, the facet of tactical behavior can be addressed within a limited time frame. Gaze behavior appears to be a valid indicator of CQB expertise. Furthermore, the training reduced the anticipatory and accompanying stress response regarding CQB. Finally, a positive association was found between the anticipatory stress response and CQB performance, suggesting that increased caution and situational awareness positively affect CQB performance. Lastly, this study's CQB observation instrument can serve as a template for detailed performance assessment, further standardizing training evaluation and enhancing the force's urban operation capability in the long run.