Introduction

A mass casualty incident (MCI), as defined by the National Emergency Medical Services Information System (NEMSIS; Salt Lake City, UT) [1], is “an event (that) generates more patients at one time than locally available resources can manage using routine procedures or resulting in a number of victims large enough to disrupt the normal course of emergency and health care services, and (that) would require additional non-routine assistance.” MCIs, also called mass casualty events (MCEs) globally, include exposure to nuclear, biological, or chemical (NBC) agents that can be generated either by human activity or by natural disasters [2]. Although infrequent, MCEs represent major public health emergencies, as they can quickly overwhelm even well-resourced hospitals with an onslaught of tens to thousands of critically injured individuals. Additionally, hospitals themselves have been subjected to major incidents such as power loss, flooding, or fire [3]. From 2000 to 2019, MCEs claimed 1.23 million lives and cost approximately US$2.97 trillion, a sharp increase over the previous twenty years that has been posited to be related to an increase in weather-related disasters [4, 5].

Reviews of past incidents and recent surveys in the United States (US) have shown that hospital staff and resident physicians (RPs) remain uncomfortable with current disaster mitigation materials and protocols [2, 6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26,27,28,29]. Insufficient training and inadequate equipment not only cost lives but also put hospital staff at risk, as experienced during Hurricane Katrina [20]; the Fukushima, Japan nuclear disaster [30]; and the COVID-19 pandemic [25, 26]. Furthermore, this lack of preparedness can disrupt residency training [25, 26] and produce psychological effects [25, 26, 31]. To optimize MCE responses in the hospital/acute care setting, a disaster response protocol must be in place and simulated regularly. Although protocols have been implemented in hospitals throughout the US [32], most have not been designed or simulated to account for RPs [9, 10, 14,15,16,17, 20, 21, 25, 26, 28]. Existing training has several shortcomings, including communication barriers (among staff or between staff and patients), insufficient emphasis on triage, and poorly defined roles and responsibilities [17, 19, 20, 22, 28, 33, 34].

An organized, system-wide approach that provides clarity regarding staff roles and responsibilities contributes to successful disaster planning and recovery [33, 35]. Regular drills and simulations in physicians’ training were recommended by the Association of American Medical Colleges (AAMC) in 2003 [36]. Because rapid decision making and triage of patients are part of their everyday practice, surgeons, particularly trauma surgeons, are assumed to occupy leadership roles during a MCE [7, 9, 10, 13, 17, 37]. This was conveyed explicitly by the American College of Surgeons (ACS) [39], which adopted (also in 2003) a position statement drafted by its Committee on Trauma stating that “surgeons ought to attain an appropriate level of education and training in the unique principles and practices of disaster and mass casualty management, and to serve as role models in this field” [17, 33, 38, 39]. Nevertheless, surveys since then have shown not only that general surgery (GS) and emergency medicine (EM) RPs, who serve as trauma team members (TTMs) in hospitals, are underprepared [7, 9, 10, 13,14,15,16,17], but also that GS residents (GSRs) are even less prepared than EM residents (EMRs) [14, 17].

Based on the results of the several studies referenced above [7, 9, 10, 13,14,15,16,17], we distributed a questionnaire in two phases at our institution, an urban level I trauma center in New York City (NYC), the nation’s highest-risk area for MCEs [40, 41]. The first phase was designed to compare the preparedness of GSRs with that of other TTMs (i.e., EMRs, EM physician assistants [EMPAs], and GS physician assistants [GSPAs]). In the second phase, we evaluated the extent of disaster preparedness training for RPs across all specialties not represented on trauma teams as outlined above. We tested the hypothesis that, regardless of specialty, RPs are not familiar with the disaster response plans and protocols in place for MCEs and need additional training to ensure an effective and safe response should such an event occur.

Methods

An electronic survey (Supplementary Information 1 and 2) was distributed in two phases to RPs, as well as to program directors and associate program directors (PDs/APDs; second phase only), of all accredited residency training programs across all specialties at our urban, university tertiary/quaternary medical center, which serves as a regional referral center for injury care; non-accredited programs were excluded. The survey, consisting of multiple-choice and free-response questions, was distributed from May 26 to October 13, 2018 (first phase) and from February 2 to March 20, 2020 (second phase).

Residency training PDs/APDs were recruited by e-mail using an internal directory and asked to participate. That recruitment e-mail message offered no incentive for participation. If there was no response after four requests, the program was considered a non-responder. After agreement, the survey was disseminated directly to RPs and PDs/APDs, or via the program coordinators of participating departments, with an introductory e-mail message that, by contrast, included a lottery for a modest cash prize as a means of compensation for the time devoted to participation. Follow-up e-mails were distributed using the contact information provided by PDs/APDs/program coordinators. All responses were kept anonymous.

The survey collected demographic information, including name and location of program, role of the respondent, and postgraduate year (PGY) level, along with questions specific to MCE preparedness competencies addressed during training and the teaching methodologies used. The MCE preparedness core competencies assessed were those recommended by the National Standardized All-Hazard Disaster Core Competencies in Disaster Medicine [42]. Data were collected when participants accessed the e-mailed survey link. De-identified data were housed in a secure online database; the Research Electronic Data Capture database (REDCap®; Vanderbilt University, Nashville, TN) was used under institutional license for data acquisition and analysis. The study was approved by the Committee on Human Rights in Research of Weill Cornell Medical College with waiver of informed consent (Protocol #1810019635).

Primary outcomes were RPs’ awareness of their institutional and departmental MCE plans, their perceived degree of preparedness, and self-reported institutional MCE training and teaching during residency. Secondary outcomes included differences in self-reporting between junior RPs (PGY-1, -2) and senior RPs (PGY-3+), and differences in self-reported awareness and preparedness between RPs in non-surgical specialties and surgical subspecialties. Additionally, PDs’/APDs’ perceptions of the appropriateness of time spent on MCE training, and the barriers they reported, were assessed.

To compare TTMs, including EMRs and GSRs, with RPs from other medical specialties, we consolidated data from both phases of the survey and analyzed them together whenever applicable. The survey tool used in the first phase was more concise, focusing on awareness, knowledge of protocol, certainty regarding points of contact, responsibilities, and reporting procedures. The second phase additionally assessed RPs’ level of seniority and self-perceived confidence. Furthermore, the assessment of advanced courses in the first phase was limited to Advanced Trauma Life Support® (ATLS®; American College of Surgeons, Chicago, IL) and the Disaster Management and Emergency Preparedness course (DMEP; American College of Surgeons). Consequently, our comparative analysis between perceived levels of preparedness and certification in advanced courses was based solely on these two courses.

Statistical analysis

PDs/APDs were excluded from the analysis of self-assessed MCE plan awareness and perceived preparedness (discussed in the corresponding Results subsection). Descriptive results are reported as n (%) or median with interquartile range (IQR).

The Wilcoxon rank-sum test was used to compare responses to Likert scale questions between groups. The Fisher exact test was used for multiple-group comparisons. All p values are two-sided with statistical significance evaluated at the 0.05 alpha level. Data were compiled using REDCap® software. All analyses were performed in R Version 4.0.2 (R Foundation for Statistical Computing, Vienna, Austria).
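
As a minimal illustration of these methods, the following sketch in R (the analysis environment cited above) applies a Wilcoxon rank-sum test to Likert-scale responses from two groups and a Fisher exact test to a small contingency table; all response values and counts shown are hypothetical, not study data.

    ## Minimal sketch of the statistical comparisons described above.
    ## All values are hypothetical illustrations, not study data.

    # Hypothetical Likert responses (1 = "not at all" ... 5) for two groups
    aware   <- c(2, 1, 2, 3, 2, 2, 1, 2)
    unaware <- c(1, 1, 1, 2, 1, 1, 1, 1)

    # Descriptive summary: median with interquartile range (IQR)
    median(aware)
    quantile(aware, probs = c(0.25, 0.75))

    # Wilcoxon rank-sum test comparing Likert responses between groups
    # (two-sided by default; significance evaluated at the 0.05 alpha level)
    wilcox.test(aware, unaware)

    # Fisher exact test for a multiple-group comparison of categorical
    # responses: rows = groups, columns = response categories
    tab <- matrix(c(10, 4, 2,
                     3, 9, 7),
                  nrow = 2, byrow = TRUE)
    fisher.test(tab)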

Results

Characteristics of study subjects

A total of 123 of 166 individuals (74%) responded to the first phase; respondents comprised 79 EMRs and GSRs and 44 EMPAs and GSPAs. Among the 21 programs surveyed in the second phase (excluding EM and GS), 17 (81%) responded, whereas 34% of individuals (129/380) responded to the second survey, of whom 117 were RPs and 12 were PDs/APDs. Combining the two surveys, the response rate was 46% (252/546). Specialties represented by RPs and PDs/APDs (PAs excluded) (n = 208) were anesthesia (2 respondents, 1.0%), dermatology (2, 1.0%), EM (48, 23.1%), GS (31, 14.9%), internal medicine (50, 24.0%), neurology (9, 4.3%), obstetrics/gynecology (6, 2.9%), ophthalmology (1, 0.5%), oral/maxillofacial surgery and dentistry (4, 1.9%), otolaryngology-head and neck surgery (5, 2.4%), pathology (13, 6.2%), pediatrics (1, 0.5%), plastic surgery (5, 2.4%), psychiatry (15, 7.2%), radiation oncology (2, 1.0%), radiology-diagnostic (8, 3.8%), radiology-interventional (1, 0.5%), rehabilitation medicine (2, 1.0%), and urology (2, 1.0%). Among the respondents, 55 (26.4%) were junior RPs and 62 (29.8%) were senior RPs of these specialties; 12 (5.8%) were PDs/APDs (Supplementary Information 3).

RPs’ self-assessment of MCE plan awareness and perceived preparedness

The survey assessed RPs’ awareness of an institutional mass casualty response plan (yes, no, or unsure) through the questions and responses shown in Table 1. Results from the combined surveys showed that 53% (n = 103) of RPs were unaware of such a plan. In our second-phase survey (which excluded TTMs), 78% (n = 92) of respondents did not know or were unsure whether they were expected to contact someone; this proportion was 73% (n = 143) when the surveys were combined (TTMs included). Awareness of formal training was not assessed in the first-phase survey, but when non-TTM RPs were asked about formal MCE training offered by their program, 94% were unsure whether such training existed, and 81% reported no formal MCE training in the year prior (Table 2). Despite the fact that TTM RPs are more likely to receive formal MCE training at our institution, a majority (68%, n = 134) of all RPs (including EMRs and GSRs) reported no formal MCE training over the past year (Table 1).

Table 1 Awareness of respondents to mass casualty event (MCE) plans
Table 2 Resident physicians’ (RPs) self-assessment of mass casualty event (MCE) plan awareness and preparedness across all non-trauma team specialties (emergency medicine and general surgery excluded)

We asked specifically about advanced training courses related to trauma and disaster/MCE preparedness taken during residency training, some of which are mandatory. These courses included Advanced Cardiac Life Support (ACLS; American Heart Association, Dallas, TX) and ATLS®. Whereas neither ACLS nor ATLS® is an established component of disaster medicine (DM) preparedness, several aspects of these courses are relevant. Our consolidated findings revealed that 97% (n = 190) of respondents had completed ACLS, whereas 40% (n = 67) had undergone ATLS® training. Specific disaster/MCE preparedness courses were seldom taken. Advanced Disaster Medical Response (ADMR; American Association for the Surgery of Trauma, Chicago, IL) training had been taken by 4% (n = 5) of participants, while Disaster Management and Emergency Preparedness (DMEP; American College of Surgeons), Advanced Disaster Life Support (ADLS®; National Disaster Life Support Foundation, Augusta, GA), and Fundamental Critical Care Support (FCCS®; Society of Critical Care Medicine, Mount Prospect, IL) had each been taken by 3% (n = 4), although not by the same individuals.

Regarding self-assessment of perceived preparedness, responses were consolidated similarly (Supplementary Information 4). Awareness of the existence of an MCE response plan across all residents correlated significantly with self-assessed preparedness, as summarized in Table 2; however, the differences in perceived preparedness between participants who were aware and unaware of the plan were slight. RPs who were aware of the plan (n = 93) had a “faint idea” of how knowledgeable they were about it (median 2.00 [IQR 1.00, 2.00]), whereas unaware RPs “had no idea” (median 1.00 [IQR 1.00, 1.00]) (p < 0.001) (Table 2). A more robust relationship was identified between having knowledge of, or certainty about, responsibilities during a MCE and possessing ATLS® or DMEP certification (Tables 3 and 4). The median response for level of knowledge of the institutional MCE plan was “I have a faint idea” (Likert 2) for ATLS®- or DMEP-certified RPs (median 2.00 [IQR 1.00, 2.75]) vs. “I have no idea” for residents not certified (median 1.00 [IQR 1.00, 1.00]) (p < 0.001). Similarly, the median response for level of certitude about responsibilities during a MCE was “slightly” (Likert 2) for ATLS®- or DMEP-certified RPs (median 2.00 [IQR 1.00, 2.75]) vs. “not at all” for residents not certified (median 1.00 [IQR 1.00, 1.00]) (p < 0.001).

Table 3 Resident physicians’ (RPs) self-assessment of certification in Advanced Trauma Life Support® (ATLS®) and mass casualty event (MCE) plan preparedness across all specialties, including emergency medicine and general surgery
Table 4 Resident physicians’ (RPs) self-assessment of certification in Disaster Management and Emergency Preparedness (DMEP) and mass casualty event (MCE) plan preparedness across all specialties, including emergency medicine and general surgery

Because confidence and level of seniority were not assessed in the first-phase survey, the analyses shown in Tables 5 and 6 reflect only non-TTM RPs. No differences were observed between RPs from non-surgical and surgical subspecialties regarding self-assessed preparedness. However, RPs from surgical subspecialties rated themselves more confident than non-surgical RPs (p = 0.031); despite this, the median response in both groups indicated that they were “not at all” confident (Table 5). Senior residents (PGY-3+) rated themselves more prepared than junior residents regarding the hospital/institution’s MCE response (both p < 0.05), although the median response was “not at all” in both groups; otherwise, level of seniority in training was not associated with participants’ self-assessment of preparedness (Table 6).

Table 5 Resident physicians’ (RPs) self-assessment of mass casualty event (MCE) plan awareness: non-surgical specialties vs. surgical specialties, emergency medicine and general surgery excluded
Table 6 Self-assessment of mass casualty event (MCE) plan awareness comparing junior resident physicians (RPs) vs. senior RPs, emergency medicine and general surgery excluded

RPs’ perceived lack of preparedness was assessed via their perceived lack of training, which was evaluated with the following three questions: (1) In your opinion, how effectively has your DEPARTMENT prepared you for a MCE?; (2) In your opinion, how effectively has your HOSPITAL prepared you for a MCE?; (3) As a health care provider, how LIKELY do you think a MCE (natural or man-made) will happen in the next 5 years in the area where you are training? Consolidated possible responses are shown in Supplementary Table 1. The median perceived departmental effectiveness of MCE preparation among all EM RPs was “somewhat effective” [median 2.00 (IQR 1.50, 2.00)], whereas the median value for all other RPs across non-surgical and surgical specialties was “not at all” [median 1.00 (IQR 1.00, 2.00)] (p = 0.001). The median for perceived hospital preparedness among EM RPs was also “somewhat effective” [median 2.00 (IQR 1.00, 2.00)], but, with a smaller distribution, it was not significantly different from all other specialties, for which the median was “not at all” [median 1.00 (IQR 1.00, 2.00)] (p = 0.110) (Table 7).

Table 7 Resident physicians’ (RPs) perceived mass casualty event (MCE) preparedness by department and hospital across all specialties, including emergency medicine and general surgery specialties

RPs’ perceived lack of training was further evaluated using a 4-point Likert scale (1: a lot less than needed; 2: somewhat less than needed; 3: just the right amount; 4: somewhat more than needed) with the following question in the second survey (TTMs excluded): “In your opinion, how would you rate the amount of time your residency program spends on mass casualty training?” Our analysis revealed that one-half of RPs (n = 58) estimated that the time spent on training was “somewhat less than needed,” and 42% (n = 49) estimated that it was “a lot less than needed” (Table 8).

Table 8 Assessment of resident physicians’ (RPs) and program directors (PDs)/associate PDs (APDs) training courses taken in trauma/mass casualty event (MCE) preparedness, and perceived time devoted to MCE training by residency programs, emergency medicine and general surgery excluded

PDs’/APDs’ perception of MCE training importance and barriers encountered

The median response of “somewhat less time than needed” from PDs/APDs was similar to RPs’ responses regarding the amount of time devoted to MCE in residency training (Table 8). Nevertheless, the importance of MCE training during residency was rated as only “slightly” important by 44% (n = 4) of directorial respondents. Furthermore, 22% (n = 2) were of the opinion that MCE training was “not at all” important, whereas another 22% (n = 2) considered it “somewhat” important, and a single respondent believed that it was “moderately” important. No one reported MCE training as “extremely” important. Barriers to providing disaster education commonly identified by PDs/APDs were “limited time” (75%), “limited infrastructure” (41.6%), and “financial” barriers (41.6%). One director (8.3%) cited an “other” barrier: a lack of information regarding the required training for residents in their specialty (Table 9).

Table 9 Program director/associate program directors’ (PDs/APDs) assessment of mass casualty event (MCE) training with respect to level of importance, and self-reporting of barriers to provide MCE training in residency programs

Discussion

In previous studies highlighting RPs’ lack of disaster preparedness, the principal focus was on TTMs (e.g., EMRs [13,14,15,16,17, 20] and GSRs [7, 9, 14, 17, 20, 22, 25, 26]). Although trauma teams are first responders during a MCE, scarce resources, patient volumes that exceed surge capacity, and special-needs populations, among other factors, may require early, active participation of RPs across all specialties. Therefore, there is a need to increase knowledge and awareness among all specialty residency programs.

To date, few studies have investigated RPs’ disaster preparedness in specialties other than GS and EM, such as anesthesia [21] and orthopedic surgery [18], and neither of those studies conducted a comparative analysis with other specialties. Uddin et al. [28] addressed the existing gap in disaster and emergency preparedness training across various medical specialties; although they acknowledged the need for improvement, their focus was primarily on specific fields such as surgery, anesthesiology, EM, pediatrics, and family medicine, and they described a new curriculum in preventive medicine, evaluating the progress of 15 residents who followed it. Given that RPs of all specialties constitute nearly 15% of the US physician workforce [43, 44], we emphasize the importance of incorporating MCE training across all specialties rather than limiting it to selected ones. To our knowledge, this comprehensive approach has yet to be addressed in the existing literature.

Because RPs’ perceptions of preparedness across specialties have not been surveyed heretofore, we collaborated with all accredited residency training programs at our institution to assess the perceived level of preparedness of RPs, and to inquire about barriers faced by PDs/APDs to the implementation of MCE preparedness training. According to these results, further study can focus on more thorough educational needs assessments in MCE preparedness and care based on medical and surgical specialties, with the goals of developing curricula and advocating for changes to residency education relevant to the entire clinical enterprise and each specific department. Curricular content, teaching methodologies, and assessment tools may vary among the departments and reflect their needs.

Despite the vulnerability of NYC to MCEs [40, 41] and the disruption of training [25, 26] that would ensue, these data indicate that RPs across all specialties at a level I trauma center in NYC are underprepared. Awareness of an institutional MCE plan was not a strong predictor of preparedness: the sole statistically significant difference was a response only slightly higher among physicians who reported awareness of a plan than among those who were unaware. This suggests that, in times of disaster, even physicians aware of the plan may still rely on directives from individuals whom they know to contact. Moreover, subspecialty training did not predict self-assessment of preparedness (although sample sizes were small), nor did PGY seniority.

Previous studies indicate that EM PDs consider disaster preparedness a highly desired curricular addition [13, 16]; by contrast, our surveyed PDs/APDs considered that, although “somewhat” less time than needed was devoted to MCE training in residency, adding MCE training to their programs was only “slightly” important. This response highlights the contrast between PDs in specialties with a primary focus on disaster and emergency management, such as EM, and leaders of other disciplines less oriented toward this field. During MCEs, however, effective communication among specialties such as EM, GS, and others becomes crucial. Certainly, the training and education of RPs as TTMs would differ from those of RPs in various other specialties.

Providing a more detailed description of the specific training objectives for residents in different specialties could underscore the importance of this training for program leadership. It may prompt leaders to view the training as more than “slightly” important and to recognize that the current allocation of time might be insufficient. As highlighted by one of the surveyed PDs in the open-comment section of the survey, there is a recognized need for specific, detailed information regarding training requirements for residents in different specialties. Designing curricular content for various specialties is a major consideration. Even so, the responsibility for MCE training must be undertaken by the institution or department, for all staff members, including trainees. Alternatively, once training requirements are defined, outsourcing to a contracted third party, such as the National Training and Education Division of the Federal Emergency Management Agency (FEMA) [45], is a possibility.

Available time for instruction was of particular concern among leaders. Although PDs/APDs acknowledged that MCE training in residency receives less attention than required, envisioning how this training can be integrated into an already “crowded” and demanding training environment that prioritizes clinical responsibilities is a challenge. One PD, in another comment, captured this sentiment well: “I would be in favor of instituting some MCE education, but it should be limited in depth/scope such that the objectives can be achieved in a limited amount of time. Remember that there is an opportunity cost associated with adding to the curriculum; some other worthy topic will necessarily receive less time.” This valid concern could be addressed through technologies whose use expanded rapidly during the pandemic, such as online access to courses and tests at times more convenient for trainees. Moreover, as Haug et al. [46] described the anticipated enhancements in quality of care from the use of artificial intelligence (AI), one must recognize the possibility that AI will become an integral part of physician training sooner than anticipated.

“Finance” was cited as a barrier second only to “time.” Lack of governmental funding, whether in the US [47] or globally [48], may contribute to the lack of hospital staff training for MCEs. Technological advances (e.g., simulation) could reduce the cost of MCE preparedness exercises [34, 37, 49]. Whereas in-person training with medical role players remains preferable, the emergence of AI could offer future opportunities for simulation, enabling trainees to practice, and programs to validate, some MCE training components in simulation centers. Other components of MCE training, such as triage, decontamination, or resource allocation, could be incorporated into didactic course segments or mandatory hospital training, in parallel with the use of machine-learning technology [50]. Another option would be a centralized course covering components common to all specialties, delivered as part of graduate medical education (GME) orientation at the start of residency.

Crucial to the wider availability of MCE/DM training will be widespread recognition that it, too, is a core component of training for more than just GS and EM specialists [27, 29, 37, 47]. Creating plans of action tailored to each specialty, not just to EM [13, 15, 16, 42], combined with identification of impediments to implementing disaster preparedness, will help convey the need to department chairs, hospital administration, governmental institutions, payors, and the Accreditation Council for Graduate Medical Education (ACGME). Drawing on the training disruption they experienced as residency graduates during the pandemic, Rojek et al. [51] further emphasized the need to expand residency education beyond traditional bedside clinical care and advocated for a broader curriculum in residency training, encompassing proficiency with the electronic health record, collaboration with specialists, and navigation skills within complex healthcare systems. They proposed collaborative efforts among residency programs, health systems, and regulatory bodies such as the ACGME to integrate RPs into system-level initiatives aimed at better preparing residents for the complexities of modern healthcare delivery, including MCEs. This integration would involve participation on committees, quality-improvement projects, and other initiatives aimed at enhancing healthcare cost-effectiveness, equity, quality, and structural innovation. Nevertheless, successful implementation of an amalgam of orientation, didactics, frequent drills, and workshops that emphasize roles, responsibilities, triage, and effective communication under high-stress conditions will also require proper evaluation tools [42, 52].

There are several limitations to this study. These data were collected before the outbreak of COVID-19, which reached NYC in March 2020. We can speculate that, prior to the severe resource shortages generated by the pandemic, healthcare providers in most hospitals did not associate the necessity of MCE training for multiple-trauma events with other events that also generate a need for resource re-allocation, including re-assignment of duties. Given the impact of the COVID-19 pandemic on residency training [53, 54], it is possible that PDs/APDs would now gauge the importance of MCE training during residency as more than “slightly” important. As mentioned previously, residency graduates have underscored the need to address training disruptions caused by the COVID-19 pandemic, including staffing shortages, virtual visits supplanting in-person learning experiences, and high patient census [51]. Thus, it is possible that both PDs/APDs and RPs would respond to this survey at higher rates today, given the urgent need to address these residency training gaps.

Another limitation is the relatively low response rates from PDs/APDs and non-TTM RPs, making it difficult to generalize the data across institutions and specialty types. A self-selected response group (such as herein) may introduce bias if responding programs/participants represent those with the most (or least) comfort with and interest in DM, but this is a matter of speculation. We did not assess specific indicators of what existing training encompasses.

The combination of two similar but not identical survey instruments introduced limitations from a statistical standpoint. First, measurement bias may have been introduced, as the alignment of responses for analytic purposes may not fully capture the nuances of the constructs being assessed. Moreover, the temporal gap between survey phases might raise concern about the comparability of the data due to potential changes in population characteristics over time, although turnover is inherent to residency training. The study design also precluded the application of advanced statistical techniques, such as multivariable analyses or structural equation modeling.

Conclusion

Although MCE preparedness is garnering increased attention and a variety of educational tools exist today to teach DM in US residency programs, trainees across all programs at a university level I trauma center remain unaware of, and unprepared for, a disaster in a metropolitan area of an estimated 20 million individuals that has already been the target of multiple MCEs [2, 6, 19, 40, 41, 55]. This survey highlights the need and the opportunity for the creation of an educational model in DM, not only for TTMs but also for RPs across all medical specialties.