Pursuing a surgical fellowship has become the rule rather than the exception. In fact, studies have shown that approximately 80% of graduating surgery residents apply for fellowships [1, 2]. While residents may have many reasons for pursuing fellowship training [3, 4], the application process also carries unintended consequences, including, but not limited to, a significant cost and manpower burden [5].

There are multiple organizations overseeing fellowships, including the Accreditation Council for Graduate Medical Education (ACGME), the Fellowship Council (FC), the San Francisco Match, and the Society of Surgical Oncology. As more residents apply for fellowships, the number of programs each applicant applies to tends to increase. This trend may reflect a perceived increase in competitiveness, which compels residents not only to apply to more programs but also to accept more interviews. Applicants must pay for these applications while also bearing the cost of travel for each interview. These costs can be substantial; for pediatric surgery interviews, studies have found that applicants spent on average over $8,000 [6, 7]. This expense can strain residents’ finances and may disadvantage qualified residents of lower socioeconomic status. The financial burden of interviews is mirrored by the clinical duties missed while attending them. One survey study documented that residents missed, in aggregate, at least one week of work for interviews, which seems to be a conservative estimate [8]. With numerous fellowship interviews occurring during similar time periods, gaps in clinical service coverage frequently result. Additionally, because of local and national regulations limiting the amount of time that can be missed, residents often use vacation time for interviews, which in turn may strain resident wellness [7].

Fellowship programs must also take time away from clinical activities and incur costs to conduct interviews. Significant administrative time is required to review applications; in addition, clinics or operative days must often be canceled to accommodate faculty participation in the interview process. Accordingly, there is a difficult-to-quantify cost associated with this loss of clinical productivity. In some cases, programs also choose to provide housing accommodations for their applicants, incurring further costs. Attempts to quantify these costs for general surgery residency interviews have estimated that programs spend, on average, over $1,200 per candidate interviewed [9].

Clearly, there is a significant, yet variable, burden associated with interviews for both applicants and programs. Furthermore, there is limited documented information, whether objective data or subjective perceptions of applicants and programs, regarding current practices. The FC has successfully conducted its match process since 2004, but these issues have not yet been explored [10]. Given the competitive nature of FC fellowships (with a current match rate of 64%) [11, 12] and the aforementioned issues, the FC began planning a quality improvement initiative for its interview and match processes. However, the onset of the COVID-19 pandemic, which disrupted the 2020 FC interview season, caused a rapid, unprecedented, and nearly universal shift to virtual interviewing formats [13]. This transition forced the FC community to adopt new interview methods and provided the opportunity to evaluate these novel processes.

This study aimed to describe traditional FC interview practices and then further analyze perceptions of applicants and programs regarding the impact of COVID-19 on the interview process, compare expected versus actual resource and financial utilization, and inquire about preferred future practices. We hypothesized that the virtual format would drastically reduce costs and still provide an adequate form of assessment, making it a viable option for the future.

Methods

This study was reviewed and approved by the University of Texas Southwestern Medical Center Institutional Review Board, the Fellowship Council Research Committee, and the Fellowship Council Board of Directors. Applicant and program surveys were developed through iterative revision until consensus was reached. The surveys focused on various components of the interview process, including cost, time burden/work-days missed, effectiveness of different types of interviews, and the unanticipated changes secondary to COVID-19. The FC distributed the surveys electronically to both applicants and programs immediately following the 2020 main Advanced GI/MIS/Bariatric/Endoscopy/HPB match rank submission. Completion of the surveys was voluntary and anonymous.

The applicant survey (Appendix A) included questions regarding the number of programs applied to, the number of interviews offered, and the number accepted. Additionally, for each type of interview location (in-person, centralized, virtual, etc.), the survey assessed the number of interviews offered, overall cost, time, and effectiveness in informing rank list decision-making on a Likert scale from 1 (strongly disagree) to 5 (strongly agree). Questions also assessed interview format and cost changes associated with the transition to virtual interviews due to the pandemic and whether these changes affected applicants’ comfort in making a rank list. Preferences for future formats and optional comments were obtained.

The program survey (Appendix B) included questions regarding the number of applications received, the number of interviews offered, and the number actually performed. Additionally, questions assessed the type of interview location each program had planned and the resources (days, hours, and number of interviewers) it expected to commit. Finally, the survey assessed changes associated with the transition to the virtual format and its effectiveness in informing rank decision-making on a Likert scale from 1 (strongly disagree) to 5 (strongly agree); Likert scores of 4 and 5 were considered sufficiently informative. Preferences for future formats and optional comments were solicited last.

Data were analyzed after removal of outliers (values more than 1.5 times the interquartile range [IQR] beyond the quartiles) and are reported as median [IQR]. Descriptive statistics and non-parametric analysis with Mann–Whitney U tests were performed in R using RStudio®. A p-value of < 0.05 was considered significant. Free-form comments underwent thematic analysis and were grouped by recurring concepts for further review.
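For illustration only, the outlier rule and the non-parametric comparison described above can be sketched as follows (shown here in Python with NumPy/SciPy rather than the R/RStudio workflow actually used; the cost values and variable names are hypothetical, not study data):

import numpy as np
from scipy import stats

def remove_outliers(values):
    # Drop values lying more than 1.5 * IQR beyond the first or third quartile.
    q1, q3 = np.percentile(values, [25, 75])
    iqr = q3 - q1
    lower, upper = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    return [v for v in values if lower <= v <= upper]

# Hypothetical per-applicant interview cost estimates (USD); not study data.
expected_costs = [4750, 2000, 6000, 5200, 17000, 3800, 4400]
actual_costs = [1000, 250, 2250, 900, 400, 1500, 1200]

expected_clean = remove_outliers(expected_costs)
actual_clean = remove_outliers(actual_costs)

# Non-parametric comparison of the two groups (Mann-Whitney U), as in the study.
u_stat, p_value = stats.mannwhitneyu(expected_clean, actual_clean, alternative="two-sided")
print(f"median expected = {np.median(expected_clean):.0f}, "
      f"median actual = {np.median(actual_clean):.0f}, p = {p_value:.4f}")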

Results

Applicant response rate was 53% (140/265 certified applicants). Advanced GI/MIS Bariatrics (n = 107) and Advanced GI/MIS (n = 97) received the majority of applications (Table 1). Applicants submitted a median of 27.5 (13.25–40) applications to 2 (2–3) types of program designations (Fig. 1). Note that the FC allows applicants to apply to multiple program designations in a single cycle; no applicant applied to all six program designations. Based on these medians, applicants received interview offers from 36% (10/27.5) of the programs to which they applied and accepted and ranked the majority of these offers. Applicants received a median of 10 (4–17) interview invitations and accepted 10 (4–15.25) interviews. The median number of programs ranked was 10 (5–15). Approximately 70% of applicants ranked every program at which they interviewed.

Table 1 Program designation application distribution
Fig. 1 Breakdown of the number of program designations to which applicants applied. Numbers 1–5 indicate the number of program designations per applicant. Most commonly (30%), applicants applied to two types of program designations.

While applicants’ original plans included 9 (3.5–14) in-person interviews, only 30% of their interviews were actually completed in person. In fact, 27% of applicants did not complete any in-person interviews. Due to the COVID-19 pandemic, 6.5 (3–10) interviews had format changes, representing approximately 74% of interviews. Virtual pre-interviews were uncommon prior to COVID-19, with applicants receiving 0 (0–1) pre-interview offers and 73% of applicants attending none. Applicants attended 7 (3–11) virtual interviews, compared with none initially planned, making virtual interviews the most common interview format in this cycle. The cancelation of all major in-person national meetings during the interview season also prevented applicants from participating in centralized (e.g., national meeting) interviews. Most applicants felt that in-person interviews (90%, average score 4.39) and virtual interviews (81%, average score 3.9) were sufficiently informative for rank list decision-making.

Applicants were asked to report both their expected expense had interview formats not changed and their actual cost. Four data points with estimated costs greater than $17,000 were removed from analysis as outliers. The resulting total expected cost was $4,750 ($2,000–$6,000), or $536 ($330.65–$863.45) per interview, while the actual cost was $1,000 ($250–$2,250), or $125 ($41.67–$250) per interview (p < 0.05). Additionally, applicants estimated that in a normal interview year they would have missed 10 (5–16) work-days; for this cycle, actual missed work-days were significantly fewer at 3 (0–6.25) (p < 0.05) (Table 2).

Table 2 Applicant resource utilization changes

Despite the transition to virtual interviews, 75% of applicants felt they were able to get enough information from the interview process to make an informed decision in their rank list. When asked how applicants would like interviews to proceed in the future, 45% preferred an in-person interview after a preliminary virtual interview and 29% preferred virtual interviews only (Table 3).

Table 3 Future interview format preferences

Specific comments from applicants indicated that this year’s interview changes saved a large amount of money, reduced time previously spent traveling, and decreased disruption to clinical duties. While most reported that the virtual interview format still provided adequate information to create a rank list, some commented that they sacrificed the “true” experience and feel of programs. Additionally, they were less able to evaluate nonverbal cues, observe faculty and fellow interactions, or discuss shared experiences and knowledge about programs with those they met on the interview trail. They also noted that if virtual interviews were to continue, greater consistency in platforms and format would be needed. When asked about the future of interviews, applicants indicated that virtual interviews would be a good way for programs and applicants to screen one another; once screening was complete, applicants could visit fewer programs on-site and programs could offer fewer on-site interviews. There was, however, concern that if programs offered both virtual and on-site interviews, rank lists could be biased, with programs more likely to rank candidates who “made the effort” to come in person rather than recognizing that those candidates simply had the financial means and time to do so.

For the program surveys, the response rate was 38% (55/143 certified programs). Responses came from 51% (31) of the Advanced GI and Advanced GI/MIS programs, 29% (12) of the Advanced GI/MIS Bariatric programs, and 35% (9) of the Bariatric programs. Flexible Endoscopy and Hepatobiliary programs were excluded from analysis because they had only one and two responses, respectively. In aggregate, programs received 60 (43–85.5) applications, offered 20 (15–26) interviews, interviewed 16 (12.5–21) applicants, and ranked 14 (10–18) applicants. Based on these medians, programs ranked approximately 23% of the applications they received. Advanced GI/MIS Bariatrics programs ranked the smallest proportion of applicants (17%), while Bariatrics programs ranked the largest (28%) (Table 4).

Table 4 Program designation rank statistics
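As a simple arithmetic check, the aggregate ranking rate reported above follows directly from the median counts:

\[
\frac{14 \text{ applicants ranked}}{60 \text{ applications received}} \approx 23\%
\]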

In-person interviews were initially planned by 93% of programs, which expected to interview 16 (13–20) applicants in this manner; however, due to pandemic restrictions, programs actually interviewed only 10 (3–14.4) candidates in person. Central location interviews (e.g., at national meetings) were planned by 15% of programs but were eliminated due to meeting cancelations and pandemic constraints. Prescreening with virtual interviews followed by another interview format was initially planned by 9% of programs. The majority of programs (53%) did not initially plan to conduct completely virtual interviews; however, 47% ultimately performed at least one virtual interview, and the vast majority of those programs (71%) conducted all of their interviews virtually. Of note, 25% of programs did not conduct any virtual interviews (some had completed their in-person interviews before pandemic restrictions took effect). Expected in-person interview resources were 2.5 (2–3.25) days with 4 (3–5) faculty spending 5 (5–7) hours per day; actual virtual interview resources were 2 (1–3) days with 3 (2.5–4) faculty members spending 4 (3–5) hours per day. When comparing person-hours (days × faculty × hours per day), there was a significant difference between expected in-person interviews at 48 (27.5–80) person-hours and virtual interviews at 24 (9–40) person-hours (p < 0.05) (Table 5).

Table 5 Program resource utilization changes
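To illustrate how the person-hours comparison above is derived, a minimal sketch follows (Python; the per-program schedules are hypothetical examples, not survey responses): each program's burden is the product of interview days, number of interviewers, and hours per day, and the group medians are then compared.

import statistics

# Hypothetical per-program interview schedules as (days, interviewers, hours per day);
# illustrative values only, not survey data.
expected_in_person = [(2.5, 4, 5), (3, 5, 7), (2, 3, 5)]
actual_virtual = [(2, 3, 4), (1, 2.5, 3), (3, 4, 5)]

def person_hours(schedules):
    # Per-program interviewer time: days x interviewers x hours per day.
    return [days * people * hours for days, people, hours in schedules]

print(statistics.median(person_hours(expected_in_person)))  # 50.0
print(statistics.median(person_hours(actual_virtual)))      # 24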

For the 2020 interview cycle, 84% of programs felt they had adequate information to form a rank list. For future cycles, 21% of programs would prefer only virtual interviews, 38% would prefer virtual pre-interviews followed by an in-person interview, and 42% would prefer in-person only interviews (Table 3).

Comments from programs indicated that they would be willing to convert to virtual interviews, but would rather leave the choice to the applicant or use it as a screening/pre-interview. They acknowledged that reforming the process would result in financial benefits on all sides, but also felt that an in-person assessment of the program is important for applicants. They were also concerned that the ease of virtual interviews may result in receiving large numbers of applications from less committed candidates.

Discussion

Historically, the interview season has imposed a high cost and time burden on applicants and programs. The lack of standardization and the absence of any attempts to reform this process were the impetus for the FC to begin evaluating its own interview process. As broad-based travel restrictions were put into place across the USA in March 2020, the FC quickly issued a statement advising all programs to conduct any remaining non-local fellowship interviews in a virtual format. This abrupt change meant that many programs had to adapt on the fly and structure a virtual interview format and template without prior models available [14]. Consequently, the changes necessitated by the COVID-19 pandemic offered a unique opportunity for the FC to evaluate traditional practices, as well as interventions that could be implemented to improve the process in the future.

Overall, our survey indicated that although this cycle’s virtual interview process had to be created quickly from scratch, it was highly successful. Both programs and applicants felt they could glean sufficient information from a virtual format while saving both time and money. Given that this was the first year the FC used this interview platform and the short time each individual program had to plan, it is likely that this process could lead to lasting reforms.

Our study showed that applicants estimated an average cost for interviews of approximately $4,500, with some applicants estimating costs as high as $17,000. Additionally, time away from work for planned in-person interviews was estimated at more than a week per resident. This is consistent with other studies, which showed an average cost of $4,000–$7,000 and over a week of missed work for residents applying to a variety of surgical fellowships [6, 8, 15]. The average PGY4 salary (PGY4 being the year of the FC match) is $64,255 [16]. Our results indicate that applicants would have spent roughly 7.4% of their annual income on in-person interviews. Not evaluated within our survey, unfortunately, was the potential discrimination inherent in this system: those with higher debt or less financial liquidity may not be able to afford to apply or interview as broadly, and even for those who can afford the steep cost, being away from home may make simultaneously caring for a family difficult or impossible. The actual cost of the predominantly virtual interview format was significantly lower, totaling less than $1,000, or roughly 1.5% of the average PGY4 annual income. Presumably, these costs reflected incidental expenses, such as clothing or computer equipment/microphones, application fees, and the small number of in-person interviews that were conducted. Additionally, the predominantly virtual format was associated with significantly less time missed from clinical duties and less time away from family.
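As a brief arithmetic check of these proportions, using the expected-cost median of $4,750 from the Results and the cited PGY4 salary:

\[
\frac{\$4{,}750}{\$64{,}255} \approx 7.4\%, \qquad \frac{\$1{,}000}{\$64{,}255} \approx 1.6\%
\]

so an actual outlay just under $1,000 corresponds to roughly 1.5% of annual salary.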

When asked about preferences for future interview formats, the most commonly chosen option among both applicants and programs was virtual pre-interviews followed by selective in-person interviews. Interestingly, applicants preferred virtual-only interviews over in-person-only interviews by a margin of 29% to 18%. The low favorability of centralized on-site interviews likely reflects the success of the virtual format, which had not previously been tested. While both applicants and programs felt they were able to gain sufficient information from virtual interviews, many commented that in-person interviews offer intangible observations, such as faculty interactions and interpersonal attributes, that were not evident in virtual interviews. Respondents also felt that interview formats and platforms should be standardized, making the process easier for all involved and limiting any bias that could arise from varying formats.

There were several limitations to our study. The survey was voluntary and therefore did not capture all applicants and programs. Although the questionnaire was distributed immediately following the match, it was still subject to recall bias. Most importantly, while the pandemic afforded an opportunity to evaluate a different interview process, it also forced applicants to estimate what the process would have been like in an ordinary year. No questions were included to elucidate the reasons for the wide variation in cost estimates or the financial concerns that could better describe the effects of these changes on particularly vulnerable applicants. Additionally, applicants and programs could only make an educated guess as to whether they had received adequate information, as their fellowship cycles had not yet started. We made efforts to report representative data; however, we received some data points that were obvious outliers. For instance, one applicant reported estimated costs for the interview season as high as $17,000. We therefore defined outliers as values beyond 1.5 × IQR and excluded these data points so as not to skew our results. Similarly, the median of 10 estimated days off from training for the 10 programs ranked may have underestimated actual time away, as applicants may not have included partial days off for travel the day before a scheduled interview.

Clearly, the pandemic resulted in a dramatic change in how FC programs conducted interviews in 2020. This study showed that virtual interviews can give valuable information to candidates and programs and may ultimately serve as a good screening tool while decreasing the time and money spent on interviews. This information may be valuable in planning future interview cycles so that resource utilization can be optimized and the benefits to applicants and programs maximized.