Where Do International Medical Graduates Matriculate for Internal Medicine Training? A National Longitudinal Study

Abstract

Introduction

In 2020, roughly 25% of applicants who matched into internal medicine (IM) residencies were international medical graduates (IMGs). We examine 12-year trends in the distribution of IMGs among IM training programs and explore differences in program perceptions of IMG recruitment.

Methods

Since 2007, the Association of Program Directors in Internal Medicine Annual Surveys have collected data about trainees by type of medical school graduate. Sixteen additional questions regarding perceptions of IMGs were included in the 2017 spring survey.

Results

The 2017 survey response rate was 63.3% (236/373) and ranged from 61.9 to 70.2% for the 2007–2019 Annual Surveys. During that 12-year period, 55–70% of community programs’ and 22–30% of university programs’ PGY1 positions were filled by IMGs. In 2017, 45% of community programs’ and 15% of university programs’ interview and ranking positions were allocated to IMGs. Departmental pressure (university 45.6% [95% CI 43.7–47.5]; community 28.2% [95% CI 26.6–29.7]; p = 0.007), institutional priority (university 64.0% [95% CI 62.1–66.0]; community 41% [95% CI 36.9–44.6]; p = 0.001), and reputational concerns (university 52.8% [95% CI 50.0–55.6]; community 38.5% [95% CI 36.0–40.9]; p = 0.045) were cited as factors influencing recruitment of IMGs.

Conclusion

There is a large, longstanding difference in the recruitment of IMGs to US community-based and university residencies, beginning during the interview and ranking process. Our study was limited to exploring program factors in residency recruitment and did not assess applicant preferences. Further research into disparities in IMG recruitment is needed, including exploration of the pressures, preferences, and potential biases associated with recruiting IMGs.

INTRODUCTION

International medical graduates (IMGs) comprise nearly one-quarter of residency program trainees and of the practicing physician workforce in the USA, with substantially higher representation in certain specialties.1 In the 2020 National Resident Matching Program (NRMP) Main Match, 24% of applicants who matched into internal medicine (IM) residency programs were non-US born IMGs, with the remaining positions filled by US allopathic-trained graduates (USMDs) and osteopathic medical graduates (DOs).2

Despite the large number of IMGs enrolled in US residency programs, little is known about their distribution among US teaching hospitals. Although the NRMP publishes detailed Match data annually, it does not report the distribution of IMGs by residency program type.2 A recent study by Jenkins et al. classified programs from a large national database as USMD-dominated vs DO/IMG-dominated and found university hospitals to be USMD-dominated rather than IMG-dominated.3 Limitations of that study included its cross-sectional design, drawn from a single academic year (2017–2018), and a preponderance of missing data.

Using 12 years of data from IM program directors (PDs), this study explores the hypothesis that community programs fill a higher percentage of their IM post-graduate year 1 (PGY1) positions with IMGs than university programs. Secondarily, this study describes IM PD perceptions about the recruitment of IMGs.

METHODS AND DATA ANALYSIS

Data Collection and Survey Instrument

The Association of Program Directors in Internal Medicine (APDIM), a charter organization of the Alliance for Academic Internal Medicine (AAIM), administers annual research surveys and occasional thematic surveys to IM PDs.4 Annual surveys are typically administered in the fall, between September and October, and since 2007 have included questions about the number of trainees in residency programs by medical school graduate type: US Medical Doctor (USMD), International Medical Graduate (IMG), and Doctor of Osteopathy (DO). Longitudinal data about trainees at IM programs were collected in the annual surveys from 2007 to 2019. In the fall of 2016, an AAIM subcommittee created de novo, and iteratively refined, a section of questions about IMG recruitment for a separate spring 2017 survey. The section was submitted to the 18-member APDIM Survey Committee for review, then blinded and scored based on merit and relevance to graduate medical education (GME). The committee pilot-tested and revised the survey instrument iteratively to improve content and response process validity. This process followed the basic protocol for developing the annual APDIM survey and has been described previously in detail.5 The final spring 2017 survey instrument included 16 questions with conditional (skip and display) logic (Appendix 1) that queried IM PDs about the number of IMG applicants, ranking preferences, and perceptions of IMGs. Results from questions 2A–D are not included in this manuscript. The study protocol was exempted from full review by the Mayo Clinic Institutional Review Board (study #08-007125), with MK serving as project staff.

The spring 2017 survey was fielded from March to May 2017 and distributed to 373 APDIM member programs, representing 89% of the 418 IM programs accredited by the Accreditation Council for Graduate Medical Education (ACGME) as of July 1, 2016. Non-respondents were sent email reminders until the survey closed in May 2017. Respondents who reported that their program did not participate in the 2017 NRMP Match did not complete the IMG recruitment section. Prior to de-identification, the survey dataset was appended with data from publicly available third-party sources to include program characteristics such as the number of ACGME-approved positions.6 Programs were assigned to a US Census Bureau region,7 and program type was obtained from the American Medical Association Fellowship and Residency Electronic Interactive Database Access System (AMA-FREIDA).8

Data Analysis

Data analysis was conducted in Stata SE 14.2.9 Descriptive statistics were used to summarize survey responses and residency program characteristics. Military programs and programs that did not participate in the 2017 Main Match were excluded from the analysis for this manuscript. To assess representativeness of the survey data, essential characteristics of residency programs were compared between the respondents and the survey population using the third-party data referenced above. Group-based testing for statistical associations was conducted using Pearson’s chi-square statistic (or the adjusted Wald [Pearson] test) or Fisher’s exact test for categorical variables, Welch’s t test or the Mann-Whitney test to compare mean differences for continuous variables, and an equality-of-medians test to compare group-based differences between medians. To quantify non-response bias and item non-response (questions skipped/not answered), we conducted a sensitivity analysis using characteristics (from the third-party data referenced above) that explained the most variance among all survey-eligible programs. The combined effect size of non-response bias and item non-response was a mean of 2.9% (99% CI 97.1–100.0; p = 0.423), suggesting that bias and item non-response had a minimal effect on our estimates. All non-university programs (community and community-based university-affiliated) are referred to as “community programs” in this manuscript.
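As a reading aid, the sketch below illustrates the kinds of group-based comparisons described above (chi-square for categorical variables, Welch's t test and Mann-Whitney for continuous variables). It is a minimal illustration only, assuming a hypothetical dataset with one row per responding program and hypothetical column names (program_type, offers_visas, pct_img_pgy1); it is not the authors' Stata analysis code.

```python
# Minimal sketch of the group-based comparisons described in the Methods,
# assuming a hypothetical CSV with one row per responding program.
# Column names are illustrative, not taken from the actual APDIM dataset.
import pandas as pd
from scipy import stats

df = pd.read_csv("apdim_2017_responses.csv")  # hypothetical file

# Categorical variable vs program type: Pearson's chi-square test
contingency = pd.crosstab(df["program_type"], df["offers_visas"])
chi2, p_chi2, dof, expected = stats.chi2_contingency(contingency)

# Continuous variable by program type: Welch's t test (unequal variances)
university = df.loc[df["program_type"] == "university", "pct_img_pgy1"].dropna()
community = df.loc[df["program_type"] == "community", "pct_img_pgy1"].dropna()
t_stat, p_welch = stats.ttest_ind(university, community, equal_var=False)

# Non-parametric alternative: Mann-Whitney U test
u_stat, p_mw = stats.mannwhitneyu(university, community, alternative="two-sided")

print(f"chi-square p={p_chi2:.3f}; Welch p={p_welch:.3f}; Mann-Whitney p={p_mw:.3f}")
```

For the equality-of-medians comparison, SciPy's stats.median_test (Mood's median test) could serve as an analogous substitute.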

RESULTS

Respondents

The response rate for the 2017 spring survey was 63.3% (236/373). Response rates for the 2007 through 2019 surveys ranged from 61.9 to 70.2% (mean of 66.1%).10,11,12,13,14 Characteristics of responding vs non-responding programs are reported in Table 1. There was slight over-representation of respondents from programs with a higher mean fill rate of USMDs (responding 47.1%; SD 36.0; population 43.1%; SD 35.6; p = 0.004) and slight under-representation of community-based, university-affiliated programs (responding 47.6%; population 52.3%; p = 0.016). Respondents from eight programs that did not participate in the 2017 Main Match did not complete this survey section. We note that any over-representation among responding programs for the characteristics above likely is explained by their ACGME original accreditation year. That is, comparatively newer programs (accredited within the past ~ 5 years) tend to be statistically associated with non-university programs and with lower fill rates by USMDs, and newer programs might be less inclined to provide survey data because they have not been training residents long enough to provide representative data.

Table 1 Core Characteristics of Internal Medicine Residency Responding and Non-responding Programs: 2017 APDIM Spring Survey of US Internal Medicine Residency Program Directors

Table 2 summarizes the characteristics of responding university and community programs, excluding military programs and programs that did not participate in the 2017 NRMP Match. University and community programs differed in size, with university programs being larger in both approved positions (university median 67; SD 15.8; community median 50; SD 13.2; p < 0.001) and filled positions (university median 59; SD 14.5; community median 46; SD 11.9; p < 0.001), but they were similar in all other characteristics.

Table 2 Program Characteristics: 2017 APDIM Spring Survey of US Internal Medicine Residency Program Directors

Twelve-Year Trends

APDIM survey data collected from 2007 to 2019 demonstrate that IMGs comprised larger percentages of categorical PGY1s in community than in university programs, with persistent differences of 30–40 percentage points over that period (Fig. 1). The percentage of PGY1 positions filled by IMGs in community programs decreased from 70 to 55% during this time, whereas the percentage in university programs decreased from 30 to 22%.

Figure 1

12-year trends in IMG composition of US internal medicine residency programs, academic year 2008 to academic year 2019. Y axis: percentage of internal medicine positions filled by IMGs. X axis: academic year. Green arrow indicates when the "All-In" policy was implemented for the NRMP Main Residency Match.

Table 3 reflects data collected in the 2017 spring survey only and highlights significant differences between university and community programs along the recruitment process. University programs receive a lower percentage of their total applications from IMGs (university mean 54.7%; SD 26.4; community mean 73%; SD 25.4; p < 0.001), offer a lower percentage of their total interview spots to IMGs (university mean 16.6%; SD 21.8; community mean 45.5%; SD 32.6; p < 0.001), and offer a lower percentage of their total rank spots to IMGs (university mean 15.7%; SD 21; community mean 46.1%; SD 33.5; p < 0.001). Results for Table 3 did not change when calculations were adjusted for programs that offered visas.

Table 3 Applications, Interviews, and Ranking of IMG Candidates by Program Type: 2017 APDIM Spring Survey of US Internal Medicine Residency Program Directors

Table 4 highlights differences in self-reported pressures and perceptions regarding the recruitment of IMGs. A higher percentage of university programs than community programs reported offering visas (95.7% vs 74.3%, p < 0.001). A higher percentage of university programs reported institutional priority to recruit US medical graduates to a great extent (64% vs 40.7%, p = 0.001) and departmental pressure to favor US medical graduates (45.6% vs 28.2%, p = 0.007). A higher percentage of university programs than community programs reported departmental concern about their program’s reputation when recruiting IMGs (52.8% vs 38.5%, p = 0.045). We used Cronbach’s alpha (α) to confirm the internal consistency of the items in Table 4 and how well they represented program director perceptions about recruiting IMGs (α = 0.7418 [test scale]; average inter-item correlation = 0.2532; scale reliability coefficient = 0.7532); an inter-item correlation between 0.20 and 0.40 was deemed acceptable.
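For readers less familiar with these reliability statistics, the sketch below shows one way to compute Cronbach's alpha and the average inter-item correlation for a block of perception items. It is a hypothetical illustration (the file name and item column prefix are assumed), not the reliability analysis run for this study.

```python
# Sketch of Cronbach's alpha and average inter-item correlation for a block of
# survey perception items (rows = respondents, columns = items).
# The data file and column prefix are hypothetical.
import numpy as np
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Unstandardized Cronbach's alpha: (k/(k-1)) * (1 - sum(item var)/total var)."""
    complete = items.dropna()
    k = complete.shape[1]
    sum_item_var = complete.var(axis=0, ddof=1).sum()
    total_var = complete.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - sum_item_var / total_var)

def average_interitem_correlation(items: pd.DataFrame) -> float:
    """Mean of the off-diagonal entries of the item-item correlation matrix."""
    corr = items.dropna().corr().to_numpy()
    off_diagonal = corr[~np.eye(corr.shape[0], dtype=bool)]
    return float(off_diagonal.mean())

# Hypothetical usage:
# survey = pd.read_csv("apdim_2017_responses.csv")
# perception_items = survey.filter(like="img_perception_")
# print(cronbach_alpha(perception_items), average_interitem_correlation(perception_items))
```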

Table 4 Perceptions About Recruiting IMGs by Program Directors of Internal Medicine Residencies: 2017 APDIM Spring Survey of US Internal Medicine Residency Program Directors

DISCUSSION

To our knowledge, this is the first longitudinal study examining differences by program type in the percentage of PGY1 training positions filled by IMGs. Our results provide over 10 years of national data for IM programs, which constitute the largest specialty of trainees in the country.15 Our study confirms our hypothesis and shows a marked difference between program types in the percentage of categorical PGY1 training positions filled by IMGs. In absolute terms, university programs filled roughly 30 to 40 percentage points fewer of their intern positions with IMGs than community programs did in each of the past 12 years.

These gaps can be traced back to the interview and ranking stages of recruitment. Although the percentage of all applications that a program receives from IMGs is lower for university programs than for community programs by a ratio of roughly 3:4 (55% vs 73%, respectively), when one examines the percentage of interviews that programs conduct with IMGs, this ratio drops to roughly 1:3 (17% vs 46%, respectively). Once interviews are conducted and rank lists are made, university programs devote 16% of their rank list positions to IMGs, whereas community programs devote 46%, again a ratio of roughly 1:3. These results did not differ when restricting the analysis to programs that offer visas, which, interestingly, a higher percentage of university programs did (p < 0.001).
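For concreteness, and as a reading aid only, these approximate ratios follow directly from the Table 3 means:

$$\frac{54.7}{73.0} \approx 0.75 \approx \frac{3}{4}, \qquad \frac{16.6}{45.5} \approx 0.36 \approx \frac{1}{3}, \qquad \frac{15.7}{46.1} \approx 0.34 \approx \frac{1}{3}.$$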

Our study raises two important questions: (1) Why are university and community programs so far apart in how they interview, rank, and match IMGs? (2) What implications can be drawn about the differential recruitment of IMGs by program type? Our study sheds some light on the first question, and we offer some perspectives on the second.

The process of recruitment into a training program is complex, driven by both applicant and program factors. We did not study factors inherent to applicant preferences, such as applicants’ decisions to apply, accept interview invitations, and ultimately rank programs. Our study focused on several program factors that may affect IMG recruitment. We found that a higher percentage of university programs than community programs reported pressures related to IMG recruitment, and the differences between program types were statistically significant. Compared with community programs, university programs more often reported that their institution prioritized the recruitment of US graduates to a great extent (64% vs 41%, p = 0.001) and that they felt pressure from their department leadership to preferentially recruit US graduates (46% vs 28%, p = 0.007). More university than community program directors reported that departmental leadership’s perception of the program’s reputation was a concern when recruiting IMGs (53% vs 39%, p = 0.045). Such pressures likely contribute to the roughly 30-percentage-point difference in the interviewing and ranking of IMGs between university and community programs.

There may be logical and acceptable reasons why university programs would favor the recruitment of USMDs. For example, universities have medical schools and would naturally need to support their school’s students, many of whom are graduating with significant financial debt. Universities may differ from community programs in institutional mission, research endeavors, location, and faculty practice issues that could affect how they prioritize residency applicants. However, reputational pressure not to recruit IMGs was reported by over 50% of university PDs, raising questions of bias. Programs could review their recruitment practices and reflect on whether this may be occurring.

Community programs that become reliant on IMGs to fill their spots may get comfortable selectively interviewing and ranking IMGs year after year, thereby perpetuating the system that sorts IMGs into them. The current COVID-19 pandemic may further drive this phenomenon by allowing IMGs located overseas to interview virtually rather than having to fly to the USA for an in-person interview.

Few prior studies have examined residency program attitudes towards IMGs, and none that we are aware of have compared university with community programs. Two previous reports suggested that, all other things being equal, the relative response to requests for interviews favored USMDs over IMGs by factors of 2.0 (p < 0.01) and 1.5 (p < 0.001) in family practice and psychiatry residencies, respectively.16,17 A third study, of surgical residencies, found that 87% of program directors agreed or strongly agreed with the statement, “In reality, all things being equal, our program would rather offer positions to USMGs than to IMGs.” Almost one-fifth (18%) felt external pressure not to rank a better qualified IMG over a USMG, and 70% felt there was discrimination against IMGs.18 Other authors have offered perspectives on the notion that “IMG-friendly” programs are perceived as less competitive or desirable.19,20,21 Our data confirm high rates of prioritization, pressure, and concerns about reputation regardless of program type. The preferential sorting of IMGs into community programs has been experienced anecdotally for years and led to the common colloquialism “IMG-friendly” program. What might the impact of such sorting be for patients, trainees, and programs? Our study raises these questions and calls for further study in this area.

We also find that over the past 12 years, the percentage of PGY1 positions allocated to IMGs declined from 70 to 56% in community programs and from 30 to 21% in university programs. Allopathic and osteopathic applications and Match fill rates have increased over the years for all specialties combined and may explain the downward trend depicted in Figure 1. The institution of the “All-In” rule in March of 2013 likely explains the downward inflection point in Figure 1. Prior to “All-In,” programs could recruit IMGs up front, before the Match, and then potentially fill additional positions with IMGs in the Match, a common practice especially among community programs.22 When “All-In” took effect in 2013, it stood to reason that IMG recruitment would be adversely affected, and Figure 1 is consistent with this notion.

Strengths of our study include (1) its 12-year data collection period; (2) collection of nationally representative data among IM residency programs, which represent the largest training specialty in the USA; (3) a respectable 63% response rate with minimal non-response bias; and (4) data that include self-reported perceptions and pressures with respect to IMG recruitment, an understudied area that has not previously been examined by program type.

Limitations include the inability to study applicant factors or to comprehensively measure all program factors affecting recruitment, the restriction to IM residencies, the assessment of perceptions about IMG recruitment in a single academic year, and possible self-report (cognitive) bias, especially for survey items that explore PD opinions.

In conclusion, our study confirms a long-held belief among residency program directors and applicants alike: community programs enroll a higher percentage of IMGs in their PGY1 class than university programs, a gap that arises during the interview and ranking steps of the recruitment process. We find that this pattern has persisted for at least 12 years, with a large gap remaining despite the NRMP “All-In” decision in 2013. PDs report program pressures and perceptions about the recruitment of IMGs as contributory factors.

References

1. Ahmed AA, Hwang WT, Thomas CR Jr, Deville C Jr. International Medical Graduates in the US Physician Workforce and Graduate Medical Education: Current and Historical Trends. J Grad Med Educ. 2018;10(2):214-218. https://doi.org/10.4300/JGME-D-17-00580.1.

2. National Resident Matching Program. Results and Data: 2020 Main Residency Match. Washington, DC: National Resident Matching Program; 2020. http://www.nrmp.org/main-residency-match-data/. Accessed September 28, 2020.

3. Jenkins TM, Franklyn G, Klugman J, Reddy ST. Separate but Equal? The Sorting of USMDs and Non-USMDs in Internal Medicine Residency Programs. J Gen Intern Med. 2019;35(5):1458-1464.

4. Chacko KM, Reddy S, Kisielewski M, Call S, Willett LL, Chaudhry S. Postinterview Communications: Two Surveys of Internal Medicine Residency Program Directors Before and After Guideline Implementation. Acad Med. 2018;93(9):1367-1373.

5. Angus SV, Williams CM, Kwan B, Vu TR, Harris L, Muntz M, Pereira A. Drivers of Application Inflation: A National Survey of Internal Medicine Residents. Am J Med. 2018;131(4):447-452. https://doi.org/10.1016/j.amjmed.2018.01.002.

6. ACGME Accreditation Data System (ADS). https://apps.acgme.org/ads/Public/Programs/Search. Published 2017. Accessed December 2017.

7. US Census Bureau. Census Data. www.census.gov/geo/www/us_regdiv.pdf. Updated September 1, 2012. Accessed December 2017.

8. American Medical Association. FREIDA Residency Program Database. www.ama-assn.org/ama/pub/education-careers/graduate-medical-education/freida-online.shtml. Published 2019. Accessed March 21, 2019.

9. StataCorp. Stata Statistical Software: Release 14. College Station, TX: StataCorp LP; 2015.

10. Dupras DM, Edson RS, Halvorsen AJ, Hopkins RH Jr, McDonald FS. “Problem Residents”: Prevalence, Problems and Remediation in the Era of Core Competencies. Am J Med. 2012;125(4):421-425.

11. Brummond A, Sefcik S, Halvorsen AJ, et al. Resident recruitment costs: a national survey of internal medicine program directors. Am J Med. 2013;126(7):646-653.

12. Angus S, Vu TR, Halvorsen AJ, et al. What skills should new internal medicine interns have in July? A national survey of internal medicine residency program directors. Acad Med. 2014;89(3):432-435.

13. Angus S, Adams M, Willett LL, et al. The new internal medicine fellowship match timeline: a change in the right direction. Am J Med. 2014;127(11):1132-1136.

14. Finn KM, Zaas AK, McDonald FS, Melfe M, Kisielewski M, Willett LL. Misinterpretation of the American Board of Internal Medicine Leave Policies for Resident Physicians Around Parental Leave. Ann Intern Med. 2020;172(8):570-572.

15. Accreditation Council for Graduate Medical Education. Program Search. https://apps.acgme.org/ads/Public/Programs/Search. Published 2017. Accessed December 2017.

16. Nasir LS. Evidence of discrimination against international medical graduates applying to family practice residency programs. Fam Med. 1994;26(10):625-629.

17. Balon R, Mufti R, Williams M, Riba M. Possible discrimination in recruitment of psychiatry residents? Am J Psychiatry. 1997;154(11):1608-1609. https://doi.org/10.1176/ajp.154.11.1608.

18. Moore RA, Rhodenbaugh EJ. The unkindest cut of all: are international medical school graduates subjected to discrimination by general surgery residency programs? Curr Surg. 2002;59(2):228-236. https://doi.org/10.1016/s0149-7944(01)00644-4.

19. Manthous CA. Confronting the elephant in the room: can we transcend medical graduate stereotypes? J Grad Med Educ. 2012;4(3):290-292.

20. Desbiens NA, Vidaillet HJ Jr. Discrimination against international medical graduates in the United States residency program selection process. BMC Med Educ. 2010;10:5. https://doi.org/10.1186/1472-6920-10-5.

21. Woods SE, Harju A, Rao S, Koo J, Kini D. Perceived Biases and Prejudices Experienced by International Medical Graduates in the US Post-Graduate Medical Education System. Med Educ Online. 2006;11(1):4595. https://doi.org/10.3402/meo.v11i.4595.

22. Whitcomb ME, Miller RS. Comparison of IMG-dependent and non-IMG-dependent residencies in the national resident matching program. JAMA. 1996;276(9):700-703.


Acknowledgments

The authors wish to acknowledge the input of Dr. Lia Logia in her critical review of this manuscript. This work was previously presented at the 2018 Alliance for Academic Internal Medicine Academic Internal Medicine Week.

Author information


Corresponding author

Correspondence to Shalini T. Reddy MD MHPE FACP.

Ethics declarations

Conflict of Interest

None of the authors report any relevant conflicts of interest.


Supplementary Information

ESM 1

(PDF 81 kb)

ESM 2

(DOCX 16 kb)


Cite this article

Reddy, S.T., Kisielewski, M., Willett, L.L. et al. Where Do International Medical Graduates Matriculate for Internal Medicine Training? A National Longitudinal Study. J GEN INTERN MED (2021). https://doi.org/10.1007/s11606-020-06519-1
