The Ultrasound Journal, 11:3

Correlation of OSCE performance and point-of-care ultrasound scan numbers among a cohort of emergency medicine residents

  • Youyou Duanmu
  • Patricia C. Henwood
  • Sukhjit S. Takhar
  • Wilma Chan
  • Joshua S. Rempell
  • Andrew S. Liteplo
  • Viktoria Koskenoja
  • Vicki E. Noble
  • Heidi H. Kimberly
Open Access
Original article



Point-of-care ultrasound (POCUS) is an important clinical tool for a growing number of medical specialties. The current American College of Emergency Physicians (ACEP) Ultrasound Guidelines recommend that trainees perform 150–300 ultrasound scans as part of POCUS training. We sought to assess the relationship between ultrasound scan numbers and performance on an ultrasound-focused observed structured clinical examination (OSCE).


This was a cross-sectional cohort study in which the number of ultrasound scans residents had previously performed was obtained from a prospective database and compared with their total score on an ultrasound OSCE. Ultrasound fellowship-trained emergency physicians administered a previously published OSCE that consisted of standardized questions testing image acquisition and interpretation, ultrasound machine mechanics, patient positioning, and troubleshooting. Residents were observed while performing core applications including aorta, biliary, cardiac, deep vein thrombosis, Focused Assessment with Sonography in Trauma (FAST), pelvic, and thoracic ultrasound imaging.


Twenty-nine postgraduate year (PGY)-3 and PGY-4 emergency medicine (EM) residents participated in the OSCE. The median OSCE score was 354 [interquartile range (IQR) 343–361] out of a total possible score of 370. Trainees had previously performed a median of 341 [IQR 289–409] total scans. Residents with more than 300 ultrasound scans had a median OSCE score of 355 [IQR 351–360], which was slightly higher than the median OSCE score of 342 [IQR 326–361] in the group with 300 or fewer total scans (p = 0.04). Overall, a LOWESS curve demonstrated a positive association between scan numbers and OSCE scores, with graphical review of the data suggesting a plateau effect.


The results of this small single residency program study suggest a pattern of improvement in OSCE performance as scan numbers increased, with the appearance of a plateau effect around 300 scans. Further investigation of this correlation in diverse practice environments and within individual ultrasound modalities will be necessary to create generalizable recommendations for scan requirements as part of overall POCUS proficiency assessment.


Keywords: Education, Point of care, Ultrasound, Competency



Abbreviations

POCUS: point-of-care ultrasound
EM: emergency medicine
OSCE: observed structured clinical exam
FAST: Focused Assessment with Sonography in Trauma
PGY: postgraduate year
IQR: interquartile range
ACEP: American College of Emergency Physicians
ACGME: Accreditation Council for Graduate Medical Education
LOWESS: locally weighted scatter plot smoothing
ICC: intraclass correlation
CI: confidence interval


Point-of-care ultrasound (POCUS) has become an important clinical tool across a variety of medical specialties [1, 2, 3, 4, 5]. Proficiency in POCUS is especially vital to the practice of emergency medicine (EM) [6, 7, 8, 9, 10, 11]. Since 2012, the Accreditation Council for Graduate Medical Education (ACGME) has designated the use of ultrasound for diagnosing emergent medical conditions, critical care and trauma resuscitation, and procedural guidance as 1 of 23 milestone competencies for EM residents [8]. The 2016 American College of Emergency Physicians (ACEP) policy statement on emergency ultrasound advises that a trainee should perform 25–50 ultrasounds in each of the core applications and a total of 150–300 scans as part of POCUS training [9].

It has been suggested that the completion of a predetermined number of ultrasounds correlates with proficiency in clinical practice; however, there remains significant variability in the number of scans required by different training programs [12, 13]. There are data showing that residents who performed greater than 150 ultrasounds scored significantly higher on a written ultrasound examination [14]. However, a more recent consensus statement recommended that 150 scans may not be sufficient as a competency benchmark, but should be regarded as a minimum standard beyond which other measures of competency should be assessed [15]. A 2017 survey of 539 EM residents found that residents believed an average of 325 scans was required for proficiency [16].

Medical training programs have adapted the observed structured clinical examination (OSCE) to evaluate competency in a variety of applications [17, 18, 19, 20, 21]. The OSCE is an especially useful tool for medical skills that involve a combination of technical and knowledge-based aptitude. Its use is recommended by the ACEP policy statement to assess for ultrasound competency, but to our knowledge there are no prior data comparing OSCE performance to overall scan numbers [9].

Ultrasound is a multi-modal skill set requiring complex methods of competency assessment. In this study, we sought to assess whether there was a relationship between the number of ultrasounds previously performed by senior EM residents and their performance on a standardized ultrasound OSCE.


This was a cross-sectional cohort study in which the number of ultrasound scans residents had previously performed was obtained from a prospective database and compared with total scores on an ultrasound OSCE. A modified version of a previously published ultrasound OSCE was given to all 29 postgraduate year (PGY)-3 and PGY-4 residents at a single academic emergency medicine residency program that spans two institutions [17, 22, 23]. The OSCE took place in a simulation center using standardized patients as models. It consisted of standardized questions testing image acquisition and interpretation with points for technique, image quality, and correct interpretation of anatomy. Twelve ultrasound fellowship-trained emergency physicians served as evaluators. Residents were observed performing aorta, biliary, cardiac, deep vein thrombosis, Focused Assessment with Sonography in Trauma (FAST), pelvic, and thoracic ultrasounds, which are included in the ACEP core emergency ultrasound applications. The total possible score for the OSCE was 370. Based on the total OSCE score and overall evaluator impression, residents were given an OSCE general competency score from 1 to 5, and those with a score of 2 or below were provided individualized remediation. To assess inter-rater reliability, 11 residents had two evaluators independently grade their OSCE performance. For those 11 subjects, the mean of their two OSCE scores was used in the general analysis.

The number of ultrasound scans the residents had performed was obtained from a prospective database and included ultrasounds performed clinically and during the 1-week PGY-1 and 2-week PGY-2 ultrasound rotations. Scans were evaluated in total and by application type. None of the residents had participated in an outside ultrasound rotation. The scans logged in the database had all been reviewed for quality assurance by ultrasound-trained faculty.

Data were analyzed using Stata 14.2 (Stata Corporation, College Station, TX). A locally weighted scatter plot smoothing (LOWESS) method was used to visually estimate the trend between OSCE score and previously performed ultrasound scan numbers. A Wilcoxon rank-sum test was used to assess for differences in OSCE score by PGY year. A two-sample t test was used to assess for differences in OSCE score at a scan number cutoff of 300, because this appeared to be the region of plateau of the LOWESS curve as well as the upper range of recommended scan numbers based on the ACEP guidelines. Intraclass correlation (ICC) with one-way random effects was used to assess inter-rater reliability for those OSCEs that had two evaluators. The study was deemed exempt from review by the Partners Healthcare Institutional Review Board.
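The analyses above were run in Stata, but the same tests can be sketched in Python. The following is a minimal illustration, not the study code, and all data values are invented for demonstration: it computes a one-way random-effects ICC(1,1) from the standard ANOVA mean squares, a two-sample t test across a hypothetical 300-scan cutoff, and a Wilcoxon rank-sum comparison of two hypothetical PGY groups.

```python
import numpy as np
from scipy import stats

def icc_oneway(ratings):
    """ICC(1,1): one-way random-effects intraclass correlation.

    ratings: (n_subjects, k_raters) array of scores.
    """
    n, k = ratings.shape
    subj_means = ratings.mean(axis=1)
    grand_mean = ratings.mean()
    # Between-subject and within-subject mean squares (one-way ANOVA)
    ms_between = k * np.sum((subj_means - grand_mean) ** 2) / (n - 1)
    ms_within = np.sum((ratings - subj_means[:, None]) ** 2) / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

rng = np.random.default_rng(0)

# Hypothetical double-scored OSCEs: 11 residents, 2 evaluators each
true_score = rng.normal(350, 10, size=11)
ratings = true_score[:, None] + rng.normal(0, 6, size=(11, 2))
icc = icc_oneway(ratings)

# Hypothetical OSCE scores split at the 300-scan cutoff (two-sample t test)
over_300 = rng.normal(355, 8, size=18)
under_300 = rng.normal(342, 12, size=11)
t_stat, t_p = stats.ttest_ind(over_300, under_300)

# Hypothetical PGY-3 vs PGY-4 score comparison (Wilcoxon rank-sum)
pgy3 = rng.normal(354, 9, size=15)
pgy4 = rng.normal(352, 10, size=14)
w_stat, w_p = stats.ranksums(pgy3, pgy4)

print(f"ICC(1,1) = {icc:.2f}, t-test p = {t_p:.3f}, rank-sum p = {w_p:.3f}")
```

For the LOWESS trend estimate, `statsmodels.nonparametric.smoothers_lowess.lowess` would produce the smoothed curve over the scan-number/score scatter.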


All 29 senior residents in our program participated in the OSCE, including 15 PGY-3 and 14 PGY-4 residents. The median OSCE score for all participants was 354 [interquartile range (IQR) 343–361]. Residents had performed a median of 341 [IQR 289–409] total scans, including 105 [IQR 87–120] cardiac, 79 [IQR 65–96] FAST, 48 [IQR 36–64] thoracic, 15 [IQR 12–10] pelvic, 16 [IQR 10–21] biliary, 15 [IQR 13–19] aorta, and 3 [IQR 1–5] deep vein thrombosis. The LOWESS smoother curve suggested a pattern of increase and then a plateau in OSCE total score as total scan numbers increased (Fig. 1). Residents who had previously performed more than 300 scans had a slightly higher median OSCE score (355 [IQR 350–360]) than residents who had performed fewer than or equal to 300 scans (342 [IQR 326–361]), p = 0.04. The median OSCE score for PGY-3 residents was 354 [IQR 346–364], while the median score for PGY-4 residents was 352 [IQR 335–359], which was not significantly different (p = 0.43). There was a moderate ICC of 0.61 [CI 0.08–0.88], p = 0.01, for the 11 residents who had two evaluators.
Fig. 1

Scatter plot of observed structured clinical exam (OSCE) total score vs. total ultrasound scan numbers fit with locally weighted scatter plot smoothing (LOWESS) curve


Ultrasound is a multi-faceted skill that involves acquisition of images, interpretation of scans, and knowledge of how to incorporate findings into clinical practice. The evaluation of ultrasound competency also requires a multi-modal approach. OSCEs have been found to be a useful tool for measuring ultrasound proficiency, as both scanning technique and interpretation can be assessed in real time [17, 18, 19, 20].

The results of this small single residency program study demonstrate that a higher number of scans performed was associated with better performance on the OSCE, with graphical review of the data suggesting a plateau around 300 scans. Defining a minimum number of scans required to attain competency can be challenging, and the current ACEP recommendation of 150–300 scans is a wide target. Our data suggest that within the range of ultrasounds recommended by ACEP, there may be a plateau effect at the higher end of this requirement, but further investigation should determine whether this plateau is consistent across different cohorts or is instead institutionally dependent and varies with training methodologies. We found that in our cohort, a cutoff of 300 overall scans showed a small division in OSCE performance between residents, although it is unclear if this is clinically significant.

A recent study by Blehar et al. suggested that ultrasound image accuracy improved with the number of scans performed up to a “plateau” point where additional scan numbers did not greatly increase accuracy [24]. Our LOWESS curve of OSCE score and prior scan numbers also suggests an asymptotic pattern. This is similar to the concept outlined in the Pusic et al. paper on learning curves, in which medical training is correlated with rising skill level until competence is gained, after which increasing proficiency becomes more subtle despite additional practice [25]. Due to the varied complexity of ultrasound modalities, Blehar et al. found that proficiency for different ultrasound applications plateaued at different scan numbers [24]. We assessed scan proficiency as an aggregate, but were unable to analyze separate proficiency for each modality due to limited sample size.
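The plateau pattern described above can be made concrete with a saturating-exponential learning curve, a common functional form for such data (this is an illustrative sketch on synthetic data, not the model used by Blehar et al. or in our analysis; all parameter values are invented):

```python
import numpy as np
from scipy.optimize import curve_fit

def learning_curve(x, ymax, gain, rate):
    # Saturating exponential: performance rises with scan count, then plateaus
    return ymax - gain * np.exp(-x / rate)

rng = np.random.default_rng(1)

# Synthetic cohort: scan counts and noisy OSCE-like scores around a true curve
scans = rng.uniform(100, 600, size=40)
scores = learning_curve(scans, 360, 60, 120) + rng.normal(0, 4, size=40)

popt, _ = curve_fit(learning_curve, scans, scores, p0=[350, 50, 100])
ymax, gain, rate = popt

# Scan count at which 95% of the remaining gap to the asymptote is closed,
# i.e. solve gain * exp(-x / rate) = 0.05 * gain
plateau = rate * np.log(20)
print(f"asymptote ~ {ymax:.0f}, plateau ~ {plateau:.0f} scans")
```

Fitting such a curve per modality, given enough data, would let the plateau point be estimated separately for each application rather than in aggregate.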

It would also be valuable to identify a scan number requirement for credentialing physicians who are currently practicing. However, it is unclear if data obtained from emergency medicine residency training programs would be applicable to physicians who did not have structured ultrasound teaching during residency, or whether the plateau effect would differ based on the practice setting. Further investigation to examine the performance of competency assessment tools in a multi-center observational study across a variety of settings is indicated to better characterize the relationship between ultrasound numbers and observed performance in aggregate and within modalities.

This study has several limitations. It was performed at a single residency program with a small sample size and may not be generalizable outside of our institution or beyond emergency medicine. Although OSCE performance visually appeared to plateau around 300 scans, the clinical significance of this number is unclear. Only one resident had fewer than 150 scans, and thus we did not evaluate a cutoff value of 150 scans. Residents had performed a larger percentage of cardiac and FAST examinations than the other modalities. The pattern of OSCE performance and plateau as an aggregate may be affected by this uneven scan number distribution, which included several modalities with median scan numbers below the ACEP-recommended benchmark for individual ultrasound applications [9]. The absolute number of ultrasounds performed may not be a reliable measure of a resident's comprehensive ultrasound education, as it does not capture instruction during the ultrasound elective, informal clinical teaching, or exposure to a variety of pathology. Furthermore, residents' ultrasounds were completed under the supervision of ultrasound-certified attending physicians and were often accompanied by real-time teaching and quality assurance review. The database may also underrepresent actual scanning activity, as scans performed outside of the emergency department or not properly documented would not have been captured.

The OSCE was created by the ultrasound faculty at our institution and has not been independently validated, although it has gone through an iterative process of improvement and has been used in multiple different clinical contexts over the past 4 years [17, 22, 23]. There was only moderate reliability between faculty who administered the OSCE to the same resident, which indicates that the OSCE may require additional standardization to be a reliable measure of ultrasound proficiency. The small sample size precluded sub-group analysis of the correlation between OSCE score and prior scan numbers for the individual ultrasound applications.


The results of this small single residency program study suggest a pattern of improvement in OSCE performance as scan numbers increased, with the appearance of a plateau effect around 300 scans. While these are interesting findings for the assessment of ultrasound competency, additional investigation of this correlation in diverse practice environments and within individual ultrasound modalities will be necessary to create generalizable recommendations for scan requirements needed as part of overall POCUS competency assessment.


Authors’ contributions

YD contributed to the data analysis and was the major contributor in writing the manuscript. PCH was involved in study design, data collection and in writing the manuscript. SST assisted with statistics and manuscript revisions. WC was involved in study design, data collection and review of the manuscript. JSR, ASL, VK and VEN were involved in data collection and manuscript revisions. HHK was responsible for project design, data collection and was a major contributor in writing the manuscript. All authors read and approved the final manuscript.


Not applicable.

Competing interests

The authors declare that they have no competing interests.

Availability of data and materials

The datasets generated and analyzed during the current study are not publicly available to ensure confidentiality of resident performance metrics, but are available from the corresponding author on reasonable request.

Consent for publication

Not applicable.

Ethics approval and consent to participate

This study was deemed exempt from review by the Partners Healthcare Institutional Review Board.


There was no source of funding for this study.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


References

  1. Schnobrich D, Gladding S, Olson A et al (2013) Point-of-care ultrasound in internal medicine: a national survey of educational leadership. J Grad Med Educ 5(3):498–502
  2. Meineri M, Bryson GL, Arellano R et al (2018) Core point-of-care ultrasound curriculum: what does every anesthesiologist need to know? Can J Anesth 65(4):417–426
  3. Bornemann P (2017) Assessment of a novel point-of-care ultrasound curriculum's effect on competency measures in family medicine graduate medical education. J Ultrasound Med 36(6):1205–1211
  4. Beal EW, Sigmond BR, Sage-Silski L et al (2017) Point-of-care ultrasound in general surgery residency training: a proposal for milestones in graduate medical education ultrasound. J Ultrasound Med 36(12):2577–2584
  5. Strony R, Marin J, Bailitz J et al (2018) Systemwide clinical ultrasound program development: an expert consensus model. West J Emerg Med 19(4):649–653
  6. Reardon R, Heegaard B, Plummer D et al (2006) Ultrasound is a necessary skill for emergency physicians. Acad Emerg Med 13(3):334–336
  7. Moore C, Gregg S, Lambert M (2004) Performance, training, quality assurance, and reimbursement of emergency physician-performed ultrasonography at academic medical centers. J Ultrasound Med 23(4):459–466
  8. Accreditation Council for Graduate Medical Education, American Board of Emergency Medicine. The Emergency Medicine Milestone Project. Accessed 12 June 2017
  9. American College of Emergency Physicians (2017) Ultrasound guidelines: emergency, point of care, and clinical ultrasound guidelines in medicine. Ann Emerg Med 69(5):e27–e54
  10. Akhtar S, Theodoro D, Gaspari R et al (2009) Resident training in emergency ultrasound: consensus recommendations from the 2008 Council of Emergency Medicine Residency Directors conference. Acad Emerg Med 16:S32–S36
  11. Lewiss R, Pearl M, Nomura J et al (2013) CORD-AEUS: consensus document for the emergency ultrasound milestone project. Acad Emerg Med 20(7):740–745
  12. Costantino T, Burton J, Tayal V (2015) Ultrasound competency and practice: what's in a number? Acad Emerg Med 22(5):597–599
  13. Ahern M, Mallin MP, Weitzel S et al (2010) Variability in ultrasound education among emergency medicine residencies. West J Emerg Med 11(4):314–318
  14. Costantino TG, Satz WA, Stahmer SA et al (2003) Predictors of success in emergency medicine ultrasound education. Acad Emerg Med 10(2):180–183
  15. Nelson M, Abdi A, Adhikari S et al (2016) Goal-directed focused ultrasound milestones revised: a multiorganizational consensus. Acad Emerg Med 23(11):1274–1279
  16. Stolz LA, Stolz U, Fields JM et al (2017) Emergency medicine resident assessment of the emergency ultrasound milestones and current training recommendations. Acad Emerg Med 24(3):353–361
  17. Henwood P, Mackenzie D, Rempell J et al (2016) Intensive point-of-care ultrasound training with long-term follow-up in a cohort of Rwandan physicians. Trop Med Int Health 21(12):1531–1538
  18. Hofer M, Kamper L, Sadlo M et al (2011) Evaluation of an OSCE assessment tool for abdominal ultrasound courses. Ultraschall Med 32(02):184–190
  19. Sisley A, Johnson S, Erickson W et al (1999) Use of an objective structured clinical examination (OSCE) for the assessment of physician performance in the ultrasound evaluation of trauma. J Trauma 47(4):627
  20. Sornsuphalimchareon S, Puwichcharoenchue P, Jitrapornintrarak J (2018) Factors affecting objective structured clinical examination scores of final year medical students in the evaluation of focused assessment with sonography for trauma. J Contemp Med Educ 7(2):42
  21. Amini R, Adhikari S, Fiorello A (2014) Ultrasound competency assessment in emergency medicine residency programs. Acad Emerg Med 21(7):799–801
  22. Henwood P, Mackenzie D, Rempell J et al (2014) A practical guide to self-sustaining point-of-care ultrasound education programs in resource-limited settings. Ann Emerg Med 64(3):277–285.e2
  23. Bell G, Wachira B, Denning G (2016) A pilot training program for point-of-care ultrasound in Kenya. Afr J Emerg Med 6(3):132–137
  24. Blehar D, Barton B, Gaspari R (2015) Learning curves in emergency ultrasound education. Acad Emerg Med 22(5):574–582
  25. Pusic MV, Boutis K, Hatala R et al (2015) Learning curves in health professions education. Acad Med 90(8):1034–1042

Copyright information

© The Author(s) 2019

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

Authors and Affiliations

  • Youyou Duanmu (1, corresponding author)
  • Patricia C. Henwood (2)
  • Sukhjit S. Takhar (3)
  • Wilma Chan (4)
  • Joshua S. Rempell (5)
  • Andrew S. Liteplo (6)
  • Viktoria Koskenoja (7)
  • Vicki E. Noble (8)
  • Heidi H. Kimberly (2)

  1. Department of Emergency Medicine, Stanford University School of Medicine, Palo Alto, USA
  2. Department of Emergency Medicine, Brigham and Women's Hospital, Boston, USA
  3. Department of Emergency Medicine, Mills-Peninsula Medical Center, Burlingame, USA
  4. Department of Emergency Medicine, Hospital of the University of Pennsylvania, Philadelphia, USA
  5. Department of Emergency Medicine, Cooper University Hospital, Camden, USA
  6. Department of Emergency Medicine, Massachusetts General Hospital and Harvard Medical School, Boston, USA
  7. Department of Emergency Medicine, UP Health System-Marquette, Marquette, USA
  8. Department of Emergency Medicine, University Hospitals-Cleveland Medical Center, Cleveland, USA
