
Evaluation of competence in ultrasound-guided procedures—a generic assessment tool developed through the Delphi method



Objectives

To develop a generic and objective tool for assessing competence in percutaneous ultrasound-guided procedures.


Methods

Interventional ultrasound experts from the Nordic countries were invited to participate in a three-round Delphi process, managed by a steering committee. In round 1, the experts were asked to suggest all aspects to consider when assessing competence in ultrasound-guided procedures. Suggestions were analyzed and condensed into assessment items. In round 2, the expert panel rated these items on a 1–5 scale and suggested new items; items with a mean rating of ≤ 3.5 were excluded. In round 3, the expert panel rated the remaining items and suggested changes to them.


Results

Twenty-five experts were invited, and response rates in the three rounds were 68% (17 out of 25), 100% (17 out of 17), and 100% (17 out of 17). The three-round Delphi process resulted in a 12-item assessment tool using a five-point rating scale. The final assessment tool evaluates pre-procedural planning, US technique, procedural technique, patient safety, communication, and teamwork.


Conclusions

Expert consensus was achieved on a generic tool for assessment of competence in percutaneous ultrasound-guided procedures—the Interventional Ultrasound Skills Evaluation (IUSE). This is the initial step in ensuring a valid and reliable method for assessment of interventional US skills.

Key Points

• Through a Delphi process, expert consensus was achieved on the content of an assessment tool for percutaneous ultrasound-guided procedures—the Interventional Ultrasound Skills Evaluation (IUSE) tool.

• The IUSE tool is comprehensive and covers pre-procedural planning, US technique, procedural technique, patient safety, communication, and teamwork.

• This is an important step in ensuring valid and reliable assessment of interventional US skills.


Fig. 1 Interventional Ultrasound Skills Evaluation






Acknowledgements

We would like to acknowledge the following ultrasound specialists for their participation in the Delphi study:

Christian Nolsøe

Bo Nyhuus

Linus Sant

Beth Olsen

Arne Hørlyck

Merete Kønig

Lars Larsen

Anders Elvin

Anders Nilsson

Ali Ovissi

Jouni Kuronen

Sara Protto

Trygve Syversveen

Anders Drolsum

Magnús Baldvinsson

Author information



Corresponding author

Correspondence to Niklas Kahr Rasmussen.

Ethics declarations


Guarantor

The scientific guarantor of this publication is Professor Michael Bachmann Nielsen and PhD-stu.

Conflict of interest

The authors of this manuscript declare no relationships with any companies whose products or services may be related to the subject matter of the article.

Statistics and biometry

No complex statistical methods were necessary for this paper.

Informed consent

Not applicable.

Ethical approval

Institutional Review Board approval was not required because no patients or animals were involved.


Methodology

• Delphi study

Cite this article

Kahr Rasmussen, N., Nayahangan, L.J., Carlsen, J. et al. Evaluation of competence in ultrasound-guided procedures—a generic assessment tool developed through the Delphi method. Eur Radiol 31, 4203–4211 (2021).



Keywords

• Ultrasonography
• Delphi technique
• Education, medical
• Ultrasonography, interventional
• Educational measurement