Surgical Education, Simulation, and Simulators—Updating the Concept of Validity

  • Mitchell Goldenberg
  • Jason Y. Lee
Endourology (P Mucksavage, Section Editor)
Part of the following topical collections:
  1. Topical Collection on Endourology

Abstract

Purpose of Review

Competency-based medical education (CBME) is rooted in the use of iterative assessments. We must ensure that the assessments used in CBME are valid in order to make acceptable and accurate decisions regarding the competency of a trainee. Until recently, much of the educational and assessment literature in urology has used a now-outdated method of determining validity, based on theory and recommendations from over 50 years ago. We describe a contemporary approach to gathering construct validity evidence for the assessment of urologic trainees, for use in both clinical and simulation environments.

Recent Findings

Five sources of evidence make up Messick’s contemporary framework of validity: test content, response process, internal structure, relationship to other variables, and consequences. These are all components of construct validation and concern the accuracy, quality, reproducibility, generalizability, and wider impact of the scores generated by an assessment, respectively.

Summary

When deciding on the competency of a trainee, program directors and educators must have a clear understanding of how validity is established in each assessment context. The contextual specificity of validity means that stakeholders must be prepared to defend the outcome of an assessment, particularly when making high-stakes or summative decisions.

Keywords

Surgical education · Simulation · Validity · Assessment · Trainees · Urology

Notes

Compliance with Ethical Standards

Conflict of Interest

Mitchell Goldenberg and Jason Y. Lee each declare no potential conflicts of interest.

Human and Animal Rights and Informed Consent

This article does not contain any studies with human or animal subjects performed by any of the authors.

References

Papers of particular interest, published recently, have been highlighted as: • Of importance •• Of major importance

  1. Frank JR, Snell LS, Cate OT, Holmboe ES, Carraccio C, Swing SR, et al. Competency-based medical education: theory to practice. Med Teach. 2010;32(8):638–45. https://doi.org/10.3109/0142159X.2010.501190.
  2. Holmboe ES, Sherbino J, Long DM, Swing SR, Frank JR. The role of assessment in competency-based medical education. Med Teach. 2010;32(8):676–82. https://doi.org/10.3109/0142159X.2010.500704.
  3. • Aydin A, Raison N, Khan MS, Dasgupta P, Ahmed K. Simulation-based training and assessment in urological surgery. Nat Rev Urol. 2016;13(9):503–19. https://doi.org/10.1038/nrurol.2016.147. This article summarizes simulation-based training interventions and assessment methods used across the different urological surgical modalities. While thorough in its review of the literature at the time, it uses the now-outdated taxonomy of validity to describe and categorize that literature, and so provides the reader with an in-depth illustration of this outmoded validity language.
  4. Arora S, Lamb B, Undre S, Kneebone R, Darzi A, Sevdalis N. Framework for incorporating simulation into urology training. BJU Int. 2011;107(5):806–10. https://doi.org/10.1111/j.1464-410X.2010.09563.x.
  5. Downing SM. Validity: on the meaningful interpretation of assessment data. Med Educ. 2003;37(9):830–7.
  6. Cook DA, Hatala R. Validation of educational assessments: a primer for simulation and beyond. Adv Simul. 2016;1(1):1–12. https://doi.org/10.1186/s41077-016-0033-y.
  7. Korndorffer JR, Kasten SJ, Downing SM. A call for the utilization of consensus standards in the surgical education literature. Am J Surg. 2010;199(1):99–104. https://doi.org/10.1016/j.amjsurg.2009.08.018.
  8. American Educational Research Association, American Psychological Association, National Council on Measurement in Education. Standards for educational and psychological testing. Washington, DC: American Educational Research Association; 1999.
  9. McDougall EM. Validation of surgical simulators. J Endourol. 2007;21(3):244–7. https://doi.org/10.1089/end.2007.9985.
  10. Cronbach LJ, Meehl PE. Construct validity in psychological tests. Psychol Bull. 1955;52(4):281–302.
  11. Aghazadeh MA, Mercado MA, Pan MM, Miles BJ, Goh AC. Performance of robotic simulated skills tasks is positively associated with clinical robotic surgical performance. BJU Int. 2016;118(3):475–81. https://doi.org/10.1111/bju.13511.
  12. Hatala R, Cook DA, Brydges R, Hawkins R. Constructing a validity argument for the Objective Structured Assessment of Technical Skills (OSATS): a systematic review of validity evidence. Adv Health Sci Educ. 2015;20(5):1–27. https://doi.org/10.1007/s10459-015-9593-1.
  13. American Educational Research Association, American Psychological Association, National Council on Measurement in Education. Standards for educational and psychological testing. Washington, DC: American Educational Research Association; 2014.
  14. Messick S. Validity of psychological assessment. 1994.
  15. Sweet RM, Hananel D, Lawrenz F. A unified approach to validation, reliability, and education study design for surgical technical skills training. Arch Surg. 2010;145(2):197–201. https://doi.org/10.1001/archsurg.2009.266.
  16. Raison N, Wood T, Brunckhorst O, Abe T, Ross T, Challacombe B, et al. Development and validation of a tool for non-technical skills evaluation in robotic surgery—the ICARS system. Surg Endosc. 2017;7(7):403–8. https://doi.org/10.1007/s00464-017-5622-x.
  17. Dagnaes-Hansen J, Mahmood O, Bube S, et al. Direct observation vs. video-based assessment in flexible cystoscopy. J Surg Educ. 2017; https://doi.org/10.1016/j.jsurg.2017.10.005.
  18. Goh AC, Goldfarb DW, Sander JC, Miles BJ, Dunkin BJ. Global evaluative assessment of robotic skills: validation of a clinical assessment tool to measure robotic surgical skills. J Urol. 2012;187(1):247–52. https://doi.org/10.1016/j.juro.2011.09.032.
  19. Perrenot C, Perez M, Tran N, Jehl JP, Felblinger J, Bresler L, et al. The virtual reality simulator dV-Trainer® is a valid assessment tool for robotic surgical skills. Surg Endosc. 2012;26(9):2587–93. https://doi.org/10.1007/s00464-012-2237-0.
  20. Goldenberg MG, Goldenberg L, Grantcharov TP. Surgeon performance predicts early continence after robot-assisted radical prostatectomy. J Endourol. 2017;31(9):858–63. https://doi.org/10.1089/end.2017.0284.
  21. Cook DA, Beckman TJ. Current concepts in validity and reliability for psychometric instruments: theory and application. Am J Med. 2006;119(2):166.e7–166.e16. https://doi.org/10.1016/j.amjmed.2005.10.036.
  22. Downing SM, Haladyna TM. Validity threats: overcoming interference with proposed interpretations of assessment data. Med Educ. 2004;38(3):327–33.
  23. •• Cook DA, Zendejas B, Hamstra SJ, Hatala R, Brydges R. What counts as validity evidence? Examples and prevalence in a systematic review of simulation-based assessment. Adv Health Sci Educ Theory Pract. 2014;19(2):233–50. https://doi.org/10.1007/s10459-013-9458-4. This article provides an extensive review, and more importantly a description, of the simulation-based literature as framed by Messick’s framework. The various sources of validity evidence are described with examples from the literature, and the “data elements” encompassed by these domains are outlined as well.
  24. Williams RG, Klamen DA, McGaghie WC. Cognitive, social and environmental sources of bias in clinical performance ratings. Teach Learn Med. 2003;15(4):270–92. https://doi.org/10.1207/S15328015TLM1504_11.
  25. Cook DA, Brydges R, Ginsburg S, Hatala R. A contemporary approach to validity arguments: a practical guide to Kane’s framework. Med Educ. 2015;49(6):560–75. https://doi.org/10.1111/medu.12678.
  26. Hull AL, Hodder S, Berger B, Ginsberg D, Lindheim N, Quan J, et al. Validity of three clinical performance assessments of internal medicine clerks. Acad Med. 1995;70(6):517–22.
  27. Lee JY, Mucksavage P, Kerbl DC, Huynh VB, Etafy M, McDougall EM. Validation study of a virtual reality robotic simulator—role as an assessment tool? J Urol. 2012;187(3):998–1002. https://doi.org/10.1016/j.juro.2011.10.160.
  28. • Brydges R, Hatala R, Zendejas B, Erwin PJ, Cook DA. Linking simulation-based educational assessments and patient-related outcomes. Acad Med. 2015;90(2):246–56. https://doi.org/10.1097/ACM.0000000000000549. This article provides a good review of the literature relating simulation-based assessments to patient outcomes. Consequences evidence is a key element of the validity framework that has been vastly underexplored to date, and this article not only collates the available evidence but also clearly underscores the importance of these data in the design and implementation of competency-based assessments.
  29. Fecso AB, Szasz P, Kerezov G, Grantcharov TP. The effect of technical performance on patient outcomes in surgery: a systematic review. Ann Surg. 2016; https://doi.org/10.1097/SLA.0000000000001959.
  30. Gordon M, Darbyshire D, Baker P. Non-technical skills training to enhance patient safety: a systematic review. Med Educ. 2012;46(11):1042–54. https://doi.org/10.1111/j.1365-2923.2012.04343.x.

Copyright information

© Springer Science+Business Media, LLC, part of Springer Nature 2018

Authors and Affiliations

  1. Urology Residency Training Program, Division of Urology, University of Toronto, Toronto, Canada
  2. Toronto General Hospital – University Health Network, Division of Urology, University of Toronto, Toronto, Canada
