The Role of Assessment in Surgical Education

  • P. Szasz
  • T. P. Grantcharov
Chapter
Part of the Innovation and Change in Professional Education book series (ICPE, volume 17)

Overview

As competency-based medical education (CBME) continues to permeate postgraduate training, the focus among educators and researchers has shifted toward trainee assessment. Summative assessments are those from which consequences arise for both the trainee and the training program. Their main intent is to differentiate between trainees at different levels of competence, so psychometric rigor must be at the center of such assessments to ensure defensible results. While no model has been created specifically for designing summative assessments, the evidence-centered assessment design (ECD) framework can be adapted to serve this purpose. Furthermore, the published literature outlines criteria for "good" assessments, which, taken together, can serve as a strong starting point in the creation of summative assessments. Although progress has been made to date, current summative assessments have limitations, and more evidence is needed to support the interpretation of their results.
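
As one concrete illustration of what psychometric rigor can mean in practice, the minimal Python sketch below computes Cronbach's alpha, a widely used internal-consistency coefficient that might accompany a summative assessment's results. The station scores, rating scale, and helper function are hypothetical assumptions for illustration only; they are not data or methods from this chapter.

    # Minimal sketch: Cronbach's alpha as one example of reliability
    # evidence for a summative assessment. All data are hypothetical.
    from statistics import pvariance

    def cronbach_alpha(score_matrix):
        """score_matrix: one row per trainee, one column per station/item."""
        k = len(score_matrix[0])
        # Variance of each station's scores across trainees.
        item_vars = [pvariance([row[i] for row in score_matrix]) for i in range(k)]
        # Variance of each trainee's total score.
        total_var = pvariance([sum(row) for row in score_matrix])
        return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

    # Hypothetical ratings: 5 trainees scored on 4 stations (1-5 scale).
    scores = [
        [4, 5, 4, 4],
        [2, 3, 2, 3],
        [5, 5, 4, 5],
        [3, 3, 3, 2],
        [4, 4, 5, 4],
    ]
    print(f"Cronbach's alpha: {cronbach_alpha(scores):.2f}")  # ~0.93

A coefficient this high would suggest the stations rank trainees consistently; in a real high-stakes setting it would be only one piece of the validity evidence the chapter argues for.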

Keywords

Competency-based education · Competency-based assessment · Summative · Formative · Design frameworks

Copyright information

© Springer Nature Singapore Pte Ltd. 2019

Authors and Affiliations

  1. University of Toronto, Toronto, Canada
