
The Role of Assessment in Surgical Education

Advancing Surgical Education

Part of the book series: Innovation and Change in Professional Education ((ICPE,volume 17))


Overview

As competency-based medical education (CBME) continues to permeate postgraduate training, the focus among educators and researchers has shifted toward trainee assessment. Summative assessments are those from which consequences arise for both the trainee and the training program; their main intent is to differentiate between trainee states (e.g., competent versus not yet competent). Psychometric rigor must therefore be at the center of such assessments to ensure defensible results. While no model has been created specifically for designing summative assessments, the evidence-centered assessment design (ECD) framework can be adapted to serve this purpose. Furthermore, the published literature outlines criteria for “good” assessments, which together can serve as a strong starting point in the creation of summative assessments. Although progress has been made to date, current summative assessments have limitations; as such, more evidence is needed to support the interpretation of their results.


Author information

Correspondence to P. Szasz.


Copyright information

© 2019 Springer Nature Singapore Pte Ltd.

About this chapter

Cite this chapter

Szasz, P., Grantcharov, T.P. (2019). The Role of Assessment in Surgical Education. In: Nestel, D., Dalrymple, K., Paige, J., Aggarwal, R. (eds) Advancing Surgical Education. Innovation and Change in Professional Education, vol 17. Springer, Singapore. https://doi.org/10.1007/978-981-13-3128-2_20

  • DOI: https://doi.org/10.1007/978-981-13-3128-2_20

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-13-3127-5

  • Online ISBN: 978-981-13-3128-2

  • eBook Packages: Education (R0)
