
“It Changed How I Think”—Impact of Programmatic Assessment Upon Practicing Physicians: A Qualitative Study

  • Original Research
  • Published in: Medical Science Educator

Abstract

Programmatic assessment is a systematic approach to documenting and assessing learner performance. It offers learners frequent formative feedback from a variety of contexts and uses both high- and low-stakes assessments to determine student progress. Existing research exploring learner and faculty perceptions of programmatic assessment reports a favorable impact on faculty understanding of the importance of assessment stakes and of feedback to learners, while students report being able to establish goals, navigate toward them, and reflect on their performance. The Cleveland Clinic Lerner College of Medicine (CCLCM) of Case Western Reserve University adopted programmatic assessment methods at its inception. With more than 18 years of experience with programmatic assessment and a portfolio-based assessment system, CCLCM is well positioned to explore its graduates’ perceptions of their programmatic assessment experiences during and after medical school. In 2020, the investigators interviewed 26 of the school’s 339 physician graduates. Participants were purposefully sampled to represent multiple class cohorts (2009–2019), clinical specialties, and practice locations. The investigators analyzed interview transcripts using thematic analysis informed by the frameworks of self-determination theory and professional identity formation. The authors identified themes and supported each with participant quotes from the interviews. Based on these findings, the investigators compiled a series of recommendations for other institutions that have incorporated, or plan to incorporate, elements of programmatic assessment into their curricula. The authors conclude by discussing future directions for research and additional avenues of inquiry.



Acknowledgements

The authors wish to thank Dr. Bradley Gill and Dr. Christine Warren for their willingness to participate in the pilot testing of our interview guide. We also thank Dr. Klara Papp for providing important feedback on our manuscript.

Author information

Correspondence to Jessica Greenfield.

Ethics declarations

Ethical Approval

This study was approved by the institutional review board of the Cleveland Clinic.

Conflict of Interest

The authors declare no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Greenfield, J., Qua, K., Prayson, R.A. et al. “It Changed How I Think”—Impact of Programmatic Assessment Upon Practicing Physicians: A Qualitative Study. Med.Sci.Educ. 33, 963–974 (2023). https://doi.org/10.1007/s40670-023-01829-5

  • DOI: https://doi.org/10.1007/s40670-023-01829-5
