Didactic CME and practice change: don’t throw that baby out quite yet
Skepticism exists regarding the role of continuing medical education (CME) in improving physician performance, and the harshest criticism has been reserved for didactic CME. Reviews of the scientific literature on the effectiveness of CME conclude that formal or didactic modes of education have little or no impact on clinical practice. This has led some to argue that didactic CME is a highly questionable use of organizational and financial resources and a source of lost opportunities for physicians to engage in meaningful learning. The authors' current program of research has forced them to reconsider the received wisdom regarding the relationship between didactic modes of education and learning, and the role that frank dissemination can play in bringing about practice change. The authors argue that assessing and valuing educational methods based solely on their capacity to directly influence practice reflects an impoverished understanding of how change in clinical practice actually occurs. Drawing on case-study research, they give examples of the functions didactic CME has served in the interest of improved practice, and then explore reasons why its contribution is often missed or dismissed. The goal is not to advocate a return to the status quo ante, in which lecture-based education was the dominant modality, but rather to acknowledge both the limits and the potential of this longstanding approach to delivering continuing education.
Keywords: Medical education · Continuing medical education · Didactic education · Practice change · Physician performance · Assessment · Healthcare outcomes · Evaluation
The case studies research described in this article was funded in part by an unrestricted educational grant from Wyeth Pharmaceuticals; this publication was supported by grant 1UL1RR025011 from the Clinical and Translational Science Award (CTSA) program of the National Center for Research Resources, National Institutes of Health.
The case studies were reviewed and determined to be exempt by the Health Sciences Institutional Review Board of the University of Wisconsin-Madison.