Journal of General Internal Medicine, Volume 19, Issue 5, pp 558–561

Feedback and the mini clinical evaluation exercise

  • Eric S. Holmboe
  • Monica Yepes
  • Frederick Williams
  • Stephen J. Huot
Brief Reports


We studied the nature of the feedback given after a miniCEX. We investigated whether the feedback was interactive; specifically, did the faculty allow the trainee to react to the feedback, enable self-assessment, and help the trainee develop an action plan for improvement? Finally, we investigated the number and types of recommendations given by faculty. One hundred seven miniCEX feedback sessions were audiotaped. Faculty provided at least 1 recommendation for improvement in 80% of the feedback sessions. The majority of sessions (61%) involved learner reaction, but faculty asked the intern for a self-assessment in only 34% of sessions, and only 8% included an action plan from the faculty member. Faculty are using the miniCEX to provide recommendations and often encourage learner reaction, but they are underutilizing the interactive feedback methods of self-assessment and action planning. Programs should consider both specific training in feedback and changes to the miniCEX form to facilitate interactive feedback.

Key words: feedback, direct observation, evaluation





Copyright information

© Society of General Internal Medicine 2004

Authors and Affiliations

  • Eric S. Holmboe (1, 2)
  • Monica Yepes (3)
  • Frederick Williams (4)
  • Stephen J. Huot (1)

  1. Yale Primary Care Internal Medicine Residency, New Haven
  2. National Naval Medical Center, Bethesda
  3. George Washington University School of Medicine, Washington, DC
  4. Washington Hospital Center, Washington, DC
