Objective: To determine the degree and sources of variability in faculty evaluations of residents for the American Board of Internal Medicine (ABIM) Clinical Evaluation Exercise (CEX).
Design: A videotaped simulated CEX containing programmed resident strengths and weaknesses was shown to faculty evaluators; responses were elicited using the open-ended form recommended by the ABIM, followed by detailed questionnaires.
Intervention: After the open-ended form was completed and collected, faculty members rated the resident's performance on a five-point scale and rated the importance of various aspects of the history and physical examination for the patient shown.
Measurements and Main Results: Very few of the resident's strengths and weaknesses were mentioned on the open-ended form, although responses to specific questions revealed that faculty members had actually observed many errors and some strengths that they had failed to document. Faculty members also displayed wide variance in their global assessments of the resident: 50% rated him marginal, 25% failed him, and 25% rated him satisfactory. Only for performance areas not directly related to the patient's problems could substantial variability be explained by disagreement on standards.
Conclusions: Faculty internists vary markedly in their observations of a resident and document little. To be useful for resident feedback and evaluation, exercises such as the CEX may need to use more specific and detailed forms to document strengths and weaknesses, and faculty evaluators probably need to be trained as observers.
Key words: clinical competence; internship and residency; internal medicine; certification