Clinical observed performance evaluation: a prospective study in final year students of surgery
We report a prospective study of clinical observed performance evaluation (COPE) for 197 medical students in the pre-qualification year of clinical education. Psychometric quality was the main endpoint. Students were assessed in groups of 5 during 40-min patient encounters, with each student the focus of evaluation for 8 min. Each student underwent a series of assessments across a 25-week teaching programme and was evaluated over time, by direct observation, by several clinicians drawn from a pool of 16 surgical consultants and registrars. Assessment data were recorded on a structured rating form. Reliability was estimated using variance component analysis (VCA), internal consistency, and inter-rater agreement. The predictive and convergent validity of COPE was estimated in relation to the summative OSCE, the long case, and the overall final examination. The median number of COPE assessments per student was 7. The generalisability of a mean score over 7 COPE assessments was 0.66, equal to that of a final OSCE with 8 stations of 7.5 min each. Internal consistency was 0.88–0.97 and inter-rater agreement was 0.82. Significant correlations were observed with OSCE performance (R = 0.55, disattenuated) and with the long case (R = 0.47, disattenuated). Convergent validity was 0.81 by VCA. Overall final examination performance was linearly related to mean COPE score, with a standard error of 3.7%. COPE permitted efficient serial assessment of a large cohort of final-year students in a real-world setting. Its psychometric quality compared well with that of conventional assessments and of other direct-observation instruments reported in the literature. Effects on learning, and translation to clinical care, are directions for future research.
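The reliability and validity figures above follow standard psychometric definitions. As a brief sketch (in conventional generalisability-theory and classical test theory notation; the symbols are ours, not the paper's), the generalisability of a mean score over n assessments and the disattenuated correlation between two measures are

\[
E\rho^2 = \frac{\sigma_p^2}{\sigma_p^2 + \sigma_\delta^2 / n},
\qquad
R_{\mathrm{dis}} = \frac{R_{\mathrm{obs}}}{\sqrt{r_{xx}\, r_{yy}}},
\]

where \(\sigma_p^2\) is the person (true-score) variance estimated by the VCA, \(\sigma_\delta^2\) is the error variance for relative decisions, \(R_{\mathrm{obs}}\) is the observed correlation, and \(r_{xx}\), \(r_{yy}\) are the reliabilities of the two measures. With n = 7 COPE assessments, the first expression yields the reported coefficient of 0.66.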
Keywords: Assessment · Direct observation · Evaluation · Generalisability · Measurement · Psychometric · Reliability · Validity