Increased authenticity in practical assessment using emergency case OSCE stations
In an emergency, fast and structured patient management is crucial for the patient’s outcome. The competencies required should be acquired and assessed during medical education. The objective structured clinical examination (OSCE) is a valid and reliable format for assessing practical skills. However, traditional OSCE stations examine isolated skills or components of a clinical algorithm and therefore do not validly represent clinical reality. We developed emergency case OSCE stations (ECOS), in which students must manage complete emergency situations from initial assessment through medical treatment to consideration of further procedures. Our aim was to increase the authenticity and validity of assessing students’ capability to cope with emergency patients. Forty-five students participated in a 10-station OSCE comprising 6 ECOS and 4 traditional OSCE stations and were assessed using case-specific checklists. Each student completed an inter-station questionnaire and a post-OSCE questionnaire to evaluate both the ECOS and the traditional stations. We demonstrated that ECOS are feasible as time-limited OSCE stations. Acceptance was high among both students and examiners, who rated ECOS as more realistic than the traditional OSCE scenarios. Reliability for the 6 ECOS, estimated via Cronbach’s α, was high (0.793). ECOS thus offer a feasible alternative to traditional OSCE stations, with adequate reliability for assessing students’ ability to cope with an acute emergency in a realistic encounter.
Keywords: Assessment · Authenticity · Clinical skills · Emergency medicine · Reliability
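The reliability reported for the 6 ECOS (Cronbach’s α = 0.793) follows the standard internal-consistency formula α = (k/(k−1))·(1 − Σσᵢ²/σₜ²), where k is the number of stations, σᵢ² the variance of each station score, and σₜ² the variance of the total score. The sketch below is not the authors’ analysis code; it is a minimal illustration of that formula, assuming a score matrix with one row per examinee and one column per station:

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for a 2-D score matrix.

    Rows are examinees, columns are items (here: OSCE stations).
    """
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                       # number of stations
    item_vars = scores.var(axis=0, ddof=1)    # per-station sample variance
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical checklist scores for 4 examinees across 3 stations:
example = [[8, 7, 9],
           [5, 5, 6],
           [9, 8, 9],
           [6, 6, 7]]
print(cronbach_alpha(example))
```

With perfectly parallel station scores, α reaches 1.0; values around 0.7–0.8, as reported in the study, are conventionally regarded as adequate for this kind of assessment.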