Advances in Health Sciences Education

Volume 15, Issue 1, pp 81–95

Increased authenticity in practical assessment using emergency case OSCE stations

  • Miriam Ruesseler
  • Michael Weinlich
  • Christian Byhahn
  • Michael P. Müller
  • Jana Jünger
  • Ingo Marzi
  • Felix Walcher
Original Paper

Abstract

In an emergency, fast and structured patient management is crucial for the patient's outcome. The competencies needed should be acquired and assessed during medical education. The objective structured clinical examination (OSCE) is a valid and reliable format for assessing practical skills. However, traditional OSCE stations examine isolated skills or components of a clinical algorithm and thereby lack a valid representation of clinical reality. We developed emergency case OSCE stations (ECOS), in which students have to manage complete emergency situations from initial assessment to medical treatment and consideration of further procedures. Our aim was to increase the authenticity and validity of assessing students' capability to cope with emergency patients. Forty-five students participated in a 10-station OSCE comprising 6 ECOS and 4 traditional OSCE stations and were assessed using case-specific checklists. Each student completed an inter-station and a post-OSCE questionnaire to evaluate both the ECOS and the traditional OSCE stations. In this study, we were able to demonstrate that ECOS are feasible as time-limited OSCE stations. Acceptance was high among both students and examiners, who rated the ECOS as more realistic than the traditional OSCE scenarios. The reliability of the 6 ECOS, estimated via Cronbach's α, was high (0.793). ECOS thus offer a feasible alternative to traditional OSCE stations, with adequate reliability for assessing students' capability to cope with an acute emergency in a realistic encounter.
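For readers unfamiliar with the statistic, Cronbach's α treats the six ECOS as items and the examinees as cases: α = k/(k−1) · (1 − Σ item variances / variance of total scores). A minimal sketch of the computation in Python (the score matrix below is hypothetical illustration data, not the study's dataset):

```python
from statistics import variance  # sample variance (n - 1 denominator)

def cronbach_alpha(scores):
    """Cronbach's alpha for a score matrix: one row per examinee,
    one column per item (here, per ECOS station)."""
    k = len(scores[0])                        # number of items, e.g. 6 ECOS
    columns = list(zip(*scores))              # per-item score columns
    item_var_sum = sum(variance(col) for col in columns)
    total_var = variance([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

# Hypothetical checklist scores for 4 examinees on 2 stations:
example = [[1, 2], [2, 1], [3, 4], [4, 3]]
print(cronbach_alpha(example))  # 0.75
```

When all items rank examinees identically, α approaches 1; values around 0.79, as reported for the six ECOS, are conventionally read as adequate internal consistency for this kind of assessment.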

Keywords

Assessment · Authenticity · Clinical skills · Emergency medicine · Reliability


Copyright information

© Springer Science+Business Media B.V. 2009

Authors and Affiliations

  • Miriam Ruesseler (1)
  • Michael Weinlich (2)
  • Christian Byhahn (3)
  • Michael P. Müller (4)
  • Jana Jünger (5)
  • Ingo Marzi (1)
  • Felix Walcher (1)

  1. Department of Trauma Surgery, Johann Wolfgang Goethe-University, Frankfurt/Main, Germany
  2. Med Con Team (CEO), Reutlingen, Germany
  3. Department of Anaesthesiology, Intensive Care Medicine and Pain Therapy, Johann Wolfgang Goethe-University, Frankfurt, Germany
  4. Department of Anaesthesiology and Critical Care Medicine, Carl Gustav Carus Technical University, Dresden, Germany
  5. Medical Hospital, University of Heidelberg, Heidelberg, Germany