Abstract
Video-based assessment is a reliable method for testing clinical skills performance, yet published studies report conflicting results because of various sources of bias. This study aimed to describe the development and use of videos to assess the effect of Objective Structured Clinical Examination (OSCE) examiners’ backgrounds on scoring. Cardio-Pulmonary Resuscitation (CPR) was chosen for this study because it follows a published guideline from the American Heart Association. Development proceeded in steps: two cardiologists rewrote the assessment guidelines, and two standardized simulated CPR procedure videos were produced under their supervision. One video showed CPR performed according to the guidelines; the other showed CPR that deviated from them. The cardiologists gave feedback after watching both videos. Finally, 51 OSCE examiners at the Faculty of Medicine, Duta Wacana Christian University assessed the CPR performance in the videos using the standardized assessment guidelines. Examiners were categorized by background, and mean assessment scores across background categories were compared with the Kruskal–Wallis test. The two videos were successfully developed, and scores on them did not differ significantly between most examiner background categories (p > 0.05); only the clinical practice experience and educational background categories showed significant score differences (p = 0.04, df = 3 and p = 0.03, df = 2, respectively). Video-based assessment can foster the objectivity of the OSCE and can therefore be applied in OSCE assessor scoring training. However, sources of bias remain that academics need to be aware of and consider.
Part of this paper was presented at INA-MHPEC 2022 and received The Best Poster Presentation.
Abbreviations
- OSCE: Objective Structured Clinical Examination
Ethics Approval and Consent to Participate
This study was approved by the Health Research Ethics Committee Faculty of Medicine Universitas Kristen Duta Wacana (Reference No.1068/C.16/FK/2019).
Competing Interest
The authors declare that there are no competing interests related to the study.
Acknowledgements
The authors would like to thank the staff of the Faculty of Medicine, Universitas Kristen Duta Wacana for supporting the research.
Authors’ Contribution
Oscar Gilang Purnajati—conceived the research, reviewed the literature, designed the study, acquired funding, analyzed the data, and wrote the manuscript.
Rachmadya Nur Hidayah—developed the study framework, analyzed the data, and reviewed the final manuscript.
Gandes Retno Rahayu—analyzed the data and reviewed the final manuscript.
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
About this paper
Cite this paper
Purnajati, O.G., Hidayah, R.N., Rahayu, G.R. (2023). Developing Clinical Skill Videos as an Instrument to Assess the Objective Structured Clinical Examination (OSCE) Examiners’ Effect. In: Claramita, M., Soemantri, D., Hidayah, R.N., Findyartini, A., Samarasekera, D.D. (eds) Character Building and Competence Development in Medical and Health Professions Education. INA-MHPEC 2022. Springer Proceedings in Humanities and Social Sciences. Springer, Singapore. https://doi.org/10.1007/978-981-99-4573-3_7
DOI: https://doi.org/10.1007/978-981-99-4573-3_7
Publisher Name: Springer, Singapore
Print ISBN: 978-981-99-4572-6
Online ISBN: 978-981-99-4573-3