The Role of Professional Staff in Assessing Students: A Case Study of the Objective Structured Clinical Exam

  • Darci Taylor
Living reference work entry


Part of the University Development and Administration book series (UDAA)


Conducting an objective structured clinical exam (OSCE) to assess a student’s clinical competency is a complex and dynamic process whose intricate logistical and technical requirements demand more than academic input alone. This complexity necessitates the involvement of professional staff, who work collaboratively with academic staff in planning and conducting the OSCE, often having direct contact with students in the lead-up to and during the exam. This chapter presents a case study highlighting the integral role of professional staff in the assessment of students undertaking an OSCE at an Australian university. The OSCE process involves a multiplicity of roles and skills, blurring traditional boundaries between academic and professional staff and creating a partnership that arguably promotes mutual respect for the expertise of both roles in higher education. The technical, curriculum, and administrative expertise of professional staff is vital to running an effective OSCE, and professional staff often assume leadership responsibilities during an OSCE to ensure a positive experience for students. This expertise is often unrecognised by those outside the OSCE process, yet it is essential to the quality and integrity of the OSCE and to the professional identity of the staff involved. This chapter unpacks the nature of the work and expertise involved in designing, developing, and delivering an OSCE, and the range of qualities and skills required to ensure a successful experience for students.


Keywords: Objective structured clinical exam (OSCE) · Professional staff · Academic staff · Assessment · Working relationships · Student outcomes · Logistics · Third space · Invisibility



Copyright information

© Springer Nature Singapore Pte Ltd. 2018

Authors and Affiliations

  1. Deakin University, Geelong, Australia
