Examinations

  • Thomas Kollewe
  • Monika Sennekamp
  • Falk Ochsendorf

Abstract

From a didactic perspective, examinations serve important functions, most notably their influence on learning ("assessment drives learning"). After an outline of the requirements examinations must meet from a measurement-theory perspective, and of examination planning, the individual examination formats (written, oral, practical) are presented. Particular attention is paid to multiple-choice questions, which are characteristic of medical studies; tips for formulating good questions are given, along with guidance on their reuse. The "key feature problem" and the "script concordance test" are described as further options for written examinations. The section on oral examinations considers factors that influence scoring as well as suggestions for general improvement. Regarding practical examinations, the Objective Structured Clinical Examination (OSCE) is presented, together with ways of conducting assessments in the course of clinical work.


Copyright information

© Springer-Verlag GmbH Deutschland, part of Springer Nature 2018

Authors and Affiliations

  • Thomas Kollewe (1)
  • Monika Sennekamp (2)
  • Falk Ochsendorf (3)

  1. Goethe-Universität, Fachbereich Medizin, Frankf. Arbeitsstelle für Medizindidaktik, Frankfurt am Main, Germany
  2. Goethe-Universität, Fachbereich Medizin, Institut für Allgemeinmedizin, Frankfurt am Main, Germany
  3. Universitätsklinikum, Klinik für Dermatologie, Venerologie und Allergologie, Frankfurt am Main, Germany
