Background

Medical schools around the world have implemented objective structured clinical examinations (OSCEs) [1]. In an OSCE, students move through a series of stations where they have to perform specific clinical tasks within a time limit. The content domains to be assessed and the scoring scheme for the examination are defined in advance [2]. Since its first description in the mid-1970s [3], the OSCE has been the subject of countless papers [4]. A number of papers have shown that the OSCE is a valid and reliable assessment of a student’s clinical competence [5,6,7,8]. Papers have also shown that students accept the OSCE as a relevant and fair exam [9,10,11].

However, only a few studies exist on how the deployment of OSCEs affects students’ study behaviour. Newble and Jaeger [12], for instance, reported that work-based learning, textbooks, tutorials, and group activities were the predominant resources when studying for a clinical examination. Mavis [13] found that students focused on cognitive learning strategies, such as reviewing textbooks or class notes, when preparing for an OSCE. This study, however, was limited in that the examined OSCE was a formative rather than a summative assessment, which may explain the differing findings. Rudland and colleagues [14] identified that the OSCE fostered collaborative learning but did not encourage students to spend more time learning in clinical settings. The disparities found in the literature suggest that the OSCE does not always drive student learning in the desired way. Student study behaviour may instead depend on what is specifically assessed in the OSCEs, the purpose of the assessment (summative vs. formative), and other factors such as patient availability [15], advice given by teachers, or information from peers.

The aim of our study was to examine whether the introduction of OSCE assessments shifted the time students spend studying. We explored which resources were important for studying, how much time students spent preparing for OSCEs compared with traditionally used multiple choice question (MCQ) tests, and how they performed on the respective assessment formats.

Methods

Context

We conducted the present study in the context of the 2002 amendment of the national medical licensure act, which called for a more practice- and patient-oriented alignment of medical education in Germany [16]. Each of the 36 medical schools established before 2012 has a six-year curriculum. The curricula usually consist of two preclinical years followed by three clinical years and, finally, the clinical internship year. According to the guidelines of the medical licensure act, the clinical years comprise 41 predetermined courses covering the full range of clinical areas and disciplines. During these courses, students have to pass summative (graded) in-house assessments designed by medical school staff in order to be admitted to the national licensing examination. The medical licensure act sets the general framework of the undergraduate programme, but schools have considerable freedom to organise their own curricula. Thus, the sequence of the individual courses, their specific content, and the accompanying assessment strategy differ from one school to another.

Written assessments, in the form of MCQ tests, are still the most commonly used format during the clinical years at all German medical schools. The focus is on testing a student’s knowledge about diseases, covering pathogenesis, signs and symptoms, diagnostic approaches, and treatment strategies. To comply with the new legal requirements, medical schools have broadened their assessment repertoire to include performance-related skills. By now, 33 of the 36 medical schools (92%) have introduced at least one summative OSCE into the in-house assessment system used for the clinical curriculum [17]. In these OSCEs, the main focus is on the performance domains of physical examination, history taking, practical procedures, and communication skills. Passing the OSCE(s) is also a prerequisite for students to be admitted to the national licensing examination.

Student population

In the academic year 2014/15, there were around 88,000 medical students in Germany. Almost two-thirds of them (53,352 [61%]) were female [18]. We surveyed medical schools on both the number of students per year and the points in the curriculum at which OSCEs occurred. With these data, we estimated the number of students in the clinical years or the clinical internship year who had been exposed to a summative OSCE at slightly more than 32,700.

Data collection

Between February and April 2015, we conducted this study using the free software package SoSci Survey (www.soscisurvey.de). Due to privacy regulations, we did not have access to the students’ email addresses. We therefore could not administer our web-based questionnaire to a selected sample of the study population; instead, we asked the medical schools in Germany to advertise the survey on their websites or through their messaging systems. All medical students of years 3, 4, 5, and 6 who had undertaken at least one summative OSCE were eligible to participate in the study. Participation was voluntary and anonymous, and the respondents did not receive any incentive for completing the questionnaire. The study was in accordance with the ethical standards of our institutional review board (Ethics Committee of Jena University Hospital at Friedrich Schiller University).

Design of the questionnaire

We first reviewed the literature and conducted interviews with students to identify items that we could use for our study. Based on this knowledge, we developed a draft questionnaire. As a next step, we repeatedly pilot-tested and revised the draft to ensure that respondents would complete the survey in the intended manner. The final version of the questionnaire comprised two identical sections, the first for the OSCE and the second for the MCQs.

Each section contained a list of 12 study resources (Table 1). Participants rated their preferences in preparation for the respective assessment method on a 5-point scale anchored at 1 (not important), 2 (slightly important), 3 (moderately important), 4 (important), and 5 (very important). Participants then answered two open-ended questions. First, we prompted them to estimate the average total time they spent preparing for a single summative in-house OSCE or MCQ test. To improve ease of completion, we asked them to indicate the time spent in working days of about 8 h. Second, we asked them to report their overall average grade achieved on each of the two assessment methods. In this paper, we present the reported grades on a 4.0 grading scale ranging from 0.0 (failing grade) through 1.0 (lowest passing grade) to 4.0 (best possible grade). Finally, the questionnaire collected demographic details, including gender, age, academic year (semester), and medical school affiliation.

Table 1 Students’ preferences of study resources when preparing for OSCEs and MCQ tests

The questionnaire also included an 11-item set on the benefit of the OSCE or MCQs (at the beginning of each section). The results are presented in a separate paper recently published in GMS Journal for Medical Education [19]. All the questionnaire items can be found in Additional file 1.

Data analysis

After sampling, we verified that each respondent had been exposed to a summative OSCE by cross-checking the indicated semester, and the date on which the questionnaire was completed, against the specific curriculum of the relevant medical school. For the statistical analysis, we used IBM SPSS Statistics for Windows, Version 24 (IBM Corp., Armonk, NY, USA). We performed descriptive statistics and used a paired t-test to compare participants’ responses to the open-ended questions between the OSCE and MCQs. We calculated Cohen’s d as a measure of effect size from the t-statistic (t-value, group size, and Pearson’s correlation coefficient). To determine whether the medical school had an influence on the results, we conducted a univariate ANOVA using the mean of the responses for the two assessment methods (excluding missing data) as the dependent variable and the medical school as the fixed factor. We considered p values below 0.05 statistically significant. Unless stated otherwise, we present data as means with standard deviations in parentheses.
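Since the analyses were run in SPSS, no analysis script accompanies the study; the following Python sketch merely illustrates the pipeline described above (paired t-test, Cohen’s d derived from the t-statistic, and a one-way ANOVA with medical school as the fixed factor). The variable names, the simulated data, and the specific repeated-measures formula for d (Dunlap et al., 1996) are our assumptions, not part of the study.

    import numpy as np
    from scipy import stats

    # Simulated stand-ins for the paired survey responses; the real data
    # were students' self-reported studying times (in hours) per method.
    rng = np.random.default_rng(0)
    n = 1043
    osce_hours = np.clip(rng.normal(66.5, 52.5, n), 0, None)
    mcq_hours = np.clip(osce_hours + rng.normal(28.3, 45.0, n), 0, None)
    school = rng.integers(0, 32, n)  # 32 schools, coded 0..31 (illustrative)

    # Paired t-test comparing the two assessment methods within students
    t_stat, p_val = stats.ttest_rel(osce_hours, mcq_hours)

    # Cohen's d from the t-value, group size, and Pearson's r, matching the
    # repeated-measures formula of Dunlap et al. (1996): d = |t| * sqrt(2(1-r)/n)
    r, _ = stats.pearsonr(osce_hours, mcq_hours)
    d = abs(t_stat) * np.sqrt(2.0 * (1.0 - r) / n)

    # One-way ANOVA: mean of the two responses per student as the dependent
    # variable and medical school as the fixed factor
    per_student_mean = (osce_hours + mcq_hours) / 2.0
    groups = [per_student_mean[school == s] for s in np.unique(school)]
    f_stat, p_anova = stats.f_oneway(*groups)

    print(f"paired t({n - 1}) = {t_stat:.2f}, p = {p_val:.3g}, d = {d:.2f}")
    print(f"ANOVA: F({len(groups) - 1}, {n - len(groups)}) = {f_stat:.2f}, p = {p_anova:.3g}")

On the real data, the simulated arrays would simply be replaced by the survey responses.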

Results

Sample

A total of 1189 participants completed the questionnaire. We removed 58 respondents because their demographic details were incomplete or because they had not yet been exposed to a summative in-house OSCE. Our analysis included 1131 respondents (777 female [69%] vs. 354 male [31%]) from 32 of the 33 medical schools (97%) holding summative OSCE(s). The sample represented all age groups of students, ranging from 19 to 45 years (median 25). The group sizes of years 4, 5, and 6 were similar (318 [28%], 338 [30%], and 303 [27%], respectively), while the proportion of year 3 students was lower (172 [15%]) owing to their lesser exposure to OSCEs. The number of respondents per medical school varied between 0 and 123, with a mean of 34 responses per school, depending on how the individual schools advertised our study (websites vs. messaging systems).

Study resources

All 1131 respondents included in the analysis completed the list of study resources. The ratings indicated that, in preparation for OSCEs, students mostly preferred resources for acquiring clinical skills. Physical examination courses (4.38 [0.82]) ranked first, followed by class notes/logs (3.88 [1.11]) and the skills lab (3.87 [1.21]). Medical clerkships (3.58 [1.16]) and clinical work placements (3.56 [1.12]), as well as group learning (3.61 [1.26]) and peer tutorials (3.53 [1.19]), ranked next. The ratings also showed that students attached only moderate importance to resources for gathering knowledge, such as lectures (2.67 [1.08]) or textbooks (3.17 [1.07]), when studying for OSCEs.

Students’ resource preferences differed when they were preparing for MCQ tests. Resources for gathering knowledge were most important, whereas those for acquiring clinical skills were of minor importance. Class notes/logs (4.25 [0.95]), lectures (4.07 [1.03]), and textbooks (3.97 [0.98]) ranked first, second, and third, followed by multimedia materials (3.32 [1.19]). Table 1 shows the complete ratings of the list of study resources for both assessment methods.

Studying time

We obtained valid responses from 1043 respondents. The reported studying time was 66.5 (52.5) hours for an OSCE and 94.8 (71.5) hours for an MCQ test; the difference was significant (t[1042] = −14.78, p < 0.01). Cohen’s d of 0.44 indicated an effect size in the medium range.
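As a worked illustration, the reported d can be reproduced with the repeated-measures formula assumed in the Methods; the correlation r ≈ 0.54 is back-calculated from the reported values, as r itself is not given in the paper:

\[ d = |t| \sqrt{\frac{2(1 - r)}{n}} = 14.78 \times \sqrt{\frac{2\,(1 - 0.54)}{1043}} \approx 0.44 \]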

The ANOVA revealed that the medical school had a significant influence on studying time (F[31, 1011] = 5.40, p < 0.01, partial eta squared = 0.14). Table 2 presents the results on studying time by medical school affiliation. Respondents from about half of the medical schools (15/32 [47%]) reported spending significantly less time preparing for an OSCE than for an MCQ test, while the time spent did not differ significantly between the assessment methods for respondents from the other schools.

Table 2 Reported studying time for OSCEs and MCQ tests by medical school affiliation

Performance outcomes

Of our respondents, 1111 replied to the question about performance outcomes. The reported average grade was 3.13 (0.62) for OSCEs and 2.84 (0.62) for MCQ tests. The difference was significant (t[1110] = 12.55, p < 0.01). Cohen’s d was 0.47, indicating a medium effect.

There was a significant influence of the medical school on the OSCE and MCQ grades (F[31, 1079] = 6.48, p < 0.01, partial eta squared = 0.16). Table 3 shows the results on performance outcomes by medical school affiliation. Respondents from almost half of the medical schools (14/32 [44%]) reported significantly better grades on OSCEs, whereas grades on MCQ tests were significantly better at only one school (XV).

Table 3 Performance outcomes in OSCEs and MCQ tests by medical school affiliation

Discussion

In response to the amendment of the national licensure act, German medical schools have incorporated OSCEs into their assessment systems. This nationwide study sought to address how the introduction of OSCEs has affected the time students spend studying. We identified that students use different strategies to prepare for OSCE assessments than for the common MCQ tests. This finding was not surprising: when preparing for an assessment, students adapt their study behaviour (what and how they learn) to the assessment rather than to the learning objectives laid down in the curriculum. Both the content domains to be expected and the tasks required in the upcoming assessment influence student learning [20, 21], which has been described as the pre-assessment learning effect of assessment [22]. Given the tasks tested in the OSCEs (taking a history, examining a patient, or carrying out a procedure), we therefore expected that students would seek opportunities to rehearse the desired clinical skills. Although other authors have reported similar findings, they examined only one OSCE at a single institution and did not use a multi-centre approach [23, 24].

In conclusion, our findings indicate that the deployment of OSCEs has an impact on students’ learning behaviour. In agreement with previous studies [23,24,25], the assessment tool encourages students to acquire clinical skills in, for example, physical examination, practical procedures, or communication. Compared with MCQ tests, the assessment also appears to motivate students to focus more on studying in authentic learning environments and in the community, both of which have been regarded as important supports for learning [26, 27].

If students prepare for an OSCE “designed to assess certain competencies or skills” [28], as opposed to MCQs, which draw items from a much larger content domain, they would probably need less study time to achieve the required learning outcomes. Our findings confirmed this assumption for the first time. We found that, even though there were differences between schools, students spent less time preparing for an OSCE than for an MCQ test, and yet performed well.

There is evidence that scores achieved on the OSCE are strong predictors of later clinical performance [29, 30]. However, good performance on the OSCE does not necessarily mean a student will show the same level of competence or performance in the clinical workplace. The simulated environment in which OSCEs take place can influence performance. Thus, a student might perform worse when faced with unexpected, unusual circumstances in the real workplace [31]. It is important to keep this in mind when considering the (good) performance outcomes in OSCEs [28].

Limitations

Our study has several limitations related to its sample. First, because we chose a sampling design that addressed every individual of the studied population through advertisements, we could not trace how many of the eligible students we actually reached. Therefore, we can report neither a response nor a non-response rate. Second, we found an overrepresentation of respondents from particular medical schools in our sample, which might have skewed the results. This was probably caused by the varying degree to which the individual medical schools supported our study. Nevertheless, the demographic profile of our sample reflected the general make-up of the medical student population in Germany, and we had a sufficiently large sample size for our analysis.

Another limitation is that the study relied on a self-report instrument to determine study resources, studying time, and performance outcomes, which introduces potential bias. This needs to be considered when interpreting the results.

Conclusions

We conclude that the introduction of the OSCE assessment shifted the time students spend studying. In preparation for OSCEs, students focus their attention on acquiring the necessary clinical skills, and they need less study time to achieve the expected level of competence or performance compared with MCQ tests. This clearly confirms the value of adding the OSCE assessment to a testing programme, as it places the emphasis on the acquisition of practical skills in addition to knowledge.