Abstract
Purpose
The primary aim of this study was to investigate the effect of including the Dutch National Pharmacotherapy Assessment (DNPA) in the medical curriculum on the level and development of prescribing knowledge and skills of junior doctors. The secondary aim was to evaluate the relationship between the curriculum type and the prescribing competence of junior doctors.
Methods
We re-analysed the data of a longitudinal study conducted in 2016 involving recently graduated junior doctors from 11 medical schools across the Netherlands and Belgium. Participants completed three assessments during the first year after graduation (around graduation (± 4 weeks), 6 months after graduation, and 1 year after graduation), each containing 35 multiple choice questions (MCQs) assessing knowledge and three clinical case scenarios assessing skills. Only one medical school used the DNPA in its medical curriculum; the other medical schools used conventional means to assess prescribing knowledge and skills. Five medical schools were classified as providing solely theoretical clinical pharmacology and therapeutics (CPT) education; the others provided both theoretical and practical CPT education (mixed curriculum).
Results
Of the 1584 invited junior doctors, 556 (35.1%) participated; 326 (58.6%) completed the MCQs and 325 (58.5%) completed the clinical case scenarios in all three assessments. Junior doctors whose medical curriculum included the DNPA had higher knowledge scores than the other junior doctors (76.7% [SD 12.5] vs. 67.8% [SD 12.6], 81.8% [SD 11.1] vs. 76.1% [SD 11.1], and 77.0% [SD 12.1] vs. 70.6% [SD 14.0]; p < 0.05 for all three assessments). There was no difference in skills scores at the moment of graduation (p = 0.110), but after 6 and 12 months junior doctors whose medical curriculum included the DNPA had higher skills scores (both p < 0.001). Junior doctors educated with a mixed curriculum had significantly higher scores for both knowledge and skills than junior doctors educated with a theoretical curriculum (p < 0.05 in all assessments).
Conclusion
Our findings suggest that the inclusion of the knowledge-focused DNPA in the medical curriculum improves the prescribing knowledge, but not the skills, of junior doctors at the moment of graduation. However, after 6 and 12 months, both knowledge and skills were higher in the junior doctors whose medical curriculum included the DNPA. A curriculum that provides both theoretical and practical education seems to improve both prescribing knowledge and skills relative to a solely theoretical curriculum.
Introduction
A substantial proportion of European medical students lack adequate prescribing knowledge and skills at graduation, probably because, among other factors, they receive too little education in clinical pharmacology and therapeutics (CPT) during their undergraduate training [1, 2]. It is often assumed that the prescribing competence of these students will improve once they become junior doctors and gain clinical experience. However, we recently showed that the prescribing knowledge and skills of junior doctors (recent graduates) in the Netherlands and Flanders (Belgium) did not improve during the first year after graduation [3]. This is troubling because most hospital prescriptions (63–78%) are written by junior doctors, who also make the most prescribing errors (9–10% of all their prescriptions) [4,5,6]. This poor prescribing unfavourably affects patient safety, treatment effectiveness, and healthcare costs [4, 7, 8].
Adequate prescribing competence of graduating medical students stands high on the agenda of (inter)national societies, such as the European Association for Clinical Pharmacology and Therapeutics (EACPT) and the Dutch Society for Clinical Pharmacology and Biopharmacy [9, 10]. A final assessment of prescribing competence could be a first step to ensure that medical students have acquired sufficient prescribing knowledge and skills before graduation. Moreover, such an assessment could steer the teaching and training in CPT in the desired direction [11, 12]. The European Prescribing Exam (EuroPE+) was developed for this purpose and has been distributed and used among European medical schools [13]. In addition, the Prescribing Safety Assessment (PSA) has been developed in the UK and the Dutch National Pharmacotherapy Assessment (DNPA) in the Netherlands [10, 14,15,16]. The DNPA, developed in 2014 by the Dutch Society for Clinical Pharmacology and Biopharmacy, consists of 60 multiple choice questions (MCQs) focusing on prescribing safety, in particular ready knowledge (e.g. the mechanisms of action, clinically relevant side effects, and contraindications) of the drugs responsible for the majority of medication-related harm and hospital admissions [10, 16]. However, it is currently not known what effect a national prescribing safety examination has on the level and development of prescribing knowledge and skills of junior doctors in the year after graduation.
Besides a final prescribing safety assessment, another way to improve CPT education is to provide a combination of theoretical (e.g. lectures, seminars, self-study, written exams) and practical (e.g. clinics, bedside teaching, prescribing for real patients) teaching in medical curricula. Studies have shown that medical students who have followed problem-based learning, which is generally more embedded in practical learning, have better prescribing knowledge and skills [17,18,19]. Moreover, enriching the learning context with real patients has been shown to improve students’ prescription-writing skills [20]. At the moment, all Dutch and Flemish (Belgian) medical schools have mixed learning curricula. However, little is known about the effect of different CPT curricula on the prescribing competence of junior doctors.
The main aim of this study was to investigate the effect of including the Dutch National Pharmacotherapy Assessment in the medical curriculum on the level and development of prescribing knowledge and skills of junior doctors. The second aim was to evaluate the relationship between the curriculum type and the prescribing competence of junior doctors. We hypothesised that the knowledge-based DNPA would improve the knowledge but not the skills of junior doctors, and that combined theoretical and practical teaching during undergraduate education delivers more knowledgeable and more skilful junior doctors.
Methods
Study design and participants
This study is a sub-analysis of data from a longitudinal prospective cohort study assessing the knowledge and skills of recently graduated junior doctors from 11 medical schools in the Netherlands (n = 8) and Belgium (n = 3) at three moments in their first year after graduation: around graduation (± 4 weeks) (assessment 1), 6 months after graduation (assessment 2), and 1 year after graduation (assessment 3) [3]. All 1584 graduating medical students (July 2016–March 2017) were invited to participate in this study. During this period, one medical school (school X) implemented the DNPA as a summative examination in the 5th year of its undergraduate curriculum, whereas the other medical schools (numbered 1–10) used other assessments only (e.g. pharmacotherapy questions integrated in large medical exams). To establish the type of curriculum of the medical schools in the period before 2016, we used raw data from a previously published study [21]. Curricula were classified as ‘theoretical’ when CPT was taught by means of lectures, self-study, and working groups, and as ‘practical’ when CPT education was provided during clinics, with bedside teaching and prescribing for real patients. Integration of both types of teaching was classified as a ‘mixed’ curriculum.
Permission for the study in the participating medical schools was granted by the Ethics Review Board of the Netherlands Association of Medical Education (NVMO-ERB 729). The study was funded by ZonMw (The Dutch Organisation for Health Research and Development), project no. 83600095004. Participants provided written informed consent prior to participation and were compensated with a 50-euro voucher for their time.
Design, validity, and reliability of assessment tool
The assessment tool has been described in detail elsewhere [3]. In short, each assessment contained 35 multiple choice questions (MCQs) to assess prescribing knowledge, focusing on medication safety (i.e. factual drug knowledge such as contraindications and interactions). These questions were extracted from the DNPA database [10, 16] and differed from the questions used in the summative examination taken by the junior doctors who graduated from school X. The MCQs covered seven topics: (1) analgesics, (2) anticoagulants, (3) antibiotics, (4) cardiovascular drugs, (5) antidiabetics, (6) psychotropics, and (7) basic pharmacokinetics and drug calculations. Furthermore, to assess prescribing skills (i.e. rational prescribing), each assessment contained three clinical polypharmacy case scenarios (about renal impairment, anticoagulants, and pain management), set up by a group of clinically active senior clinical pharmacologists and medical specialists (e.g. internist, surgeon, and general practitioner) from all participating medical schools. Each case required its own treatment plan, including a (non)pharmacological policy and follow-up management. In general, each case required two main additions/alterations (e.g. starting pain treatment and changing a medication because of a clinically relevant drug-drug interaction) and one to three minor alterations (e.g. correcting the timing of drug administration).
The MCQs and the clinical case scenarios had high content validity, with 75.8% and 72.7% of all knowledge and skills questions, respectively, being rated as ‘essential’ by clinical pharmacologists not involved in this study [3, 16]. Reliability tests showed sufficient internal consistency for all three assessments (Cronbach’s alpha of 0.70, 0.69, and 0.76, respectively) and poor to adequate ability to distinguish good from poor students (Rir scores; range − 0.02–0.46). The latter is inherent to examining ready knowledge (i.e. easily accessible information for immediate use or application). The MCQs of assessment 2 appeared to be easier than those of assessments 1 and 3, as became apparent from a control group [3].
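For readers less familiar with these reliability statistics, the sketch below illustrates how Cronbach’s alpha and an item-rest (Rir) correlation can be computed from a participants-by-items score matrix. This is a minimal illustration only; the function and variable names are our assumptions and not taken from the study’s analysis code.

```python
import numpy as np

def cronbach_alpha(item_scores: np.ndarray) -> float:
    """Cronbach's alpha for a (participants x items) matrix of item scores."""
    k = item_scores.shape[1]
    item_variances = item_scores.var(axis=0, ddof=1)        # variance of each item
    total_variance = item_scores.sum(axis=1).var(ddof=1)    # variance of the total score
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

def item_rest_correlation(item_scores: np.ndarray, item: int) -> float:
    """Rir score: correlation of one item with the rest score (total minus that item)."""
    rest = item_scores.sum(axis=1) - item_scores[:, item]
    return float(np.corrcoef(item_scores[:, item], rest)[0, 1])
```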
Data collection
A local coordinator was appointed at each medical school to invite all (nearly) graduated students to participate in this study. The assessments were completed online and remotely, without surveillance, and each took about 60 min to complete (although there was no time limit). When necessary, reminders were sent out after 1 or 2 weeks. External resources such as formularies were not allowed for the knowledge part but were allowed for the skills part; third parties were not allowed to be consulted. The collected data of all participants were encrypted and anonymized.
Scoring and data analysis
For the knowledge part, the MCQs were marked as either correct or incorrect. Scores are given as a percentage of the maximum correct score. Since the examination assesses ready knowledge, on which students are expected to perform well, we used a pass grade of 85% correct answers, as used by the DNPA and EuroPE+ [10, 16].
For the skills part, the clinical cases were independently scored by two investigators (ED (clinical pharmacologist in training) and DB (clinical pharmacologist)), blinded to participant information, using an answer grading rubric based on national guidelines [3]. In case of discrepancy, the expert group involved in creating the questions was consulted to reach consensus. Three aspects were scored: pharmacological policy, non-pharmacological policy, and follow-up management. Each was scored as insufficient (0 points), sufficient (1 point), or good (2 points). The total score determined the classification of the treatment plan: insufficient (0–1 points), sufficient (2–3 points), or good (4–6 points).
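As a concrete illustration of the scoring rules described above, the sketch below implements the 85% pass grade for the knowledge part and the three-aspect classification of a treatment plan. All names are illustrative assumptions, not the study’s own scoring code.

```python
PASS_GRADE = 0.85  # pass grade for the knowledge part (85% correct answers)

def knowledge_score(n_correct: int, n_questions: int = 35) -> float:
    """Knowledge score as a percentage of the maximum correct score."""
    return 100 * n_correct / n_questions

def passed_knowledge(n_correct: int, n_questions: int = 35) -> bool:
    """A participant passes the knowledge part at >= 85% correct answers."""
    return n_correct / n_questions >= PASS_GRADE

def classify_treatment_plan(pharmacological: int,
                            non_pharmacological: int,
                            follow_up: int) -> str:
    """Classify one clinical case from its three aspect scores (0, 1, or 2 points each)."""
    total = pharmacological + non_pharmacological + follow_up  # total ranges from 0 to 6
    if total <= 1:
        return "insufficient"
    if total <= 3:
        return "sufficient"
    return "good"

# Example: aspects scored 1, 2, and 0 give a total of 3 points -> "sufficient"
print(classify_treatment_plan(1, 2, 0))
```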
For all longitudinal data, we used linear mixed models (continuous data) or generalized estimating equations (ordinal data) to assess the differences in scores. To assess differences within one assessment, we performed chi-square tests, independent t-tests, or ANOVA (with post-hoc Tukey HSD), as appropriate. Analyses were performed in SPSS 26.0 (IBM Corp., Armonk, NY, USA) and Stata version 14 (StataCorp LLC, 2020).
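The study’s models were fitted in SPSS and Stata; purely as an illustration, the Python/statsmodels sketch below shows the kind of linear mixed model described above for the continuous knowledge scores. The file and column names are assumptions, and the ordinal skills outcomes would instead be analysed with generalized estimating equations (not shown).

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format data: one row per participant per assessment, with columns
# participant_id, assessment (1-3), dnpa (0/1), curriculum ("theoretical"/"mixed"),
# and knowledge_score (percentage correct).
df = pd.read_csv("assessment_scores_long.csv")

# Linear mixed model: fixed effects for assessment moment, DNPA exposure, and their
# interaction (which tests whether the development of scores differs between groups),
# with a random intercept per participant to account for repeated measurements.
model = smf.mixedlm(
    "knowledge_score ~ C(assessment) * dnpa",
    data=df,
    groups="participant_id",
).fit()
print(model.summary())
```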
Results
Of the 1584 invited junior doctors, 556 (35.1%) participated; 326 (58.6%) completed the MCQs and 325 (58.5%) completed the clinical case scenarios in all three assessments. On the basis of the prespecified criteria, five curricula were classified as theoretical and six as mixed. Demographic information is given in Table 1.
Knowledge
Junior doctors with the DNPA in their curriculum outperformed the other junior doctors (76.7 ± 12.5% vs. 67.8 ± 12.6%, 81.8 ± 11.1% vs. 76.1 ± 11.1%, and 77.0 ± 12.1% vs. 70.6 ± 14.0%; p < 0.05 for all three assessments) (Fig. 1 and Supplementary Table 1). Likewise, junior doctors taught with a mixed curriculum outperformed junior doctors taught with a theoretical curriculum (72.0 ± 13.2% vs. 66.3 ± 12.1%, 79.3 ± 11.2% vs. 74.5 ± 11.0%, and 73.2 ± 13.4% vs. 70.0 ± 14.2%; p < 0.05 for all three assessments) (Fig. 1 and Supplementary Table 1). On average, junior doctors who had taken the DNPA during their medical education and junior doctors taught with a mixed curriculum were more likely to pass the assessments than the other junior doctors (36.8% vs. 17.6% and 26.3% vs. 14.7%, respectively) (Supplementary Table 2). Nonetheless, the average score of all subgroups in each assessment was below the predefined pass grade of 85%. The comparison between school X and the individual medical schools can be found in Supplementary Table 3 and Supplementary Fig. 1.
In general, linear mixed modelling revealed no significant differences in the development of knowledge scores between junior doctors who graduated with or without the DNPA in their curriculum, or between junior doctors taught with a theoretical or a mixed curriculum (p = 0.10 and p = 0.11, respectively). However, when comparing individual assessments, junior doctors taught with a theoretical curriculum showed a greater improvement in prescribing knowledge after 1 year (increase of 3.7% from baseline) than junior doctors taught with a mixed curriculum (increase of 1.2% from baseline, p = 0.038).
Skills
At graduation (assessment 1), there was no difference in skills scores between the junior doctors with the DNPA in their curriculum and the other junior doctors (p = 0.110), but in assessments 2 and 3, the junior doctors with the DNPA in their curriculum had significantly higher scores (p = 0.001 for both) (Fig. 2). Junior doctors taught with a mixed curriculum outperformed junior doctors taught with a theoretical curriculum in all three assessments (p < 0.05) (Fig. 3). This could mainly be ascribed to a difference in the number of treatment plans scored as ‘good’. Nonetheless, skills deteriorated after graduation, regardless of where junior doctors had graduated. The comparison between all individual medical schools can be found in Supplementary Fig. 2.
Generalized estimating equations showed that, over time, there was less deterioration in prescribing skills in the junior doctors with the DNPA in their curriculum and in the junior doctors taught with a mixed curriculum compared with the other junior doctors (both p < 0.001).
Discussion
This study suggests that implementing the Dutch National Pharmacotherapy Assessment improves the prescribing knowledge but not necessarily the prescribing skills of junior doctors at graduation. Moreover, including practical CPT education in the medical curriculum is associated with more knowledgeable and skilful junior doctors. Nevertheless, the average prescribing knowledge and skills of the junior doctors in our study were below the predefined level of 85% correct answers and did not improve in the year after graduation.
Assessment is an important component of a medical curriculum. The aim of assessment is, among other things, to evaluate whether students meet predetermined learning objectives. A plausible reason why the DNPA led only to better prescribing knowledge and not to better prescribing skills at graduation is that the DNPA contains questions that assess knowledge-level learning objectives. In this study, we also assessed skills-level learning objectives by evaluating junior doctors’ ability to set up treatment plans for clinical polypharmacy case scenarios. Skills are typically learned during practical education and, indeed, our study shows that a curriculum that includes practical education leads to better prescribing knowledge and skills. However, in assessments 2 and 3, junior doctors who had taken the DNPA during their training outperformed the others. This suggests that the DNPA improves prescribing knowledge, which is essential for enhancing prescribing skills in clinical practice. This hypothesis is supported by the finding that junior doctors with the DNPA in their curriculum mainly performed better than junior doctors taught solely with theoretical education (Supplementary Fig. 2). Of course, it could be questioned whether the differences in prescribing knowledge are clinically relevant. However, on average, junior doctors who had taken the DNPA during their medical education and junior doctors taught with a mixed curriculum were more likely to pass the assessments than the other junior doctors. Despite the improved prescribing knowledge of junior doctors who had taken the DNPA during training and the improved prescribing knowledge and skills of junior doctors who had followed more practical education, the overall prescribing competence of the participants was still insufficient. Therefore, as recommended by the Joint British Pharmacological Society and EACPT (BPS/EACPT) Working Group on Safe Prescribing in 2007 and later by the EACPT Education Working Group in 2018, both undergraduate and postgraduate education and training in CPT must be intensified, modernized, and harmonized [22, 23].
To assess whether future doctors have acquired sufficient prescribing skills, national prescribing safety assessments could incorporate more skills-type questions, such as prescribing for clinical case scenarios or performing medication reviews. The Prescribing Safety Assessment (PSA) in the UK and the European Prescribing Exam (EuroPE+) already include such skills questions [13,14,15], but it is not known whether this leads to more skilful junior doctors after graduation.
The PSA is mandatory in most UK medical schools, and Foundation Year 1 doctors are required to pass the exam in order to progress to year 2. The EuroPE+ is currently being piloted as a formative assessment in twelve European medical schools, with the aim of incorporating the examination in all European medical schools as a summative exam for all penultimate- or final-year medical students [13]. Currently, the DNPA is used as a summative assessment by seven medical schools in the Netherlands and as a formative assessment in one medical school. It is not known which manner of assessment is more effective. This question is becoming increasingly relevant as several medical schools in the Netherlands are switching to ‘programmatic assessment’. In programmatic assessment curricula, information about the student’s learning process is continuously collected via so-called data points (e.g. feedback, exams, objective structured clinical examinations (OSCEs)) [24]. This allows teachers to monitor and adjust the student’s learning process. An important difference from conventional curricula is that it is not the individual exam but the collection of data points over a longer period that is assessed ‘summatively’ by an assessment committee [24]. One could argue that assessing prescribing knowledge and skills is of such importance that it should always be graded by summative assessment. On the other hand, formative assessments are more appropriate to drive learning [25,26,27]. A study comparing results between medical schools with summative or formative assessments might resolve this question.
There are several possible reasons why junior doctors who had the DNPA in their medical curriculum performed better than the other junior doctors in terms of prescribing knowledge but not prescribing skills. First, the DNPA was used in only one medical school, and it is possible that this school had a more effectively integrated CPT curriculum and teaching programme (constructive alignment) than the other medical schools. Second, there were only 57 junior doctors from this medical school compared with 269 junior doctors from the other medical schools, which may have led to differences in demographics. In our earlier study involving the same cohort, we found that non-surgical junior doctors outperformed surgical doctors [3]. However, there were similar proportions of surgical and non-surgical junior doctors among graduates from medical school X and the other medical schools, although there were fewer research physicians (3.5% vs. 11.5%, Supplementary Tables 4 and 5). We previously found that research physicians underperformed in the knowledge part compared with non-registrars and registrars in assessments 2 and 3 [3].
Our study also confirms that the prescribing knowledge and skills of junior doctors are insufficient, since the majority of the participants did not pass the assessments [28,29,30]. As discussed elsewhere, the majority of the junior doctors worked in clinical practice (86.4%), and thus the assessed topics should have been familiar and well known [3]. We believe, like the BPS/EACPT Working Group on Safe Prescribing and Jansen et al. [16, 23], that all junior doctors should have broad knowledge of, and skills in, the medicines that are frequently prescribed and associated with medication-related harm. The first assessment was right after graduation, and the junior doctors took their time (62 min (interquartile range 46–92)), so their poor performance cannot be ascribed to poor retention of knowledge and skills (which usually lasts about 2 years [31]) or to negligence.
To contextualize the findings more broadly, this study suggests that European medical schools could benefit from implementing a final assessment focused on prescribing. A standardized European assessment, such as the European Prescribing Exam, which is grounded in consensus studies on key learning outcomes and on essential medicines and diseases relevant to prescribing, could serve not only to harmonize CPT education across Europe, but also to enhance the prescribing competence of future medical professionals [13].
Strengths and limitations
To our knowledge, this is the first longitudinal, international, multicentre study to investigate the effect of a national prescribing safety assessment and type of curriculum on the prescribing competence of junior doctors working in clinical practice during the year after graduation. However, there are a number of limitations to take into consideration when interpreting the results. First, only one medical school used the DNPA in its curriculum at the moment we conducted this study. This uneven distribution makes it difficult to generalize the results, especially because confounders such as the type of curriculum could not be tested. Second, the participants who completed the full study were possibly more interested, and therefore probably more competent, in CPT. This type of selection bias could have led to an overestimation of the true competence and indicates that merely introducing a CPT assessment or practical learning is not sufficient to resolve the problem of the poor prescribing knowledge and skills of junior doctors. The same applies to the fact that the participants were not proctored during the tests: they might have used resources for the knowledge part or consulted colleagues. Although this is also true for clinical practice, where doctors can consult co-workers or formularies, the results might be an overestimation of the true prescribing knowledge and skills. Third, the MCQs were extracted from the DNPA database. Even though all questions were different from those of the DNPA used in medical school X, the better score of this school might in part be because its junior doctors were more familiar with the type of questions and the knowledge assessed. However, the MCQs were constructed in a simple way [3], and the knowledge assessed was considered ready knowledge, i.e. knowledge all junior doctors should have. Fourth, we could only distinguish between the types of curricula using quantitative data. Not only the type of teaching and training influences learning, but also how a teacher works, his or her ability to convey the subject matter, and the time invested in it. Fifth, many aspects can influence a person’s knowledge and/or skills over time, for example the ward or hospital where one works, or access to continuing education. However, our international and multicentre study design probably diminished this type of bias.
Conclusion
Optimizing and maintaining prescribing knowledge and skills from the start of a doctor’s medical career is an important step towards prescribing safety. Our study shows that the inclusion of the knowledge-focused Dutch National Pharmacotherapy Assessment in the medical curriculum might improve the prescribing knowledge, but not the skills, of junior doctors at the moment of graduation. However, after 6 and 12 months, both knowledge and skills were better in the junior doctors whose medical curriculum included the DNPA. Additional studies are needed to confirm this. Moreover, a curriculum with more practical CPT education might improve prescribing knowledge and skills compared with mainly theoretical teaching. To improve the prescribing competence of future junior doctors, we recommend that a skills part be added to (national) prescribing safety assessments and that more practical teaching be incorporated in existing CPT curricula. Moreover, continuous education in CPT for junior doctors should be developed and implemented.
Data availability
The data that support the findings of this study are available directly after publication from the corresponding author upon reasonable request. Proposals may be submitted up to 24 months following the article submission. The data will be shared after de-identification.
Code availability
Not applicable.
References
Brinkman DJ, Tichelaar J, Graaf S, Otten RHJ, Richir MC, van Agtmael MA (2018) Do final-year medical students have sufficient prescribing competencies? A systematic literature review. Br J Clin Pharmacol 84(4):615–635. https://doi.org/10.1111/bcp.13491
Brinkman DJ, Tichelaar J, Schutte T, Benemei S, Bottiger Y, Chamontin B, Christiaens T, Likic R, Maciulaitis R, Marandi T, Monteiro EC, Papaioannidou P, Pers YM, Pontes C, Raskovic A, Regenthal R, Sanz EJ, Tamba BI, Wilson K, Vries T, Richir MC, Agtmael MV, Working Group Research on CPT Education of the European Association for Clinical Pharmacology and Therapeutics (2017) Essential competencies in prescribing: a first European cross-sectional study among 895 final-year medical students. Clin Pharmacol Ther 101(2):281–289. https://doi.org/10.1002/cpt.521
Donker EM, Brinkman DJ, van Rosse F, Janssen B, Knol W, Dumont G, Jorens PG, Dupont A, Christiaens T, van Smeden J, de Waard-Siebinga I, Peeters LEJ, Goorden R, Hessel M, Lissenberg-Witte B, Richir M, van Agtmael MA, Kramers C, Tichelaar J (2022) Do we become better prescribers after graduation: a 1-year international follow-up study among junior doctors. Br J Clin Pharmacol. https://doi.org/10.1111/bcp.15443
Dornan T, Ashcroft D, Heathfield H, Lewis P, Miles J, Taylor D, Tully M, Wass V (2009) An in-depth investigation into causes of prescribing errors by foundation trainees in relation to their medical education: EQUIP study
Ryan C, Ross S, Davey P, Duncan EM, Francis JJ, Fielding S, Johnston M, Ker J, Lee AJ, MacLeod MJ, Maxwell S, McKay GA, McLay JS, Webb DJ, Bond C (2014) Prevalence and causes of prescribing errors: the PRescribing Outcomes for Trainee Doctors Engaged in Clinical Training (PROTECT) study. PLoS ONE 9(1):e79802. https://doi.org/10.1371/journal.pone.0079802
Ashcroft DM, Lewis PJ, Tully MP, Farragher TM, Taylor D, Wass V, Williams SD, Dornan T (2015) Prevalence, nature, severity and risk factors for prescribing errors in hospital inpatients: prospective study in 20 UK hospitals. Drug Saf 38(9):833–843. https://doi.org/10.1007/s40264-015-0320-x
Dean B, Schachter M, Vincent C, Barber N (2002) Causes of prescribing errors in hospital inpatients: a prospective study. Lancet 359(9315):1373–1378. https://doi.org/10.1016/S0140-6736(02)08350-2
Leendertse AJ, Egberts AC, Stoker LJ, van den Bemt PM, HARM Study Group (2008) Frequency of and risk factors for preventable medication-related hospital admissions in the Netherlands. Arch Intern Med 168(17):1890–1896. https://doi.org/10.1001/archinternmed.2008.3
Coleman JJ, Samer C, Zeitlinger M, van Agtmael M, Rongen GA, Marquet P, Simon T, Singer D, Manolopoulos VG, Böttiger Y (2019) The European Association for Clinical Pharmacology and Therapeutics-25 years’ young and going strong. Eur J Clin Pharmacol 75(6):743–750. https://doi.org/10.1007/s00228-019-02690-5
Kramers C, Janssen BJ, Knol W, Hessel MHM, Mulder WM, Dumont G, Maassen van den Brink A, Tichelaar J (2017) A licence to prescribe. Br J Clin Pharmacol 83(8):1860–1861. https://doi.org/10.1111/bcp.13257
Wormald BW, Schoeman S, Somasunderam A, Penn M (2009) Assessment drives learning: an unavoidable truth? Anat Sci Educ 2(5):199–204. https://doi.org/10.1002/ase.102
Wood T (2009) Assessment not only drives learning, it may also help learning. Med Educ 43(1):5–6. https://doi.org/10.1111/j.1365-2923.2008.03237.x
Donker EM, Brinkman DJ, Richir MC, Papaioannidou P, Likic R, Sanz EJ, Christiaens T, Costa JN, De Ponti F, Böttiger Y, Kramers C, van Agtmael MA, Tichelaar J (2022) The European Prescribing Exam: assessing whether European medical students can prescribe rationally and safely. Eur J Clin Pharmacol 78(6):1049–1051. https://doi.org/10.1007/s00228-022-03301-6
Maxwell SRJ, Cameron IT, Webb DJ (2015) Prescribing safety: ensuring that new graduates are prepared. Lancet 385(9968):579–581. https://doi.org/10.1016/S0140-6736(14)62339-4
Maxwell SRJ, Coleman JJ, Bollington L, Taylor C, Webb DJ (2017) Prescribing Safety Assessment 2016: delivery of a national prescribing assessment to 7343 UK final-year medical students. Br J Clin Pharmacol 83(10):2249–2258. https://doi.org/10.1111/bcp.13319
Jansen DRM, Keijsers C, Kornelissen MO, Olde Rikkert MGM, Kramers C (2019) Towards a “prescribing license” for medical students: development and quality evaluation of an assessment for safe prescribing. Eur J Clin Pharmacol. https://doi.org/10.1007/s00228-019-02686-1
Brinkman DJ, Monteiro T, Monteiro EC, Richir MC, van Agtmael MA, Tichelaar J (2021) Switching from a traditional undergraduate programme in (clinical) pharmacology and therapeutics to a problem-based learning programme. Eur J Clin Pharmacol 77(3):421–429. https://doi.org/10.1007/s00228-020-03027-3
World Health Organization (2001) Teacher’s guide to good prescribing. Hogerzeil HV (ed) et al. World Health Organization
Brinkman DJ, Tichelaar J, Schutte T, Benemei S, Bottiger Y, Chamontin B, Christiaens T, Likic R, Maciulaitis R, Marandi T, Monteiro EC, Papaioannidou P, Pers YM, Pontes C, Raskovic A, Regenthal R, Sanz EJ, Tamba BI, Wilson K, Vries T, Richir MC, Agtmael MV (2017) Essential competencies in prescribing: a first European cross-sectional study among 895 final-year medical students. Clin Pharmacol Ther 101(2):281–289. https://doi.org/10.1002/cpt.521
Tichelaar J, van Kan C, van Unen RJ, Schneider AJ, van Agtmael MA, de Vries TP, Richir MC (2015) The effect of different levels of realism of context learning on the prescribing competencies of medical students during the clinical clerkship in internal medicine: an exploratory study. Eur J Clin Pharmacol 71(2):237–242. https://doi.org/10.1007/s00228-014-1790-y
Brinkman DJ, Tichelaar J, Okorie M, Bissell L, Christiaens T, Likic R, Maciulaitis R, Costa J, Sanz EJ, Tamba BI, Maxwell SR, Richir MC, van Agtmael MA (2017) Pharmacology and therapeutics education in the european union needs harmonization and modernization: a cross-sectional survey among 185 medical schools in 27 countries. Clin Pharmacol Ther 102(5):815–822. https://doi.org/10.1002/cpt.682
Brinkman DJ, Tichelaar J, Mokkink LB, Christiaens T, Likic R, Maciulaitis R, Costa J, Sanz EJ, Maxwell SR, Richir MC, van Agtmael MA, Education Working Group of the European Association for Clinical Pharmacology and Therapeutics and its affiliated Network of Teachers in Pharmacotherapy (2018) Key learning outcomes for clinical pharmacology and therapeutics education in Europe: a modified Delphi study. Clin Pharmacol Ther 104(2):317–325. https://doi.org/10.1002/cpt.962
Maxwell SR, Cascorbi I, Orme M, Webb DJ (2007) Educating European (junior) doctors for safe prescribing. Basic Clin Pharmacol Toxicol 101(6):395–400. https://doi.org/10.1111/j.1742-7843.2007.00141.x
van der Vleuten CP, Schuwirth LW, Driessen EW, Dijkstra J, Tigelaar D, Baartman LK, van Tartwijk J (2012) A model for programmatic assessment fit for purpose. Med Teach 34(3):205–214. https://doi.org/10.3109/0142159x.2012.652239
Schuwirth L, van der Vleuten C (2004) Merging views on assessment. Med Educ 38(12):1208–1210. https://doi.org/10.1111/j.1365-2929.2004.02055.x
Epstein RM (2007) Assessment in medical education. N Engl J Med 356(4):387–396. https://doi.org/10.1056/NEJMra054784
Abu-Zaid A (2013) Formative assessments in medical education: a medical graduate’s perspective. Perspect Med Educ 2(5–6):358–359. https://doi.org/10.1007/s40037-013-0089-5
Maxwell SR, Cascorbi I, Orme M, Webb DJ, Joint BPS/EACPT Working Group on Safe Prescribing (2007) Educating European (junior) doctors for safe prescribing. Basic Clin Pharmacol Toxicol 101(6):395–400. https://doi.org/10.1111/j.1742-7843.2007.00141.x
Harding S, Britten N, Bristow D (2010) The performance of junior doctors in applying clinical pharmacology knowledge and prescribing skills to standardized clinical cases. Br J Clin Pharmacol 69(6):598–606. https://doi.org/10.1111/j.1365-2125.2010.03645.x
Starmer K, Sinnott M, Shaban R, Donegan E, Kapitzke D (2013) Blind prescribing: a study of junior doctors’ prescribing preparedness in an Australian emergency department. Emerg Med Australas 25(2):147–153. https://doi.org/10.1111/1742-6723.12061
Custers EJ, Ten Cate OT (2011) Very long-term retention of basic science knowledge in doctors after graduation. Med Educ 45(4):422–430. https://doi.org/10.1111/j.1365-2923.2010.03889.x
Contributions
ED, DB, FR, BJ, WK, GD, PJ, AD, TC, JvS, IdWS, LP, RG, MH, MR, MvA, CK and JT contributed to study design and data collection. ED, HO, DB, MR and JT contributed to writing of the report. ED, HO and BLW contributed to the data analysis. All authors contributed to data interpretation and approved the final version of the submitted report.
Ethics declarations
Ethics approval
Permission for the study in all participating medical schools was granted by the Ethical Review Board of the Netherlands Association of Medical Education (NVMO-ERB 729).
Consent to participate
Participants provided written informed consent prior to participation.
Consent for publication
Not applicable.
Conflict of interest
All authors declare that they had no financial support and no financial relationships with any organizations that might have an interest in the submitted work in the previous 3 years. JvS, MvA, and JT are members of the Pharmacotherapy Education working group of the Dutch Society for Clinical Pharmacology & Biopharmacy. MvA and JT are members of the Education working group of the European Association for Clinical Pharmacology and Therapeutics. There are no other relationships or activities that could appear to have influenced the submitted work.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.