Introduction

Research training is an important component of medical education worldwide. Teaching scientific skills and knowledge is part of all medical curricula, although the content varies greatly. In general, students learn to critically read and evaluate articles, receive training in evidence-based medicine (EBM) approaches, become acquainted with the basics of the ‘empirical cycle’, and carry out a (compulsory) research project. All these skills are of major importance for functioning as a good physician: doctors should be aware of the latest developments in their field and should use up-to-date knowledge to justify their medical decisions. Integrating individual clinical expertise with external clinical evidence from research is of vital importance.1

In addition, there is growing interest in providing extra research training to motivated medical students.2 Besides the regular basic science training, all eight University Medical Centers (UMCs) in the Netherlands offer, or have recently started, additional educational programs for highly motivated students. These comprise research master programs, research master programs for physician-clinical researchers, and ‘Honours’ and MD/PhD programs,3 and generally focus on the recognition and cultivation of academic talent. This is in line with international trends, for example in the USA4 and in Norway.5

Scientific training of medical students, comprising both scientific skills and knowledge and research training, is labour intensive and costly, and usually involves small group education or individual training. It is therefore important to assess the output of these efforts, i.e., to what extent medical students are scientifically active and whether they publish their articles in peer-reviewed medical journals. Using a modified version of a well-validated method for bibliometrical evaluation of research groups and scientific institutes6, we assessed scientific output of medical students in the Netherlands.

Setting

In the Netherlands, all UMCs provide undergraduate medical training. All medical curricula conform to the educational objectives set in ‘Blueprint 2001: training of doctors in the Netherlands’.7 The objectives specify common educational requirements, which leave individual medical schools free to design their own educational program and their own ‘desired’ profile.8 These specified common requirements guarantee that all medical schools meet a minimal educational standard.

After finishing secondary school, students begin medical school at approximately 18 years of age. In general, two-thirds of the six-year undergraduate medical program consists of multidisciplinary modules based on thematic topics. These may be problem-based modules or some other form of teaching model. About one-third of the six-year curriculum is spent in a clinical setting, comprising 80–90 weeks of internships in various clinical wards and extramural settings. Both UMCs and regionally affiliated hospitals accommodate interns.9 There is no general exit exam, and approximately 90% of all students entering medical school eventually graduate.10 Each medical school devotes approximately 50–80 European Credits (EC) to scientific education, of which approximately 20–40 EC is spent on an individual full-time student research project of 14–27 weeks somewhere in years 4–6 (Table 1).

Table 1. Scientific education within the medical curricula of six University Medical Centers in the Netherlands.

Methods

In the Netherlands, each UMC is represented in the working group on scientific education of the NVMO (the Netherlands Association for Medical Education) by at least one contact person. These contact persons were asked to send lists with the names of students who received their medical degree in 2006 or 2007. Data were provided by six of the eight UMCs: Erasmus University Medical Center, Rotterdam (EUMC), Leiden University Medical Center (LUMC), Radboud University Nijmegen Medical Centre, Nijmegen (RUNMC), University Medical Center Groningen (UMCG), University Medical Center Utrecht (UMCU), and VU University Medical Center Amsterdam (VUMC). These lists were then sent to the Centre for Science and Technology Studies (CWTS) for bibliometrical evaluation.

CWTS data collection steps

In 2006 and 2007, the total number of students who received their medical degree at one of the six participating UMCs was 2963. CWTS searched its in-house version of the Web of Science database for publications by these students that appeared in print during the last three years before graduation. This special version of the database is designed and constructed for optimal performance in bibliometric studies: it has a good citation-linking algorithm, contains cleaned-up addresses of scientific organizations worldwide, is linked to software routines that generate standard bibliometric indicators, and contains search routines for individual scientists and their research output.6 These latter routines were used to collect the publications for the supplied lists of students.

On the basis of author name similarity and address clustering, the most likely ‘candidate publications’ were selected for each name. Only research papers, reviews, and letters were considered; publications in e-journals, books, and book chapters were excluded, as were papers published in journals for which no citation data are available or that are not assigned to any Thomson Scientific journal category (‘field’). As a result, papers in Dutch journals were usually not included. Publications resulting from research projects abroad were included only when there was a clear connection with the student's home university or the name of the student was clearly identifiable. Only publications from the three years prior to graduation were included, as we assumed that students do not publish during the first three years of their studies. The date of formal publication ‘on paper’ was used, not the e-publication date. This resulted in a list of publications per individual. In the verification process at CWTS, publications that had been missed due to homonyms or synonyms were added, and publications that had been wrongly attributed to an oeuvre for lack of specific information were deleted. The latter typically happens with frequently occurring names, such as Smith or Jansen, combined with only one initial.11 After CWTS collected the publication data for all students, the data per institute were verified by one of the contact persons of the respective institute.
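The selection step described above can be sketched in code. The actual CWTS routines are proprietary and far more sophisticated; the function names, similarity threshold, and data layout below are illustrative assumptions only, intended to show the general idea of combining name similarity with an institutional address match.

```python
from difflib import SequenceMatcher

# Hypothetical sketch of candidate-publication selection (not the CWTS code):
# a publication is kept only if an author name is sufficiently similar to the
# student's name AND an author address mentions the student's home institution.

def name_similarity(a: str, b: str) -> float:
    """Crude name similarity score in [0, 1]."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def candidate_publications(student, publications, name_threshold=0.85):
    hits = []
    for pub in publications:
        name_match = any(
            name_similarity(student["name"], author) >= name_threshold
            for author in pub["authors"]
        )
        address_match = any(
            student["institution"].lower() in addr.lower()
            for addr in pub["addresses"]
        )
        if name_match and address_match:
            hits.append(pub)
    return hits

student = {"name": "Jansen, J", "institution": "UMCG"}
pubs = [
    {"authors": ["Jansen, J", "Smith, A"],
     "addresses": ["UMCG, Groningen, Netherlands"]},
    {"authors": ["Jansen, J"],
     "addresses": ["Karolinska Institutet, Stockholm"]},  # no home-address link
]
print(len(candidate_publications(student, pubs)))  # → 1
```

The sketch also illustrates why projects abroad were hard to attribute: the second publication is dropped because no address links it to the home university, exactly the situation the text notes as a source of underestimation.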

The contact persons were asked to check students who published three or more articles before graduation, to assess whether the author really was a student from the list or another person from the same university with the same name and initials. Finally, the contact persons were asked to report any publications they happened to notice that were missed by CWTS but that they knew were written by one of their students. All corrections from the contact persons were received and validated using Web of Science and PubMed. The lists of corrected publications were used to compile an overview of bibliometric indicators per institute.

CWTS standard bibliometric indicators

A standard set of basic indicators plays a central role in bibliometric evaluation studies.6 We used three indicators: (1) the number of publications (P); (2) the average number of citations per publication, excluding self-citations (CPP); and (3) the ratio of this CPP to the worldwide average ‘field citation score’ (FCSm), which normalizes the measured impact in a field-specific way. To this end, citations to each publication were counted up to the end of the year of graduation.

For FCSm, the definition of a ‘field’ is based on a classification of scientific journals into categories developed by Thomson Scientific. In calculating FCSm, the type of paper (e.g., normal article, review), as well as the specific years in which the papers were published, are taken into account. To give an example, the number of citations received during the period 2003–2006 by an article published in 2003 in field X is compared to the average number of citations received during the same period (2003–2006) by all articles published in the same field (X) in the same year (2003).

If the ratio CPP/FCSm is above 1.0, the publications of a group are cited more frequently than an ‘average’ publication in the field(s). This gives an indication of the international position of a group in terms of its impact compared with the worldwide average. CPP/FCSm is known as the ‘crown indicator’, because it places an institute within a worldwide, field-normalized perspective; it is considered the internationally standardized impact indicator.
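As a minimal numerical sketch of these three indicators: the citation counts and field averages below are invented for illustration, and the real CWTS computation additionally conditions the field averages on document type and publication year as described above.

```python
# Minimal sketch of the P, CPP, and CPP/FCSm indicators with invented numbers.
# "cites" = citations excluding self-citations, counted up to the end of the
# graduation year; "field_avg" = average citations of comparable papers
# (same field, type, and year) over the same citation window.

papers = [
    {"cites": 6,  "field_avg": 4.0},
    {"cites": 2,  "field_avg": 4.0},
    {"cites": 10, "field_avg": 8.0},
]

P = len(papers)                                  # number of publications
CPP = sum(p["cites"] for p in papers) / P        # citations per publication
FCSm = sum(p["field_avg"] for p in papers) / P   # mean field citation score
impact = CPP / FCSm                              # field-normalized impact

# impact > 1.0 means the set is cited above the worldwide field average
print(P, CPP, round(impact, 2))  # → 3 6.0 1.12
```

With these invented numbers, CPP = 18/3 = 6.0 and FCSm = 16/3 ≈ 5.33, giving CPP/FCSm ≈ 1.12, i.e., slightly above the world average, comparable in magnitude to the 1.28 reported in the Results.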

Table 2. Papers published by medical students during the last three years before graduation, by institution.

Results

The 2963 included students published a total of 748 articles during the last three years of their medical studies. These publications are of good quality, as the number of citations per paper is above the average for papers published in this field (CPP/FCSm average 1.28, range: 0.70–2.42). The output data per UMC are presented in Table 2.

The percentage of students who published one or more articles during the last three years of their medical studies ranged from 9.0% to 20.9% across institutions. In 2006, 14.2% of all students wrote one or more articles; in 2007 this was 14.9%. In total, 65 students (2.2%) published three or more articles.

Discussion

Using a well-validated method for bibliometrical evaluation of research output, we assessed what percentage of Dutch medical students from six medical schools were scientifically active and productive during their studies. Of the students who received their medical degree in 2006 or 2007, 14.5% published at least one research paper. The publications were of good quality, as the average number of citations per paper was above the average for papers published in the same field.

In 1999, the University Medical Center Groningen started an additional educational program for highly motivated medical students, the Junior Scientific Masterclass (JSM). Of the 57 graduates who were actively involved in this program and received their medical degree in 2006 and 2007, 49.1% published at least one research paper during their medical studies. The proportion of JSM students who published one or more research papers is over four times higher than that of regular students at the UMCG. Of the UMCG students who published three or more articles during their medical studies, 80.0% were actively involved in the JSM program. As the other UMCs started their additional educational programs more recently, their results were not broken down by program.

In this study, a well-validated method for bibliometrical evaluation was used. For the scientific output of medical students, this method most likely underestimates the number of student publications. Because students have relatively few publications, it is difficult to recognize patterns and to attribute a publication to a student, especially when the student did a research project abroad. Another factor leading to underestimation is publication delay: research done during the last phase of a student's medical studies is most likely published one or more years after graduation, and consequently these publications are not included in our assessment. As we acknowledge that many research efforts do not directly result in scientific publications, at least not for the student researcher, it is safe to conclude that the percentage of students actively involved in medical research is considerably higher than reported here.

The number of publications for all UMCs together is fairly stable between 2006 and 2007. For individual UMCs, however, there are substantial differences in the number of publications between the two years, which makes the results per institution difficult to interpret. It is very unlikely that these year-to-year differences result from (differences in) the effort made by the institutes to educate their students. The differences between the UMCs are probably small, because all UMCs conform to the educational objectives set in ‘Blueprint 2001: training of doctors in the Netherlands’,7 which states that undergraduate medical training is an academic study and that doctors should have received research training.

We hope our results reflect the effort put into research training in the medical curriculum. As only six medical schools were studied, covering graduates from two subsequent years, it is difficult to study variations between medical curricula in relation to scientific output. Our ‘Student Science Performance Bibliometric Index’ (SSPBI) can be used to study this, but data from subsequent years are essential. In this respect, it is still interesting to observe that the percentage of students who publish during their medical studies is remarkably stable across all six medical schools, with a rather small range of roughly 10–20%.

Within the scientific community, bibliometric indices of research publications are the key indicators of the quality of a researcher or a research group: in general, the more publications and the higher their impact, the better an individual or group is judged to perform. The basic requirements for scientific education of medical students as described in Blueprint 2001 do not aim at a scientific publication for every medical student. However, to assess the ‘scientific climate’ within Dutch medical schools, each with its own ‘unique’ medical curriculum, the percentage of students with at least one scientific publication listed in the Web of Science database seems a valid indicator. When assessing the ‘scientific climate’ within a curriculum or medical school, we argue it is better to have 50 out of 300 medical students publishing one paper than 10 out of 300 students publishing five papers each.

Over the last decade there has been growing concern for the future of clinical academic research. Multiple publications report a decrease in the number of clinical academics active in university research.12–14 Due to multiple factors, such as the increasing workload of clinical patient care, decreasing time available for clinical academic research, and financially more rewarding career prospects outside the university hospitals, fewer young academics choose a medical scientific career.15,16 From this perspective, it may be useful to stimulate young students to develop their scientific skills.

We found only a few previous papers reporting on publications of medical students, and their methodologies differ from ours. Reinders interviewed alumni of Groningen Medical School ten years after graduation; of those, 19% self-reported at least one publication written before graduation.17 Loonen searched PubMed for publications of residents in plastic surgery and concluded that 10 of 61 residents (16%) already had at least one publication before graduation.18 Zier et al. presented the results of a special program to stimulate student research at Mount Sinai School of Medicine, New York.19 There, 25% of students self-reported at least one publication prepared before graduation; it was not clear, however, how long after graduation these papers were actually published. In Croatia, 14% of master's theses resulted in a later publication indexed in Medline.20 According to a report from a single center in Würzburg, Germany, medical student research activity ultimately resulted in a Medline-indexed publication for 66% of students.21 By and large, most studies report the number of students who publish their work at any time, while we looked only at publications that appeared before graduation. How many Dutch students will publish their student work in the future is not known. Because we did not make use of hand searching or interviewing students, our method can be applied on a much larger scale to assess the scientific output of medical schools.

There is growing attention for the research training of medical students. A study by Hren et al. showed that medical students, in general, have a positive attitude towards science and scientific research.22 An evaluation of the effects of medical student research programs in Norway revealed that these programs probably more than doubled the number of students performing research in addition to their medical studies.5 Data from the JSM program within the UMCG show a comparable trend: students active within this additional program published more scientific papers than students who were active only within the undergraduate curriculum. Whether this can be attributed to the program itself, or to a positive selection of more ambitious students into the program, remains to be investigated.

In conclusion, medical students in the Netherlands are actively and productively involved in medical research during their studies. Our ‘Student Science Performance Bibliometric Index’ (SSPBI), specifically developed to assess the scientific output of medical students, gives a robust estimate of the ‘scientific climate’ within Dutch medical schools. Use of the SSPBI on a larger scale might help medical and other faculties to recognize, stimulate, and cultivate academic talent.

Declaration of interest

Huub J. van Eyk is a third year medical student. This may become his first publication as a medical student, and as such be included in a future paper on scientific output of medical students.

Conflict of interest: none reported

Financial support: none reported

* This article previously appeared in Med Teach 2010;32(3):231–935.