Background

Evidence-based practice (EBP) is accepted as an integral skill for health professionals, and EBP training is included as an accreditation requirement in many health professions [1]. As EBP has gained global currency as a decision-making paradigm, the number of studies exploring educational strategies for developing EBP knowledge and skills has increased.

A recent systematic review identified over 170 published studies investigating educational interventions aimed at facilitating EBP skills and knowledge [2]. Despite the continued investment of time, effort and resources in EBP education, best practice in EBP education remains unclear [3]. Inconsistent and incomplete reporting of educational interventions for EBP is common, limiting the ability to compare, interpret and synthesise findings from these studies. Researchers undertaking systematic reviews in EBP education frequently identify the lack of detailed reporting of the educational interventions as an issue [2-7]. In 2003, Coomarasamy, Taylor & Khan [5] had difficulty determining the type and dose of the intervention due to poor reporting in the included studies. A decade later the problem persists, with Maggio et al. [6] and Ilic & Maloney [3] unable to draw conclusions about the effectiveness of the EBP educational interventions included in their systematic reviews due to incomplete descriptions of the interventions. The consistent appeal from authors of systematic reviews is for improved detail in the reporting of educational interventions for EBP. Specific requests include more detailed reporting of the development, implementation and content of the intervention curriculum, more rigorous study designs and methodology, and the use of robust outcome measures [2-7].

Reporting guidelines, in the form of a checklist, flow diagram or explicit text, provide a way for research reporting to be consistent and transparent [8]. Reporting guidelines specific to study design, such as STROBE for observational studies [9], PRISMA for systematic reviews and meta-analyses [10] and CONSORT for randomised trials [11], have paved the way for greater accuracy in the reporting of health research [12]. The EQUATOR Network encourages high quality reporting of health research and currently lists 218 reporting guidelines for different research approaches and designs [13].

There are four reporting guidelines currently listed on the EQUATOR Network website which are specific to educational interventions [14-17]. These address cancer pain education [15], team-based learning [16], standardised patients [14] and objective structured clinical examinations (OSCEs) [17]. Other than the inclusion of a narrative literature review, the development processes used for these reporting guidelines differed, and no formal consensus processes were reported for any of them. The end-user formats of these reporting guidelines share some similarities. Howley et al. [14] and Patricio et al. [17] employ a checklist format, comprising 18 [17] to 45 [14] items. Haidet et al. [16] and Stiles et al. [15] include a series of domains and recommendations for reporting in each domain. The information items included in each of these reporting guidelines are content specific. For example, Patricio et al. [17] include 31 items related specifically to the set-up and design of OSCEs, and Howley et al. [14] include nine items specific to behavioural measures for standardised patients. None of the four reporting guidelines appeared appropriate for reporting educational interventions for developing knowledge and skills in EBP. Therefore, an original three-stage project was commenced, based on the recommendations for developers of reporting guidelines for health research [18], to develop the guideline for reporting evidence-based practice educational interventions and teaching (GREET) [19]. The aim of this systematic review was to identify which items have been included when reporting educational interventions used to facilitate foundational skills and knowledge for EBP. The data obtained from this review will be used to inform the development of the GREET [19].

The review question was: ‘What information has been reported when describing educational interventions targeting foundational evidence-based practice knowledge and skills?’

Methods

Research team

The research team consisted of a doctoral candidate (AP), experts with prior knowledge and experience in EBP educational theory (MPM, LKL, MTW), lead authors of the two Sicily statements (PG, JKT) and experts with experience in the development of reporting guidelines and the dissemination of scientific information (DM, JG, MH).

Data sources

The search strategy underwent several iterations of peer review before being finalised [20]. The search protocol was translated for each of the databases and comprised four primary search themes: health professionals (e.g. medicine, nursing, allied health); EBP (e.g. EBM, best evidence medical education, critical appraisal, research evidence); education (e.g. teach, learn, journal club); and evaluation (e.g. questionnaire, survey, data collection) [19]. The preliminary search strategy was test-run by two pairs of independent reviewers (AP and HB/MPM/JA) for each database. Inconsistencies between search results were discussed, the source of disagreement identified, and the searches re-run until results were consistent. The final search of nine electronic databases (MEDLINE, Academic Search Premier, ERIC, CINAHL, Scopus, Embase, Informit health database, Cochrane Library and Web of Science) was completed between October and December 2011. The MEDLINE search strategy is provided as an example in Additional file 1.
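To make the structure of such a themed strategy concrete, the following minimal Python sketch combines synonyms with OR within each theme and joins the four themes with AND. The term lists are abbreviated examples drawn from the themes above, not the full published strategy (see Additional file 1 for that).

```python
# Illustrative sketch: build a boolean search string from four search themes.
# Term lists are abbreviated examples, not the full published strategy.

themes = {
    "health professionals": ["medicine", "nursing", "allied health"],
    "EBP": ["evidence-based practice", "EBM", "critical appraisal"],
    "education": ["teach*", "learn*", "journal club"],
    "evaluation": ["questionnaire", "survey", "data collection"],
}

def build_query(themes):
    """OR the synonyms within each theme, then AND the themes together."""
    blocks = ["(" + " OR ".join(f'"{term}"' for term in terms) + ")"
              for terms in themes.values()]
    return " AND ".join(blocks)

print(build_query(themes))
# ("medicine" OR "nursing" OR "allied health") AND ("evidence-based practice" OR ...
```

In practice each database requires its own field tags and truncation syntax, which is why the translated strategies were test-run and reconciled by the reviewer pairs.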

It is recommended that protocols for systematic reviews be prospectively registered where possible [10]. However, as this systematic review focussed on the reporting of educational interventions for EBP rather than a health-related outcome, it was not eligible for prospective registration in databases such as PROSPERO [21].

Study selection

Eligibility criteria for studies are presented in Table 1. The reference lists of systematic reviews with or without meta-analysis identified in the search were also screened for further eligible studies.

Table 1 Study eligibility and exclusion criteria

Study selection and quality assessment

Training

A training exercise was undertaken to establish a consistent process for reviewing titles and abstracts against the eligibility criteria. Four reviewers (AP, MPM, LKL, MTW) collaboratively examined the titles and abstracts of the first 150 citations for eligibility, with disagreements resolved by consensus.

Once consistency was established, one investigator (AP) reviewed the titles and abstracts of the remaining citations. When titles and/or abstracts met the inclusion criteria or could not be confidently excluded, the full text was retrieved. The resultant list was reviewed by two independent reviewers (AP, MTW) for eligibility, with disagreements resolved by consensus. The reference lists of all included studies were screened, with 54 further potential citations identified. The penultimate list of eligible studies was reviewed (JKT, PG, MH, DM, JG) and three additional citations were nominated.

Eligibility was limited to controlled trials. To estimate whether reported items differed between controlled trials and lower-level study designs, a random selection of 10 studies identified in the search that used lower-level study designs (pre-post studies without a separate control group) was compared with 15 randomly selected randomised and non-randomised trials (with control groups) for the frequency and commonality of reported items, as sketched below.
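A minimal sketch of this comparison follows, using hypothetical data (the item names and study sets are illustrative, not the extracted data): each study is represented by the set of items it reported, and per-item reporting proportions are tabulated for the two design groups side by side.

```python
# Sketch: compare per-item reporting frequency between two groups of studies.
# Data are hypothetical; each study is the set of items it reported.

controlled = [
    {"setting", "duration", "instructors"},
    {"setting", "duration"},
    {"setting", "materials"},
]
pre_post = [
    {"setting", "duration"},
    {"duration"},
]

def item_frequency(studies):
    """Proportion of studies in a group reporting each item."""
    items = set().union(*studies)
    return {i: sum(i in s for s in studies) / len(studies) for i in items}

freq_c = item_frequency(controlled)
freq_p = item_frequency(pre_post)
for item in sorted(set(freq_c) | set(freq_p)):
    print(f"{item:12s} controlled {freq_c.get(item, 0.0):.0%}  pre-post {freq_p.get(item, 0.0):.0%}")
```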

Studies were not appraised for methodological bias because the aim of this systematic review was to describe how EBP educational interventions have been reported, rather than to evaluate the efficacy of the interventions [25].

Data extraction and treatment

A data extraction instrument was prospectively planned, developed and published [19], based on the Cochrane Handbook for Systematic Reviews of Interventions [26]. As outlined in the study protocol for the GREET, the 25 data items were extracted across the domains of Participants, Intervention, Content, Evaluation and Confounding (Table 2). All data items were initially recorded verbatim. Consistency between extractors (AP, MTW, LKL, MPM) was confirmed using a random sample of 10 per cent of eligible studies (inter-rater consistency >90% agreement; see the sketch below). Data extraction was then completed by pairs of independent reviewers (AP and either LKL, MTW or MPM) and disagreements were resolved by discussion to reach consensus.
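A minimal sketch of such an agreement check, with hypothetical extraction records (1 = item reported, 0 = not reported): percent agreement is simply the proportion of item-level decisions on which two extractors matched.

```python
# Sketch: item-level percent agreement between two data extractors.
# Extraction records are hypothetical; 1 = item reported, 0 = not reported.

extractor_a = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
extractor_b = [1, 1, 0, 1, 1, 1, 1, 0, 1, 1]

def percent_agreement(a, b):
    """Proportion of item-level decisions on which both extractors agree."""
    assert len(a) == len(b), "extraction records must align item-for-item"
    return sum(x == y for x, y in zip(a, b)) / len(a)

print(f"{percent_agreement(extractor_a, extractor_b):.0%}")  # 90%
```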

Table 2 Data extraction domains and information items

The 25 data items were grouped according to their frequency of reporting (ranging from low to very high) and further reviewed to determine their role in relation to the reporting of the intervention. To provide an objective guide for differentiating between information items relating specifically to the intervention and those relating to the reporting of study design/methodology, two reporting guidelines were used. The Template for Intervention Description and Replication (TIDieR) [27] was used to identify information items considered specific to the reporting of the intervention, and the CONSORT statement (excluding item 5, intervention) was used to identify information items related to the study design/methodology and confounding issues [11]. A sketch of this sorting step follows.
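As an illustration of the sorting step (the item names and their allocation here are abbreviated examples, not the published mapping in Table 4), each extraction item is assigned to one of the two frameworks and the totals counted:

```python
# Sketch: allocate extraction items to intervention reporting (TIDieR) or
# study design/methodology (CONSORT). The mapping shown is illustrative only.
from collections import Counter

allocation = {
    "teaching/learning strategies": "TIDieR",
    "number of sessions": "TIDieR",
    "educational materials": "TIDieR",
    "same evaluation method for all groups": "CONSORT",
    "confounding issues/limitations": "CONSORT",
}

counts = Counter(allocation.values())
for framework, n in counts.items():
    print(f"{framework}: {n} items ({n / len(allocation):.0%})")
```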

Results

Characteristics of eligible studies

Sixty-one studies met the inclusion criteria [4, 28-87] (Figure 1), all published in English. The median publication year was 2003 (range 1984 to 2011), with increasing frequency after 2000 (Figure 2). Studies were published in 34 journals, the most frequent being the Journal of General Internal Medicine (n = 7, 11%), BMC Medical Education (n = 6, 10%), Academic Medicine (n = 4, 7%) and Medical Education (n = 4, 7%).

Figure 1 PRISMA flow diagram.

Figure 2 Publication frequency for studies included in the systematic review.

There were approximately equal numbers of randomised (n = 29, 48%) and non-randomised (n = 32, 52%) trials. Two studies referenced the use of a reporting guideline [36, 85], both using the 2001 CONSORT statement [88].

Frequency of reporting

The frequency of reporting of the 25 items across the five domains was evaluated for all studies (Additional file 2). Frequency of reporting was described as very high (reported by ≥90% of studies), high (70-89%), moderate (50-69%) or low (<50%) (Table 3); this banding is sketched below.
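The banding can be expressed directly as a threshold function; the following sketch uses the cut-points defined above, applied to one of the actual item counts reported in the next section:

```python
# Sketch: map a reporting proportion (0-1) to the frequency bands defined
# above: very high (>=90%), high (70-89%), moderate (50-69%), low (<50%).

def frequency_band(proportion):
    if proportion >= 0.90:
        return "very high"
    if proportion >= 0.70:
        return "high"
    if proportion >= 0.50:
        return "moderate"
    return "low"

# e.g. 'strategies used for teaching and learning', reported by 59/61 studies:
print(frequency_band(59 / 61))  # very high (97%)
```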

Table 3 Summary of the reporting of the data items for included studies (n = 25)

Information items reported with very high frequency (≥90% of studies)

Seven items were reported with very high frequency (Table 3). Four items from two domains were reported by all studies (Participant domain: context of education/stage of training, professional discipline, number of disciplines; Evaluation domain: evaluation methods). The remaining three items were the strategies used for teaching and learning (Intervention domain: n = 59, 97%); whether the same evaluation method was used for all groups (Evaluation domain: n = 57, 93%); and confounding issues or study limitations (Confounding issues domain: n = 57, 93%).

Information items reported with high frequency (70 to 89% of studies)

Six items were reported with high frequency (Table 3). The majority were from the Intervention domain, including the number of sessions (n = 51, 84%), program duration (n = 50, 82%), setting (n = 49, 80%), frequency of the sessions (n = 44, 72%) and the educational materials used (n = 45, 74%). The remaining item reflected which EBP steps were included in the intervention (Content domain: n = 46, 75%). The most frequently reported EBP step was Step 3 (appraise) (n = 46, 75%), followed by Step 2 (acquire) (n = 38, 62%) and Step 1 (ask) (n = 30, 49%). The two least frequently reported were Step 4 (apply) (n = 23, 38%) and Step 5 (assess) (n = 4, 7%).

Information items reported with moderate frequency (50 to 69% of studies)

Two items were reported with moderate frequency: duration of sessions (Intervention domain: n = 42, 69%) and psychometric properties of the evaluation method (e.g. face validity, inter-rater reliability) [89] (Evaluation domain: n = 35, 57%).

Information items reported with low frequency (<50% of studies)

Ten information items were reported with low frequency (Table 3). Half were from the Participants/Instructors domain, including the previous EBP or research training exposure of the learners (n = 30, 49%), adherence or attendance at the intervention (n = 24, 39%), profession of the instructor(s) (n = 27, 44%), number of instructors involved (n = 24, 39%) and previous teaching experience of the instructor(s) (n = 8, 13%). Two items were from the Intervention domain (educational framework: n = 22, 36%; student time spent not face to face: n = 1, 2%) and two items were from the Evaluation domain (citation provided for the EBP content: n = 18, 30%; citation provided for the steps of EBP for the content of the intervention: n = 9, 15%). The remaining item concerned whether the evaluation instrument was named and whether it was modified (n = 12, 21%) (Evaluation domain).

For the studies that provided a citation to describe the steps or components of EBP (n = 18, 30%), the most commonly reported were Sackett et al. (Evidence-Based Medicine: How to Practice and Teach EBM) [90] (n = 12, 67%) and the Evidence-Based Medicine Working Group [91] (n = 5, 26%).

Items relating specifically to the reporting of the intervention

When the 25 data extraction items were sorted into those related to the reporting of interventions (TIDieR) [27] and those related to study design (CONSORT) [11], most of the items (n = 16, 64%) were considered specific to the reporting of the intervention rather than the study design (n = 9, 36%) (Table 4).

Table 4 Items specific to the reporting of interventions (n = 16) allocated to the TIDieR framework

Confounding

There were 197 issues reported by the authors of 57 (93%) studies as either limitations or factors which may have confounded the results of the educational intervention (Additional file 3). There was little commonality across the confounding items relating to the intervention, with almost one quarter of the studies (n = 10, 24%) reporting limitations relating to the delivery, duration or time of year of the educational intervention program.

Discussion

The aim of this systematic review was to determine what information is reported in studies describing educational interventions used to facilitate foundational skills and knowledge for EBP.

Stiles et al. [15] use the term ‘educational dose’ to describe information such as the duration of the educational intervention, the learning environment, the extent and intensity of direct interactions with the educators and the extent of institutional support. This educational dose is considered a core principle of an educational intervention [15]. For an educational intervention for EBP to be replicated, compared or synthesised, a detailed description of this educational dose is essential. However, the current review and several previous reviews [2-7] have identified inconsistent reporting of the information items comprising the educational dose in studies of educational interventions.

The most consistently reported items across the 61 included studies were the learners’ context of education/stage of training, the professional discipline of the learners, the number of different disciplines and the evaluation method used, each of which was reported by all included studies. The most consistently reported domain was Intervention delivery, with six of nine items (67%) reported by more than 70 per cent of studies.

Comparison of the most consistently reported items in this review with other reviews of educational interventions undertaken as part of the development of a reporting guideline reveals similar results. The learners’ stage of education was reported by 97 per cent [14] and 83.8 per cent [17] of studies. The professional discipline and number of different disciplines of the learners were not reported in any of these systematic reviews of educational interventions, possibly because these previous reviews were based solely on the medical profession. The evaluation method used, reported by all studies in this systematic review, was reported by between 25 per cent [14] and 67.8 per cent [16] of studies.

The least consistently reported domain in our study was ‘participants/instructors’, due to the limited reporting of detail regarding the instructor(s). Information regarding the number of instructors and their professional discipline and teaching experience was often not reported. These results are not unique to our findings. Maggio et al. [6] were unable to determine the instructors’ profession in 40 per cent of studies, Patricio et al. [17] found that information regarding the number and detail of the faculty involved in the intervention was missing for 85.7 per cent of studies, and Haidet et al. [16] found no information regarding the faculty background in team-based learning for any of the included studies.

While every effort was made to plan and undertake a comprehensive search strategy, there are several potential limitations of this review. This systematic review was undertaken using the PRISMA reporting guideline [10], which includes recommendations for a number of strategies to identify potentially eligible articles. The screening of citations of included articles (progeny) is not currently included in the PRISMA reporting guideline and was not undertaken as part of this review. However, in theory, had a review of progeny been included, relevant existing studies (similar topic, within the search strategy, within the included databases and within the timeframe) should already have been identified by the original search strategy.

The final search terms included the health professional disciplines of medicine, nursing and allied health; the allied health disciplines included were based on the definition by Turnbull et al. [22]. It is possible that some professions, such as complementary medicine, could have been missed. However, this risk was minimised by using a search string that included all relevant terms pertaining to EBP (e.g. evidence-based practice; evidence based medicine; EBM; best evidence medical education; BEME; research evidence), so all studies, irrespective of professional discipline, should have been identified. We did not apply language limits during the initial search phase; however, while reviewing the final list of eligible studies, the decision was made to exclude three studies published in Spanish. It is unlikely that the exclusion of these studies, which accounted for approximately four per cent of the eligible studies, would meaningfully alter our results.

Despite the development and testing of a prospective data extraction process, the allocation of items into pre-determined domains had the potential to overlook important information items and introduce bias. This systematic review was planned as the first stage of a three-stage development process for the GREET. The purpose of the systematic review (stage 1) was to determine what had previously been reported in educational interventions for EBP, to inform the second stage of the development process, a Delphi survey. The Delphi survey was planned to seek the prospective views of experts in EBP education and research regarding which information should be reported when describing an intervention to facilitate knowledge and skills in EBP. To ensure that the widest possible range of items was considered in the third stage of the reporting guideline development process, it was prospectively planned that all items identified within the systematic review would be included for comment in the Delphi process.

Finally, it is not always possible or practical to have a control group in studies investigating the effectiveness of educational interventions, hence the findings from this review may be limited by the exclusion of lower-level study designs. However, our analysis comparing a small number of included studies with studies excluded on the basis of design suggests that the reporting of educational interventions for EBP is similar irrespective of research design.

The usefulness of reporting guidelines for research designs such as systematic reviews (PRISMA) [10] and randomised controlled trials (CONSORT) [11] is well established, with many leading journals and editors endorsing these guidelines. These guideline documents are dynamic; the CONSORT checklist is continually updated as new evidence emerges, with selective outcome reporting added in the 2010 CONSORT update [11], for example. Reporting guidelines for study designs usually contain an item relating to the reporting of the intervention in addition to items relating to study methodology and analysis. In CONSORT, a single item pertains to the reporting of the intervention (item 5: the interventions for each group with sufficient details to allow replication, including how and when they were actually administered). Given the number of information items identified within this systematic review, we believe that authors will benefit from guidance regarding the detail necessary to support replication and synthesis of educational interventions for EBP.

There are extensions to CONSORT for the reporting of interventions such as herbal and homeopathic interventions [92, 93], non-pharmacological treatments [94], acupuncture [95], e-health [96] and tailored interventions [97]. Collaborators of the CONSORT group have recently developed the TIDieR checklist and guide [27], a generic reporting guideline for interventions of any type where no other specific guidance exists. Although there are four reporting guidelines for educational interventions, none was developed using a formal consensus process, nor do they relate specifically to educational interventions for EBP. The findings of this review suggest that there is a need for supplemental reporting guidelines (to expand the single intervention item in CONSORT) to address the reporting of educational interventions for EBP.

Determining which items are necessary for describing an educational intervention is a complex task. Empirical evidence about which items are likely to introduce bias in educational interventions for EBP is scarce, largely due to the inconsistent and incomplete reporting of studies of educational interventions for EBP [2-7]. Information reported by authors as confounders or limitations may provide anecdotal evidence regarding which information items may introduce bias or impact upon study outcomes. The limitations most frequently reported by authors related to the delivery, duration or time of year of the educational intervention (n = 10, 24%).

Conclusion

This systematic review collated information concerning what has been reported in descriptions of educational interventions for EBP. Completing the first stage in the development process for a reporting guideline specific to educational interventions for EBP (GREET) [19], the findings of this review provide a starting point for discussion regarding the types of items that should be included in the GREET. The second stage of the development process, a Delphi consensus survey, will be informed by the findings of this review. The GREET will be the first intervention-specific reporting guideline based on the TIDieR framework, and will provide specific guidance for authors of studies reporting educational interventions for EBP.

Authors’ information

AP is a PhD candidate, School of Health Sciences, University of South Australia, Adelaide, Australia.

LKL is a Post-doctoral Research Fellow, Health and Use of Time Group (HUT), Sansom Institute for Health Research, School of Health Sciences, University of South Australia, Adelaide, Australia.

MPM is a Lecturer, School of Health Sciences and a member of the International Centre for Allied Health Evidence (iCAHE), University of South Australia, Adelaide, Australia.

JG is a Senior Research Associate, Ottawa Hospital Research Institute, The Ottawa Hospital, Centre for Practice-Changing Research (CPCR), Ontario, Canada.

PG is the Director, Centre for Research in Evidence-Based Practice (CREBP), Bond University, Queensland, Australia.

DM is a Senior Scientist, Clinical Epidemiology Program, Ottawa Hospital Research Institute, The Ottawa Hospital, Centre for Practice-Changing Research (CPCR), Ontario, Canada.

JKT is an Associate Professor, University of Southern California Division of Biokinesiology and Physical Therapy, Los Angeles, USA.

MH is a visiting Professor, Bournemouth University, Bournemouth, UK and a consultant to Best Evidence Medical Education (BEME).

MTW is an Associate Professor, School of Population Health and a member of the Nutritional Physiology Research Centre (NPRC), School of Health Sciences, University of South Australia, Adelaide, Australia.

Correspondence should be addressed to Ms Anna Phillips, School of Health Sciences, University of South Australia, GPO Box 2471, Adelaide 5001, Australia.