Background

Systematic reviews (SRs) and meta-analyses are an essential resource for healthcare decision-makers [1]. When conducted well, SRs can provide credible and timely data on a range of enquiries, such as which treatments are effective, ineffective or harmful; which tests accurately diagnose a condition; and which exposures are associated with health outcomes. However, the value of SRs depends on how well authors have reported what they did and what they found. If such information is absent or ambiguous, readers cannot judge whether the results of the SR are robust to the methods used, cannot attempt to reproduce the findings and cannot interpret the findings accurately. This can contribute to the failure to implement the findings of SRs into clinical practice [2]. Transparent reporting of SRs should therefore be a critical priority for SR authors [3, 4].

The transparency of SRs and meta-analyses of health research has been called into question on many occasions [5]. The first formal appraisal of SRs with a focus on medicine was performed by Cynthia Mulrow, who identified several poor reporting practices in a sample of 50 medical review articles published between June 1985 and June 1986 [6]. For example, clearly specified methods of identifying, selecting and appraising studies were available in only one article. Transparency was only slightly better in reviews published in 1996, with fewer than 25% of articles describing how evidence was identified, evaluated or synthesised [7]. Transparency of SRs has certainly improved in the last decade, yet much suboptimal reporting persists [8].

Improvements in the transparency of SRs in recent years may be attributed to the dissemination of reporting guidelines. Reporting guidelines provide evidence-based recommendations for authors on how to report their research methods and findings clearly [9]. In 1999, an international group of 30 epidemiologists, clinicians, statisticians, editors and researchers developed a reporting guideline for meta-analyses of randomised trials: the QUOROM (QUality Of Reporting Of Meta-analyses) Statement [10]. In 2005, a meeting was convened to update QUOROM to address several conceptual and practical advances in the methodology of SRs and to help overcome several shortcomings identified in an audit of SRs [3]. The guideline was renamed the PRISMA (Preferred Reporting Items for Systematic reviews and Meta-Analyses) Statement and published in 2009 [11]. It was accompanied by an explanation and elaboration document, which provided detailed guidance for each of the 27 included items, along with examples of good reporting [12].

According to citation data in Scopus®, the PRISMA Statement has had a very high uptake from the biomedical research community (Fig. 1). The checklist paper [11, 13,14,15,16,17,18,19] has been cited 19,402 times as of 31 July 2017, and the accompanying explanation and elaboration document [12, 20,21,22,23] received 5483 citations by this date. However, not all published SRs cite the guideline; for example, in a random sample of 119 non-Cochrane SRs of therapeutic interventions indexed in MEDLINE® in February 2014, 42 (35%) mentioned the use of the PRISMA Statement [8].

Fig. 1 Cumulative number of citations of the PRISMA Statement. Data obtained from Scopus® on 31 July 2017. E&E, explanation and elaboration

Since its publication, seven extensions to the PRISMA Statement have been developed to facilitate reporting of different types or aspects of SRs (Table 1). These include the PRISMA-Equity extension [24,25,26], PRISMA for Abstracts of SRs [27], the PRISMA extension for reporting SRs incorporating network meta-analysis [28], PRISMA for SRs and meta-analyses of individual participant data [29], PRISMA for SR protocols [30, 31], the PRISMA harms checklist [32] and the PRISMA extension for SRs of complex interventions [33, 34]. Citation counts for the PRISMA extensions are much lower than those of the PRISMA Statement, but the extensions have not had the same amount of time to accrue citations (Fig. 2). Nor should they be expected to receive as many citations, as they are narrower in scope: fewer SRs to which they apply are published each year. The most cited extension is the checklist paper for PRISMA-P (for SR protocols) [30], which has received 683 citations since its publication in January 2015.

Table 1 Scope of the PRISMA Statement and published extensions
Fig. 2 Cumulative number of citations of PRISMA extensions published before 2017. Data obtained from Scopus® on 31 July 2017. E&E, explanation and elaboration; IPD, individual participant data; NMA, network meta-analysis

There are also eight PRISMA extensions that are in development (Table 2). These include extensions for SRs of newborn and child health research and for protocols of such SRs, for SRs of diagnostic test accuracy studies, for rapid reviews, for scoping reviews, for SR search methods, for SRs of traditional Chinese medicine interventions and for SRs of in vivo animal studies.

Table 2 Scope of the PRISMA extensions in development

It is important to evaluate whether the PRISMA Statement and extensions have achieved what they are designed to do: improve the transparency of SRs. We are aware of two previous SRs that have investigated the adherence of SRs to the PRISMA Statement (i.e. the extent to which SRs comply with each item in the statement) [35, 36]. Another SR has examined whether transparency is better in SRs published in journals that endorse the PRISMA Statement (e.g. suggest its use in the journal instructions to authors or require that authors submit a PRISMA checklist accompanying their SR) [37]. However, to our knowledge, there has been no attempt to map other research on the uptake and impact of the PRISMA Statement and extensions. Also, there has been no attempt to synthesise studies evaluating the adherence of SRs published after the PRISMA Statement was disseminated. Therefore, we aimed to address these gaps by conducting a scoping review of meta-research studies evaluating the PRISMA Statement and extensions.

Methods

We did not pre-register the methods of our scoping review, as we are unaware of any register for methodological research of this nature.

We considered articles to be eligible for inclusion if they were an empirical study of any design (e.g. randomised trial, cross-sectional analysis, before-after study) that investigated some component of the PRISMA Statement or extensions (e.g. how often PRISMA is referred to in journal instructions to authors) or that used the PRISMA Statement or one of the extensions for evaluative purposes (e.g. to assess how often SRs adhere to each PRISMA item). We included meta-research studies regardless of language or year of publication. We excluded commentaries, editorials and letters to the editor.

One author (MJP) searched for potentially relevant studies indexed in MEDLINE® from inception to 31 July 2017 (specifically, Ovid MEDLINE® Epub Ahead of Print, In-Process & Other Non-Indexed Citations, Ovid MEDLINE® Daily and Ovid MEDLINE®). The following search strategy was used to retrieve articles that included the term ‘PRISMA’ (abbreviated or spelled out in full) in the title or abstract of the article:

  1. ‘Preferred Reporting Items for Systematic Reviews and Meta-analyses’.ti,ab.
  2. PRISMA.ti,ab.
  3. 1 or 2
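
For readers who wish to approximate this search programmatically, the sketch below runs a comparable query against PubMed through the NCBI E-utilities (via Biopython). This is an illustration only, not the search we ran: PubMed's [tiab] field tag stands in for Ovid's .ti,ab. suffix, the email address is a placeholder and the two interfaces index records differently, so record counts will not match Ovid MEDLINE exactly.

```python
# A minimal sketch (not the authors' code): approximating the Ovid MEDLINE
# search in PubMed via the NCBI E-utilities, using Biopython.
from Bio import Entrez

Entrez.email = "your.name@example.org"  # placeholder; NCBI requires an email

# Lines 1 and 2 of the Ovid strategy, combined with OR (line 3).
query = (
    '"Preferred Reporting Items for Systematic Reviews and Meta-analyses"[tiab]'
    ' OR PRISMA[tiab]'
)

# Limit to records dated up to the review's search date (31 July 2017).
handle = Entrez.esearch(
    db="pubmed", term=query,
    datetype="edat", mindate="1946/01/01", maxdate="2017/07/31",
    retmax=10000,
)
record = Entrez.read(handle)
handle.close()

print(record["Count"], "records retrieved")
pmids = record["IdList"]  # candidate records for title/abstract screening
```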

One author (MJP) screened all titles and abstracts, and any full-text articles retrieved, to determine eligibility. The same author recorded the types of evidence available in the included meta-research studies. Types of evidence were classified as:

  • data on SR adherence to the PRISMA Statement or extensions;

  • characteristics associated with SR adherence to PRISMA (e.g. journal endorsement, year of publication);

  • the frequency of journal instructions to authors referring to the PRISMA Statement or extensions;

  • other (e.g. frequency of SR authors who reported using the PRISMA Statement to guide their reporting).

To determine the influence of the PRISMA Statement on the transparency of SRs, we pooled the findings of meta-research studies evaluating how often SRs adhere to the PRISMA Statement. This updates a previous SR, which included adherence studies published up to October 2014 [35]. One author (MJP) collected from each meta-research study the following data about the SRs evaluated: focus (e.g. therapeutic, diagnostic), clinical area, language, years of publication and frequencies of SRs adhering to each of the 27 PRISMA Statement items. In some cases, authors of meta-research studies recorded whether a particular PRISMA item was fully or partially reported in each of the SRs evaluated; in such cases, we recorded only the number of SRs that fully reported the item. One author (MJP) contacted study authors to request data on adherence to individual items if these data were not available in the published article (e.g. when study authors reported only the mean number of items that SRs adhered to).

We pooled data on SR adherence to individual PRISMA items across all SRs in the included studies. We noted items that fewer than two thirds (67%) of SRs adhered to and those that fewer than half (50%) of SRs adhered to. We also pooled data on SR adherence to individual PRISMA items in a subset of studies that evaluated SRs published after the PRISMA Statement was disseminated. For this analysis, we included studies that evaluated only SRs published in 2010 or later, or that reported data on a subgroup of SRs published in 2010 or later; we did not contact study authors for these subgroup data. We conducted all analyses in Microsoft Excel.
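
To make the pooling step concrete, the sketch below shows one way to compute the summary adherence percentages and flag items falling below the two thresholds. The file name, column names and long-format layout are hypothetical assumptions for illustration; as noted above, our actual analyses were conducted in Microsoft Excel.

```python
# A minimal sketch of the pooling step, assuming a hypothetical long-format
# CSV with one row per (study, PRISMA item) pair:
#   study_id, item (1-27), n_adherent, n_evaluated
import pandas as pd

df = pd.read_csv("adherence_data.csv")

# Pool across studies: total SRs adhering to each item divided by the
# total SRs in which that item was assessed.
pooled = (
    df.groupby("item")[["n_adherent", "n_evaluated"]]
      .sum()
      .assign(pct=lambda d: 100 * d["n_adherent"] / d["n_evaluated"])
)

# Flag items falling below the review's two reporting thresholds.
pooled["below_two_thirds"] = pooled["pct"] < 67
pooled["below_half"] = pooled["pct"] < 50
print(pooled.sort_values("pct").round(1))
```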

Results

Scoping review of meta-research studies

The search of MEDLINE® yielded 5001 citations (Fig. 3). After screening each title and abstract, we retrieved the full text of 170 articles. We excluded 70 of these articles, most of which were editorials or commentaries (reasons for exclusion are listed in Additional file 1: Table S1). One hundred meta-research studies met our inclusion criteria (listed in Additional file 2: Table S2). The studies were published between 2011 and 2017, and more than half were published in 2015 or later (n = 59). All of the studies were observational in design; there were 86 cross-sectional analyses, six uncontrolled before-after studies, four surveys of authors and four systematic reviews of meta-research studies.

Fig. 3 Flow diagram of identification, screening and inclusion of studies

We recorded 20 different types of evidence available across the included meta-research studies (Table 3). The most common type of evidence available was data on SR adherence to the PRISMA Statement, which was reported in 57/100 (57%) studies. Many of these 57 studies (n = 37 [65%]) also investigated characteristics associated with SR adherence to the PRISMA Statement, such as the type of journal, year of publication and article word count. The third most common type of evidence available was data on the frequency of journals referring to the PRISMA Statement or extensions in the instructions to authors (n = 18/100 [18%]).

Table 3 Types of evidence available in meta-research studies (n = 100) evaluating the PRISMA Statement or extensions

Few studies have evaluated how well SRs adhere to the PRISMA extensions; adherence to PRISMA for Abstracts and PRISMA for Network Meta-Analyses has been examined in three studies and one study, respectively (Table 3). Further, few studies have investigated whether the endorsement of the PRISMA Statement by journals was associated with adherence to PRISMA (n = 8/100 [8%]). We did not identify any studies that investigated whether journal endorsement of one of the PRISMA extensions was associated with SR adherence to the extension.

Evaluations of SR adherence to the PRISMA Statement

Of the 57 studies evaluating SR adherence to the PRISMA Statement [38,39,40,41,42,43,44,45,46,47,48,49,50,51,52,53,54,55,56,57,58,59,60,61,62,63,64,65,66,67,68,69,70,71,72,73,74,75,76,77,78,79,80,81,82,83,84,85,86,87,88,89,90,91,92,93,94], most were published between 2015 and 2017 (33/57 [58%]), focused on SRs of therapeutic interventions only (45/57 [79%]), evaluated non-Cochrane SRs only (34/57 [60%]) and evaluated SRs written in English only (39/57 [68%]) (Table 4). A total of 6487 SRs were evaluated across all studies; the median (interquartile range) number of SRs evaluated per study was 74 (44-144). The evaluated SRs were published between 1989 and 2016.

Table 4 Characteristics of 57 studies evaluating SR adherence to the PRISMA Statement

All 57 studies assessed adherence to individual PRISMA items, with relevant data provided on request by authors of ten studies [39, 42, 43, 45, 66, 67, 69, 77, 79, 85]. By pooling the PRISMA adherence data across SRs in all 57 reports, we identified 11 items that fewer than 67% of SRs adhered to (Fig. 4; numerical data available in Additional file 3: Table S3). These include item 2 (structured summary), item 5 (methods: protocol and registration), item 8 (methods: search), item 11 (methods: data items), item 12 (methods: risk of bias in individual studies), item 15 (methods: risk of bias across studies), item 16 (methods: additional analyses), item 19 (results: risk of bias within studies), item 22 (results: risk of bias across studies), item 23 (results: additional analyses) and item 27 (funding). There were six items that fewer than 50% of SRs adhered to (items 5, 15, 16, 22, 23 and 27).

Fig. 4 Summary percentage across reports of SRs adhering to the PRISMA Statement

PRISMA adherence data for SRs published in 2010 or later (i.e. after the PRISMA Statement was published) were available in 27 studies [38, 39, 41, 42, 44, 46, 56, 60, 62, 68,69,70,71,72,73,74,75,76,77,78,79, 81,82,83,84, 92, 94], which evaluated 2382 SRs. The characteristics of these studies (i.e. focus, clinical area, language of SRs) were similar to those of the total set of studies. SR adherence to the PRISMA Statement was higher for nearly all items in this subset of recent SRs, compared with the adherence data across all SRs (Fig. 4; numerical data available in Additional file 3: Table S3). There were 12 items that more than 80% of SRs adhered to (items 1, 3, 4, 6, 7, 14, 17, 18, 20, 21, 24 and 26). However, lack of transparency remains an issue for many SRs. There were nine items that fewer than 67% of SRs adhered to (items 5, 8, 12, 15, 16, 19, 22, 23 and 27), and one item was adhered to by only 21% of SRs (item 5, on whether an SR protocol or registration number exists).

Discussion

Our scoping review suggests that the PRISMA Statement and extensions have provided fertile ground for meta-research. Twenty different types of evidence were available across 100 meta-research studies. The most common type of evidence was data on SR adherence to the PRISMA Statement, which has been evaluated in 57 studies. The pooled results of these studies indicate that reporting of many items of the PRISMA Statement is suboptimal, even in those SRs published after its dissemination in 2009. Very few meta-research studies have evaluated SR adherence to the PRISMA extensions, but this is unsurprising given that most extensions were disseminated in 2015 or later. Few studies have tested strategies to increase adherence to the PRISMA Statement and extensions.

Strengths and limitations

Our research has several strengths. To our knowledge, this is the first attempt to systematically map research conducted on the PRISMA Statement and extensions. Most of the included studies assessing SR adherence to the PRISMA Statement focused on a single clinical area, so pooling data across these studies makes our findings more generalisable. Also, we managed to obtain unpublished data from ten studies that had not reported data on adherence to each individual PRISMA item [39, 42, 43, 45, 66, 67, 69, 77, 79, 85].

A few limitations must be acknowledged. We included only meta-research articles indexed in one bibliographic database (MEDLINE®) and written in English. However, we see no reason why our findings would differ had other databases, or meta-research articles in languages other than English, been consulted. Screening of records and collection of data from articles were performed by one author only, so we may have missed some relevant meta-research studies or made errors when recording the frequency of SRs adhering to the PRISMA Statement. We have uploaded all collected data to the Open Science Framework (https://osf.io/7x2mp/) so that interested readers can verify our data and replicate our results. Most of the SRs evaluated in the 57 studies investigating SR adherence to the PRISMA Statement were written in English, and non-English language SRs may be less likely to adhere to PRISMA if their authors were not confident in English. Our classification of the types of evidence available in meta-research studies reflects what was reported; we did not contact study authors to ask whether they conducted other analyses but chose not to report the findings. We did not record the references of SRs evaluated in each study investigating SR adherence to the PRISMA Statement, so we do not know whether some SRs appeared in more than one of the included meta-research studies. However, based on the information regarding the types of SRs (e.g. Cochrane or non-Cochrane), their years of publication and their clinical focus, we judged the number of overlapping SRs to be low.

We were unable to compare the reporting of SRs published after PRISMA was disseminated in 2009 with that of SRs published before 2009 because of how the included meta-research studies were designed and reported. Most studies (43 of 57) included some SRs published before 2009 and some published after, but most did not report the number of SRs in each category. There were 14 studies that included only SRs published after 2009, 13 that provided subgroup data on SRs published after 2009 (though not all of these provided corresponding data for SRs published before 2009) and three that included only SRs published before 2009. Given that data on PRISMA adherence in SRs published before 2009 were limited to a small subset of the included studies, we restricted our analyses of PRISMA adherence to all SRs (regardless of year of publication) and to SRs published after 2009. A formal before-after comparison was therefore not possible.

We focused on the PRISMA Statement and extensions, although we are aware of other reporting guidelines for SRs. These include the Methodological Expectations of Cochrane Intervention Reviews (MECIR) reporting standards [132, 133], the American Psychological Association Meta-Analysis Reporting Standards (MARS) [134], the ENTREQ Statement for syntheses of qualitative research [135], the RAMESES publication standards for realist syntheses [136] and meta-narrative reviews [137] and reporting guidance for describing interventions in SRs [138]. More research is needed to map the research conducted on these reporting guidelines.

Comparison with other studies

We are aware of two other syntheses of meta-research studies that have investigated the adherence of SRs to the PRISMA Statement [35, 36]. Samaan et al. [36] included three studies and Pussegoda et al. [35] included 13 studies. Both reached the same conclusion as us, that adherence to the PRISMA Statement is suboptimal; however, unlike our review, neither analysed the reporting of SRs published after the PRISMA Statement itself was published. Another SR by Stevens et al. [37] synthesised the results of three studies exploring whether SR adherence to the PRISMA Statement is higher in journals that endorse the reporting guideline. We identified in our scoping review an additional five studies that could be added to an update of this review. To our knowledge, ours is the only review that has mapped research conducted on the PRISMA extensions.

Implications of the findings

There are several reasons why adherence is better for some PRISMA items than others. It is possible that the less complex the item, the easier it is to report. For example, most of the 12 PRISMA items that were adhered to by more than 80% of SRs published in 2010 or later are relatively straightforward to report: identifying the report as a SR or meta-analysis in the title, providing a rationale and objectives, presenting study characteristics and reporting conclusions. Several items in the PRISMA Statement comprise multiple components, which some systematic reviewers may fail to address fully (e.g. item 12 asks authors to ‘describe methods used for assessing risk of bias of individual studies (including specification of whether this was done at the study or outcome level), and how this information is to be used in any data synthesis’). Also, reporting of some items may depend on whether the journal facilitates reporting of that item (e.g. authors may be unable to present a full electronic search strategy (item 8) in journals that do not allow supplementary files). In addition, some items with low adherence may not be considered sufficiently important to report by a majority of systematic reviewers and journal editors. It would be useful to conduct surveys and interviews with systematic reviewers to explore the contributions of these potential barriers and facilitators to complete SR reporting.

To our knowledge, there have been no prospectively designed, controlled studies evaluating whether the PRISMA Statement or extensions are having their intended effect. This is surprising; it is a far lower evidentiary threshold than that required to introduce a drug into the marketplace. Instead, only a few cross-sectional or uncontrolled before-after studies have evaluated the impact of journal endorsement of the PRISMA Statement on the reporting of SRs. Of these eight studies [38, 39, 46, 71, 77, 81, 84, 94], six evaluated whether journals that ‘recommend’ or ‘encourage’ use of the PRISMA Statement in their instructions to authors publish SRs that are reported more completely. Two studies investigated whether reporting is clearer in journals that ask authors to submit a PRISMA checklist with their SR. Both are rather low-intensity interventions that may not have the desired effect. For example, a recommendation in the instructions to authors can easily be missed by authors (some of whom will not even check the instructions), while a submitted PRISMA checklist may be ignored by peer reviewers and journal editors who face competing pressures on their time.

Researchers need to develop more efficient and intensive interventions to implement reporting guidelines such as the PRISMA Statement and extensions. We believe technology can play a valuable role in this regard. For example, StatReviewer software performs an automated review of the statistical and reporting integrity of scientific manuscripts (http://www.statreviewer.com/). Manuscripts can currently be checked against the following reporting guidelines: CONSORT 2010 [139], STROBE [140], STARD [141, 142], ARRIVE [143] and the Uniform Requirements for Medical Journals (http://www.icmje.org/recommendations/). StatReviewer is considering including PRISMA in its suite of reporting guidelines (D. Moher, personal communication). We also think rigorous evaluations of StatReviewer, in the form of randomised trials, are needed. Such evaluations could build upon the experiences of previous randomised trials evaluating web-based reporting guideline tools (e.g. WebCONSORT [144], COBWEB [145]).

It is 12 years since the PRISMA group last met, and the PRISMA Statement has not been updated since its publication 8 years ago. We believe that an update is necessary to address the poor adherence to the guideline. An updating process will provide the opportunity to discuss how to rearrange the layout and rephrase the checklist items to increase clarity. It will also allow for potential new items to be considered, based on recent methodological developments affecting SR conduct and reporting. These developments include novel guidance on how to:

  • summarise findings when meta-analysis is not appropriate [146, 147];

  • report and synthesise intervention characteristics of included studies [138, 148];

  • use and interpret prediction intervals for random-effects meta-analyses [149, 150];

  • enhance reproducibility of meta-analytic results and share data collected [151, 152] and

  • report the methods and results of updated SRs [153] and living SRs [154].

In addition, developing a comprehensive research translation strategy to help journals endorse and implement the updated guideline may facilitate its use. Journal editors and researchers should work together to develop prospective (ideally randomised), controlled studies to provide robust evidence about the effect of the updated guideline on the transparency of SRs.

Conclusions

Many studies have evaluated how well SRs adhere to the PRISMA Statement, and the pooled results of these studies suggest that reporting of many items is suboptimal. Little research has been done to design and test strategies to increase adherence to the PRISMA Statement or extensions. An update of the PRISMA Statement, followed by a toolkit of strategies to help journals endorse and implement the updated guideline, may improve the transparency of SRs.