Impact and perceived value of journal reporting guidelines among Radiology authors and reviewers

  • Marc Dewey
  • Deborah Levine
  • Patrick M. Bossuyt
  • Herbert Y. Kressel
Imaging Informatics and Artificial Intelligence



Objectives

To analyse the author-perceived impact on the final manuscript and the perceived value of journal reporting guidelines among Radiology authors and reviewers.


Methods

This survey was conducted among all corresponding authors of original research submissions to Radiology. Separately, we surveyed active Radiology reviewers. Results were analysed using multivariable logistic regression.


Results

Overall, 60% of authors (831/1391) completed the survey. Only 15% (120/821) had used the guideline and checklist when designing the study; this was significantly more common among PRISMA users (55%, 16/29) than among STARD (17%, 52/310; p < 0.001) and STROBE users (10%, 46/443; p < 0.001). For 23% of the surveyed manuscripts (189/821), authors used the guidelines when writing the manuscript; these authors more often reported an impact on the final manuscript (i.e. changes in the content; 57%, 107/189) than those who used the guideline when submitting the manuscript (35%, 95/272; OR 0.433, 95% confidence interval [CI] 0.288–0.648, p < 0.001) or only when the checklist was requested by the editorial office (17%, 41/240; OR 0.156, CI 0.097–0.247, p < 0.001). The perceived value of the reporting guideline was rated significantly higher the earlier authors used it in the research process (p < 0.001). The checklist was used by 77% of reviewers (200/259) some or all of the time; 60% (119/199) said it affected their reviews.
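To illustrate the arithmetic behind comparisons like the one above, the sketch below computes a crude (unadjusted) odds ratio with a Wald 95% confidence interval from the reported 2×2 counts. Note that the odds ratios reported in the article (e.g. 0.433) come from a multivariable logistic regression, so the crude value obtained this way is close but not identical; the function name and code are illustrative, not the authors' analysis.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio (a*d)/(b*c) with a Wald 95% confidence interval.

    a, b: counts with / without the outcome in the first group
    c, d: counts with / without the outcome in the reference group
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # standard error of ln(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Counts reported above: perceived impact when the guideline was used at
# submission (95 of 272) versus when it was used while writing (107 of 189).
a, b = 95, 272 - 95    # submission: with / without perceived impact
c, d = 107, 189 - 107  # writing:    with / without perceived impact
or_, lo, hi = odds_ratio_ci(a, b, c, d)
print(f"crude OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
# → crude OR 0.41 (95% CI 0.28-0.60), consistent with the adjusted OR 0.433
```

The crude estimate (0.41, CI 0.28–0.60) differs slightly from the published adjusted value because the regression controls for other covariates.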


Conclusion

Reporting guidelines had more author-perceived impact on the final manuscript and higher perceived value the earlier they were used, suggesting that there is a need for enhanced education on the use of these guidelines.

Key Points

• Only 15% of authors had used the respective reporting guideline and checklist when designing the study.

• Almost 4 out of 5 Radiology authors and half of reviewers judged the guideline checklists to be useful or very useful.

• Reporting guidelines had more author-perceived impact on manuscripts, i.e. changes that were made in the final manuscript, the earlier authors used them in the research process.


Keywords

Randomised controlled trial · Clinical trial · Diagnostic imaging · Information dissemination · Surveys and questionnaires



Abbreviations

CONSORT: Consolidated Standards of Reporting Trials

PRISMA: Preferred Reporting Items for Systematic reviews and Meta-Analyses

STARD: Standards for Reporting Diagnostic Accuracy

STROBE: Strengthening the Reporting of Observational Studies in Epidemiology



Acknowledgements

We thank all authors and reviewers for their participation in this study. Without the help of the staff at the Radiological Society of North America (RSNA), this study would not have been possible. We also thank other members of the editorial board of Radiology for helpful discussions. We are grateful to Bettina Herwig for copy editing, to Kyllan Wescott for data management, and to Ivan Perez for help with the multivariable analysis.


Funding

Prof. Dewey has received support for this study from the International Society for Strategic Studies in Radiology (Young Leaders Club).

Compliance with ethical standards


Guarantor

The scientific guarantor of this publication is Prof. Dr. Marc Dewey.

Conflict of interest

The authors of this manuscript declare no relevant relationships with companies and no conflicts of interest.

Statistics and biometry

No complex statistical methods were necessary for this paper.

Informed consent

This investigation received a waiver of exemption from the committee on clinical investigations at Beth Israel Deaconess Medical Center.

Ethical approval

This investigation received a waiver of exemption from the committee on clinical investigations at Beth Israel Deaconess Medical Center.


Methodology

• Prospective

Supplementary material

ESM 1: 330_2018_5980_MOESM1_ESM.pdf (PDF 182 kb)
ESM 2: 330_2018_5980_MOESM2_ESM.pdf (PDF 184 kb)



Copyright information

© European Society of Radiology 2019

Authors and Affiliations

  1. Charité – Universitätsmedizin Berlin, Humboldt-Universität and Freie Universität zu Berlin, Berlin, Germany
  2. Beth Israel Deaconess Medical Center, Boston, USA
  3. Harvard Medical School and Radiology, Boston, USA
  4. Academic Medical Center, University of Amsterdam, Amsterdam, The Netherlands
