Completeness of reporting in abstracts of randomized controlled trials in subscription and open access journals: cross-sectional study
Open access (OA) journals are becoming a publication standard for health research, but it is not clear how they differ from traditional subscription journals in the quality of research reporting. We assessed the completeness of results reporting in abstracts of randomized controlled trials (RCTs) published in these journals.
We used the Consolidated Standards of Reporting Trials Checklist for Abstracts (CONSORT-A) to assess the completeness of reporting in abstracts of parallel-design RCTs published in subscription journals (n = 149; New England Journal of Medicine, Journal of the American Medical Association, Annals of Internal Medicine, and Lancet) and OA journals (n = 119; BioMedCentral series, PLoS journals) in 2016 and 2017.
Abstracts in subscription journals completely reported 79% (95% confidence interval [CI], 77–81%) of 16 CONSORT-A items, compared with 65% (95% CI, 63–67%) of these items in abstracts from OA journals (P < 0.001, chi-square test). The median number of completely reported CONSORT-A items was 13 (95% CI, 12–13) in subscription journal articles and 11 (95% CI, 10–11) in OA journal articles. Subscription journal articles had significantly more complete reporting than OA journal articles for nine CONSORT-A items, and the two groups did not differ in reporting of the items trial design, outcome, randomization, blinding (masking), recruitment, and conclusions. OA journals were better than subscription journals at reporting the randomized study design in the title.
Abstracts of randomized controlled trials published in subscription medical journals have greater completeness of reporting than abstracts published in OA journals. OA journals should take appropriate measures to ensure that published articles contain adequate detail to facilitate understanding and quality appraisal of research reports about RCTs.
Keywords: Reporting guidelines, Randomized controlled trial, CONSORT for Abstracts, Open access publishing, Subscription journals
Abbreviations
AIM: Annals of Internal Medicine
BMJ: British Medical Journal
CONSORT-A: Consolidated Standards of Reporting Trials Checklist for Abstracts
ICMJE: International Committee of Medical Journal Editors
JAMA: Journal of the American Medical Association
NEJM: New England Journal of Medicine
PLoS: Public Library of Science
RCT: Randomized controlled trial
Randomized controlled trials (RCTs) are considered the best way to compare therapeutic or preventive interventions in medicine. Clear, transparent, and complete reporting of RCTs is necessary for their use in practice and in health evidence synthesis [2, 3]. It is important that RCT abstracts are also complete and clear, because trial validity and applicability can then be assessed quickly. Moreover, in some settings, such as developing countries, an abstract may be the only source of information available to health professionals because of limited access to full texts, and the use of abstracts as sole sources of information may adversely influence healthcare decisions. To improve the quality of reporting of RCT abstracts, an extension of the Consolidated Standards of Reporting Trials (CONSORT) statement was developed in 2008 [2, 3]. The Consolidated Standards of Reporting Trials Checklist for Abstracts (CONSORT-A) specifies a minimum set of items that authors should include in the abstract of an RCT. So far, evidence shows poor adherence to CONSORT-A in both general and specialty medical journals [4, 5, 6, 7].
Currently, more than half of the studies indexed in MEDLINE, the largest biomedical bibliographic database, are available in open access (OA). It was claimed that the advent of OA journals would lead to the erosion of scientific quality control. This opinion was based on the assumption that OA publishers would take over an increasing share of the publishing industry without providing the same level of rigorous peer review as traditional subscription publishers, resulting in a decline in the quality of scholarly publishing. However, there is evidence that the overall quality of OA journal publishing is comparable to that of traditional subscription publishing [10, 11]. The aim of this study was to assess the completeness of results reporting in abstracts of RCTs published in traditional subscription journals (members of the International Committee of Medical Journal Editors [ICMJE]) and in OA journals (the two oldest OA journal consortia: Public Library of Science [PLoS] journals and BioMedCentral [BMC] series journals).
This cross-sectional study included all abstracts of articles about RCTs published in four subscription journals (New England Journal of Medicine [NEJM], JAMA [Journal of the American Medical Association], Annals of Internal Medicine [AIM], and The Lancet) and two collections of OA journals (BMC series journals and PLoS journals) from January 2016 to December 2017. BMJ (British Medical Journal), although an ICMJE member, was not included in the subscription group because it offers a combination of OA and subscription publishing options and was previously a fully OA journal.
Two researchers independently screened the articles, including only those describing the basic study design for which CONSORT was developed: a randomized, double-blind, two-group parallel design. The following study designs were thus excluded: crossover trials, cluster trials, factorial studies, pragmatic studies, superiority trials, noninferiority trials, megatrials, sequential trials, open-label studies, nonblinded studies, single-blind studies, pilot studies, secondary analyses of primary trials, and combined studies (RCT plus other study designs). The literature search, outlined in Additional file 1, was undertaken in the MEDLINE database using the OvidSP interface.
The completeness of reporting in the abstracts was independently assessed by two researchers using the CONSORT-A checklist with 16 items. We did not include the item “authors” (i.e., “contact details for the corresponding author”), because this item is specific to conference abstracts. The completeness of reporting was presented as the percentage of articles in the two journal groups reporting the individual items, the average percentage and 95% confidence interval (CI) of reported items for the two journal groups, the median number (95% CI) of reported items for each article group, and the mean difference (95% CI) between abstracts published in 2016 and 2017. The results were compared using the chi-square test, t test, and Mann-Whitney test (MedCalc Statistical Software, Ostend, Belgium).
Articles in subscription journals had, on average, 79% (95% CI, 77–81%) of the 16 CONSORT-A items completely reported, compared with 65% (95% CI, 63–67%) for articles in OA journals (P < 0.001, chi-square test). Abstracts in subscription journals had a median of 13 (95% CI, 12–13) reported items and those in OA journals a median of 11 (95% CI, 10–11) reported items, of a total of 16 CONSORT-A items (P < 0.001, Mann-Whitney test).
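As a rough consistency check, the reported group-level 95% CIs match a simple normal-approximation (Wald) interval for a proportion computed over the pooled item-level assessments (e.g., 149 abstracts × 16 items for the subscription group). The pooling assumption is ours for illustration, not a method stated by the authors:

```python
import math

def prop_ci(p_hat: float, n: int, z: float = 1.96) -> tuple[float, float]:
    """Normal-approximation (Wald) 95% CI for a proportion p_hat observed in n trials."""
    se = math.sqrt(p_hat * (1 - p_hat) / n)
    return p_hat - z * se, p_hat + z * se

# Subscription journals: 79% of items complete, over 149 abstracts x 16 items
lo, hi = prop_ci(0.79, 149 * 16)
print(f"subscription: {lo:.0%}-{hi:.0%}")   # 77%-81%, matching the reported CI

# OA journals: 65% of items complete, over 119 abstracts x 16 items
lo_oa, hi_oa = prop_ci(0.65, 119 * 16)
print(f"OA: {lo_oa:.0%}-{hi_oa:.0%}")       # 63%-67%
```

The narrow widths of the reported intervals follow directly from the large number of pooled item-level observations.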
Number and percentage of articles (95% confidence interval for percentage) published in subscription or open access journals in 2016–2017 satisfying individual items on the CONSORT-A checklist

CONSORT-A item | Subscription journals (n = 149) | OA journals (n = 119)
1. Title | 86 (58%; 49–66%) | 95 (80%; 72–87%)
2. Trial design | 141 (95%; 89–97%) | 110 (92%; 86–96%)
3. Participants | 149 (100%; 97–100%) | 113 (95%; 89–98%)
4. Interventions | 149 (100%; 97–100%) | 115 (97%; 91–99%)
5. Objective | 132 (89%; 82–93%) | 115 (97%; 91–99%)
6. Outcome | 146 (98%; 94–99%) | 111 (93%; 87–97%)
7. Randomization | 46 (31%; 24–39%) | 46 (39%; 30–48%)
8. Blinding (masking) | 130 (87%; 81–92%) | 97 (82%; 73–88%)
9. Number randomized | 87 (58%; 50–66%) | 53 (45%; 36–54%)
10. Recruitment | 149 (100%; 97–100%) | 117 (98%; 93–100%)
11. Number analyzed | 71 (48%; 39–56%) | 29 (24%; 17–33%)
12. Outcome (results) | 64 (43%; 35–51%) | 15 (13%; 7–20%)
13. Harms | 114 (77%; 69–83%) | 33 (28%; 20–37%)
14. Conclusions | 149 (100%; 97–100%) | 116 (97%; 92–99%)
15. Trial registration | 149 (100%; 97–100%) | 100 (84%; 76–90%)
16. Funding | 112 (75%; 68–81%) | 2 (2%; 0–7%)

Group differences for individual items were assessed with the χ2 test.
Completeness of reporting was largely stable across the two publication years analyzed in our study, 2016 (total n = 145 abstracts) and 2017 (total n = 123 abstracts): 2016–2017 mean difference (MD), −4.07%; 95% CI, −8.11% to −0.02% for subscription journals (P = 0.0487, at the margin of statistical significance); and MD, 3.99%; 95% CI, −0.32% to 8.31% (P = 0.0692) for OA journals.
We found that abstracts of articles on RCTs published in subscription medical journals had better reporting completeness according to CONSORT-A than abstracts published in OA journals. There was no difference in the completeness of reporting between 2016 and 2017 in either journal group, indicating that this is a stable pattern reflecting standard practice rather than a temporal fluctuation. It is important to keep in mind that all journals included in our study explicitly state that they follow reporting standards set out in reporting guidelines, such as CONSORT.
The limitations of the study include the fact that we included only well-known traditional and OA journals, so the results may represent best practices and overestimate the completeness of reporting in health journals in general. We had very strict inclusion criteria and restricted the comparison to the two-group, double-blind, parallel trial design, which left out many other trial designs. The CONSORT statement was originally created for the “standard” two-group parallel design, and CONSORT-A was developed from the original CONSORT checklist; we therefore took this basic design as the inclusion criterion, because journals from the two groups in our study may differ in the types and complexity of the trials they publish, which could have introduced significant bias. The journals in our study were predominantly general medical journals published in developed countries, so they may not be fully representative of the general population of medical journals. We also assessed the completeness of results reporting in the abstracts and not in the full text. We decided to include only abstracts because they are available in bibliographical databases, which are often the primary route of access to information for many health professionals. This is especially true in settings where health professionals have limited access to full texts and read only the abstracts of journal articles. In such cases, inadequate reporting in abstracts could seriously mislead a reader regarding the interpretation of trial findings [15, 16]. Although an article abstract should be a clear and accurate reflection of what is included in the article, several studies have highlighted problems in the accuracy and quality of abstracts [17, 18, 19, 20].
The greatest differences between the OA and subscription journals were in the adequate description of outcomes and harms, both of which were reported more often in subscription than in OA journals. In general, selective underreporting of outcomes is a serious problem, particularly when harms are not reported [21, 22, 23]. Although subscription journals published this information at least twice as often as OA journals did, even their level of reporting of outcomes and harms fell short of complete reporting (43% of abstracts fully described outcomes and 77% described harms). This underreporting has serious consequences because it may impede the interpretation of the benefit-to-risk relationship.
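The size of the harms-reporting gap can be illustrated with a Pearson chi-square test on a 2 × 2 table built from the counts above (114 of 149 subscription abstracts vs. 33 of 119 OA abstracts reporting harms). The per-item P values are not given in this excerpt, so the sketch below is our illustration, using the closed-form statistic for a 2 × 2 table:

```python
import math

def chi2_2x2(a: int, b: int, c: int, d: int) -> tuple[float, float]:
    """Pearson chi-square (1 df) for a 2x2 table [[a, b], [c, d]], with its p-value."""
    n = a + b + c + d
    stat = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    # With 1 df, chi-square is the square of a standard normal deviate,
    # so P(chi2 > stat) = erfc(sqrt(stat / 2))
    p = math.erfc(math.sqrt(stat / 2))
    return stat, p

# Harms reported: 114/149 subscription abstracts vs. 33/119 OA abstracts
stat, p = chi2_2x2(114, 149 - 114, 33, 119 - 33)
print(f"chi2 = {stat:.1f}, p = {p:.2g}")  # chi2 = 63.6, p far below 0.001
```

Even without continuity correction, the statistic is so large that the conclusion (a highly significant between-group difference in harms reporting) is insensitive to the exact test variant used.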
Both journal groups largely adhered to the trial registration policy: all abstracts in subscription journals included trial registration numbers, compared with 84% in OA journals. In contrast, only 2% of the abstracts in OA journals reported funding, compared with 75% in subscription journals. It is difficult to draw conclusions about this difference in funding reporting, because only 2 of 119 abstracts in OA journals contained this information. However, it is clear that subscription journals practice greater transparency in reporting funding in abstracts of clinical trials.
A possible explanation for the observed differences in trial-reporting completeness is that subscription journals have more resources than OA journals, but this is most probably not the case for the journals included in our study. The OA journals in our study were the well-established PLoS journals and BMC series journals: PLoS journals were started with a US$9 million grant, and BMC series journals are published by Springer Nature, one of the largest scientific publishers. Article-processing fees are up to US$3000 for PLoS Medicine and US$3170 for BMC Medicine. It is difficult to compare the revenues of OA journals with those of the major ICMJE subscription journals in our study because their revenues are not generally known, but there is no reason to believe that the OA journals included in our study lacked the resources to implement reporting guidelines and ensure the completeness of published abstracts. All journals included in the study are selective and have high volumes of submissions, with an acceptance rate of approximately 5% for the subscription journals [29, 30, 31, 32]. In the OA group, PLoS Medicine has a 3% acceptance rate, whereas BMC series journals have higher acceptance rates, 45–55%, with some BMC journals having acceptance rates below 10%.
On the one hand, it can be argued that authors are responsible for the completeness of reporting of their studies, including in the abstract. On the other hand, it has been shown that editorial interventions after manuscript acceptance significantly improve the quality of abstracts. Journals are thus well positioned to ensure that reporting guidelines are followed. They can also help their authors by endorsing tools developed to improve the completeness of reports, such as the web writing tool based on CONSORT. Recent developments in this field include the Penelope decision-making tool, developed by Penelope Research and the Enhancing the Quality and Transparency of Health Research (EQUATOR) Network. The tool was tested in four BMC series journals in 2016, where it is presented to authors as an embedded element in the manuscript submission system. This suggests that OA journals are open to innovations for better reporting and may even be more advanced than subscription journals in that respect; at the same time, subscription journals traditionally offer full editorial support to authors to improve their manuscripts for publication, including abstracts [38, 39].
Our study showed that reporting of RCTs in article abstracts is less complete in OA journals than in subscription journals. OA journals should address this problem and demonstrate that they can publish high-quality articles. After the launch in Europe of the cOAlition S initiative to make full and immediate open access to research publications a reality by 2020, OA journals may gain even more importance in publishing. To fulfill their expected role, OA journals publishing health research should take appropriate measures to ensure that published articles contain adequate detail to facilitate understanding and quality appraisal of research reports about RCTs.
We appreciate the help with statistical analysis provided by Ivan Buljan, Department of Research in Biomedicine and Health, University of Split School of Medicine, Split, Croatia.
IJMC and AM designed and performed the study and wrote the manuscript. They are both responsible for all aspects of the study. Both authors read and approved the final manuscript.
This research was funded by the Croatian Science Foundation (grant no. IP-2014-09-7672, “Professionalism in Health Care”). The funder had no role in the design of the study, its execution, or the interpretation of data.
Ethics approval and consent to participate
Consent for publication
AM is a member of the Steering Group of the EQUATOR Network (https://www.equator-network.org/). She is the editor of an open access journal, Journal of Global Health. IJMC declares no conflicts of interest.
- 5. Can OS, Yilmaz AA, Hasdogan M, et al. Has the quality of abstracts for randomised controlled trials improved since the release of Consolidated Standards of Reporting Trial guideline for abstract reporting? A survey of four high-profile anaesthesia journals. Eur J Anaesthesiol. 2011;28:485–92.
- 9. National Institutes of Health (NIH). NIH public access policy. 2012. http://publicaccess.nih.gov/public_access_policy_implications_2012.pdf. Accessed 20 May 2019.
- 12. International Committee of Medical Journal Editors. Recommendations for the conduct, reporting, editing, and publication of scholarly work in medical journals. 2017 [updated Dec 2018]. http://www.icmje.org/recommendations/. Accessed 20 May 2019.
- 16. The impact of open access upon public health [editorial]. PLoS Med. 2006;3:e252.
- 24. Gordon and Betty Moore Foundation. Public Library of Science to launch new, free-access biomedical journals with $9 million grant. 2002. https://www.moore.org/article-detail?newsUrlName=public-library-of-science-to-launch-new-free-access-biomedical-journals-with-$9-million-grant-from-the-gordon-and-betty-moore-foundation. Accessed 20 May 2019.
- 25. Milliot J. The World’s 54 Largest Publishers, 2018. Publishers Weekly. 2018. https://www.publishersweekly.com/pw/by-topic/industry-news/publisher-news/article/78036-pearson-is-still-the-world-s-largest-publisher.html. Accessed 20 May 2019.
- 26. PLOS. Publication fees. https://www.plos.org/publication-fees. Accessed 20 May 2019.
- 27. BMC Medicine. Fees and funding: article-processing charges. https://bmcmedicine.biomedcentral.com/submission-guidelines/fees-and-funding. Accessed 20 May 2019.
- 29. New England Journal of Medicine. NEJM author center. https://www.nejm.org/author-center/home. Accessed 20 May 2019.
- 30. JAMA. JAMA Network for Authors: About JAMA. https://jamanetwork.com/journals/jama/pages/for-authors#fa-about. Accessed 20 May 2019.
- 31. Lancet. The Lancet: Information for Authors. https://www.thelancet.com/pb/assets/raw/Lancet/authors/lancet-information-for-authors.pdf. Accessed 20 May 2019.
- 32. Annals of Internal Medicine. Annals of Internal Medicine: Author info. https://annals.org/aim/pages/authors. Accessed 20 May 2019.
- 33. PLOS Medicine. PLOS Medicine: Journal Information. https://journals.plos.org/plosmedicine/s/journal-information. Accessed 20 May 2019.
- 34. BioMed Central. Publishing your research in BioMed Central journals. http://www.ibp.cas.cn/xxfw/xxsypxzn/200903/W020121031404887510757.pdf. Accessed 20 May 2019.
- 37. EQUATOR Network. Tools and templates for implementing reporting guidelines. https://www.equator-network.org/toolkits/using-guidelines-in-journals/tools-and-templates-for-implementing-reporting-guidelines/#wizard. Accessed 20 May 2019.
- 40. Science Europe. Open access. https://www.scienceeurope.org/coalition-s/. Accessed 20 May 2019.
Open Access. This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.