
Is the soundness-only quality control policy of open access mega journals linked to a higher rate of published errors?


Open access mega journals (OAMJs) are broad-scope, centralized open access journals that have become profitable outlets for accumulating large volumes of research from multiple fields of study, including papers rejected by other journals of the same publisher. Some OAMJs charge hefty article processing charges, exceeding US$1000. Another characteristic of OAMJs is a large editorial board. Björk (2015) identified soundness-only prepublication peer review as a defining characteristic of an OAMJ, i.e., novelty, significance, relevance and impact are assessed only post-publication. However, such a premise ignores the inherent bias of peer reviewers. This paper challenges that controversial claim by assessing whether there is a link between research output (the number of papers published in each OAMJ) and the number of errata, including retractions. We assessed 16 OAMJs indexed in Clarivate Analytics’ Web of Science and found wide variation in published errata and retractions across OAMJs between 2012 and 2018. PLOS ONE had the highest correction rate (3.159%), followed by Medicine (3.158%), BMJ Open (2.949%) and Scientific Reports (2.896%). In contrast, PeerJ and Elementa: Science of the Anthropocene did not publish any errata in 2012–2018, while IEEE Access had a correction rate of 0.059%. As for retractions, the highest shares of retracted publications were seen in Medicine (0.079%), Cell Reports (0.035%) and PLOS ONE (0.030%), while nine of the 16 studied OAMJs had no retracted publications during 2012–2018. We conclude that there is wide variation in “quality control”, as assessed through errata and retractions, among OAMJs. We therefore recommend that the “soundness-only peer review” prerequisite for OAMJs be scrapped.
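The correction and retraction rates reported above are simple ratios of notices (errata or retractions) to total publications per journal over 2012–2018, expressed as percentages. A minimal sketch of that calculation (the function name and the counts below are illustrative placeholders, not the study’s actual data):

```python
def notice_rate(notices: int, publications: int) -> float:
    """Share of publications with a given notice type (errata or
    retractions), expressed as a percentage."""
    if publications <= 0:
        raise ValueError("publication count must be positive")
    return 100.0 * notices / publications

# Illustrative placeholder counts: a journal that published 200,000
# papers and issued 6,318 errata over the study window.
print(round(notice_rate(6318, 200_000), 3))  # 3.159
```

A journal with zero notices (such as PeerJ in the study window) simply yields a rate of 0%, which is why a per-journal comparison of these ratios can expose the wide variation the paper reports.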





  1. Björk, B. C. (2015). Have the “mega-journals” reached the limits to growth? PeerJ, 3, e981.

  2. Björk, B. C. (2018a). Publishing speed and acceptance rates of open-access mega journals. Online Information Review.

  3. Björk, B. C. (2018b). Evolution of the scholarly mega-journal, 2006–2017. PeerJ, 6, e4357.

  4. Bornmann, L. (2011). Scientific peer review. Annual Review of Information Science and Technology, 45(1), 197–245.

  5. Erfanmanesh, M. (2019). Quantitative portrait of open access mega-journals. Malaysian Journal of Library and Information Science (in press).

  6. Fraumann, G. (2018). The values and limits of altmetrics. New Directions for Institutional Research, 178, 53–69.

  7. Haustein, S. (2012). Multidimensional journal evaluation: Analyzing scientific periodicals beyond the impact factor. Walter de Gruyter Saur.

  8. Lee, C. J., Sugimoto, C. R., Zhang, G., & Cronin, B. (2013). Bias in peer review. Journal of the American Society for Information Science and Technology, 64(1), 2–17.

  9. Molckovsky, A., Vickers, M. M., & Tang, P. A. (2011). Characterization of published errors in high-impact oncology journals. Current Oncology, 18(1), 26–32.

  10. Shin, E. J. (2017). Can the growth of mega-journals affect authors’ choice of journal? Serials Review, 43(2), 137–146.

  11. Solomon, D. J. (2014). A survey of authors publishing in four megajournals. PeerJ, 2, e365.

  12. Spezi, V., Wakeling, S., Pinfield, S., Creaser, C., Fry, J., & Willett, P. (2017). Open-access mega-journals: The future of scholarly communication or academic dumping ground? A review. Journal of Documentation, 73(2), 263–283.

  13. Spezi, V., Wakeling, S., Pinfield, S., Creaser, C., Fry, J., & Willett, P. (2018). “Let the community decide”? The vision and reality of soundness-only peer review in open-access mega-journals. Journal of Documentation, 74(1), 137–161.

  14. Teixeira da Silva, J. A. (2016). An error is an error is an erratum. The ethics of not correcting errors in the science literature. Publishing Research Quarterly, 32(3), 220–226.

  15. Teixeira da Silva, J. A., & Dobránszki, J. (2017). Notices and policies for retractions, expressions of concern, errata and corrigenda: Their importance, content, and context. Science and Engineering Ethics, 23(2), 521–554.

  16. Teixeira da Silva, J. A., & Shaughnessy, M. F. (2017). An interview with Jaime A. Teixeira da Silva: Insight into improving the efficiency of the publication process. North American Journal of Psychology, 19(2), 325–338.

  17. Teixeira da Silva, J. A., Tsigaris, P., & Al-Khatib, A. (2019). Open access mega-journals: Quality, economics and post-publication peer review infrastructure. Publishing Research Quarterly.

  18. Wakeling, S., Willett, P., Creaser, C., Fry, J., Pinfield, S., & Spezi, V. (2016). Open-access mega-journals: A bibliometric profile. PLoS ONE, 11(11), e0165359.

  19. Wakeling, S., Willett, P., Creaser, C., Fry, J., Pinfield, S., & Spezi, V. (2017). Transitioning from a conventional to a ‘mega’ journal: A bibliometric case study of the journal Medicine. Publications, 5, 7.

  20. Wiser, J. (2014). The future of serials: A publisher's perspective. Serials Review, 40(4), 238–241.


Author information

Correspondence to Mohammadamin Erfanmanesh or Jaime A. Teixeira da Silva.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.


About this article


Cite this article

Erfanmanesh, M., Teixeira da Silva, J.A. Is the soundness-only quality control policy of open access mega journals linked to a higher rate of published errors? Scientometrics 120, 917–923 (2019).



Keywords

  • Article processing charge
  • Journal cascading
  • OA
  • OAMJ
  • Post-publication peer review