
Prevalence of potentially predatory publishing in Scopus on the country level

A Correction to this article was published on 17 May 2021


Abstract

We present the results of a large-scale study of potentially predatory journals (PPJ) represented in the Scopus database, which is widely used for research evaluation. Both journal metrics and country/disciplinary data have been evaluated for different groups of PPJ: those listed by Jeffrey Beall and those discontinued by Scopus because of “publication concerns”. Our results show that even years after being discontinued, hundreds of active potentially predatory journals remain highly visible in the Scopus database. PPJ papers are continuously produced by all major countries, but with different prevalence. Most All Science Journal Classification (ASJC) subject areas are affected, with the largest numbers of PPJ papers in engineering and medicine. On average, PPJ have much lower citation metrics than other Scopus-indexed journals. We conclude with a survey of the cases of Russia and Kazakhstan, where the share of PPJ papers in 2016 amounted to almost half of all Kazakhstan papers in Scopus. Our data suggest a relation between PPJ prevalence and national research evaluation policies. As such policies become more widespread, research into potentially predatory publishing will become increasingly important.
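The country-level prevalence analysis summarized above can be reproduced in outline from a paper-level Scopus export. The sketch below is a minimal illustration under assumed inputs, not the authors' actual pipeline; the file names and column names (`country`, `source_id`, `year`) are hypothetical placeholders, and the PPJ list stands in for the Beall-listed and discontinued sources used in the study.

```python
import pandas as pd

# Hypothetical inputs: a Scopus paper-level export and a list of Scopus source IDs
# flagged as potentially predatory (Beall-listed or discontinued for publication concerns).
papers = pd.read_csv("scopus_papers.csv")           # assumed columns: country, source_id, year
ppj_ids = set(pd.read_csv("ppj_sources.csv")["source_id"])

# Flag each paper as appearing in a potentially predatory journal.
papers["is_ppj"] = papers["source_id"].isin(ppj_ids)

# Share of PPJ papers per country and year, analogous to the country-level
# prevalence figures discussed in the paper (e.g. Kazakhstan in 2016).
share = (papers.groupby(["country", "year"])["is_ppj"]
               .mean()
               .rename("ppj_share")
               .reset_index())
print(share.sort_values("ppj_share", ascending=False).head(10))
```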




Notes

  1. Promotional info at http://clarivate.com/?product=web-of-science, accessed on 20 June 2017.

  2. Promotional info at https://www.elsevier.com/solutions/scopus/content/content-policy-and-selection, accessed on 21 June 2017.

  3. See http://www.arc.gov.au/news-media/media-releases/scopus-provide-citation-information-era, accessed on 21 June 2017 and currently available only via the Internet Archive: https://web.archive.org/web/20151010083040/https://www.arc.gov.au/news-media/media-releases/scopus-provide-citation-information-era.

  4. While “Web of Science” usually clearly refers to the relevant database, “Scopus” may also refer to Mount Scopus, a historical mountain in northeast Jerusalem, the Latin name of the bird species hamerkop (Scopus umbretta), or a specialist journal on East African ornithology.

  5. Open access scholarly literature is free of charge and often carries less restrictive copyright and licensing barriers than traditionally published works, for both users and authors.

  6. Oral evidence to the UK House of Commons Science & Technology Inquiry, 1 March 2004, by Sir Crispin Davis (CEO, Reed Elsevier); see https://publications.parliament.uk/pa/cm200304/cmselect/cmsctech/uc399-i/uc39902.htm.

  7. See https://www.plos.org/.

  8. See http://ulrichsweb.serialssolutions.com/.

  9. Elsevier’s site https://www.elsevier.com/solutions/scopus/how-scopus-works/content.

  10. Accessed in September 2018.

  11. See https://www.elsevier.com/solutions/scopus/how-scopus-works/content/content-policy-and-selection.

  12. Elsevier’s site https://www.elsevier.com/solutions/scopus/how-scopus-works/content.

  13. August 2016 version.

  14. The publication counts data were accessed in November 2018, so the analysis for 2018 is based on preliminary data.

  15. See the order of the Minister of Education and Science of Kazakhstan dated 31 March 2011, No. 127, Appendix 1: http://web.archive.org/web/20190905234744/https://egov.kz/cms/ru/law/list/V1100006951.

  16. https://5top100.ru/en/.

  17. http://rimc.uum.edu.my/index.php/blacklisted-journals-by-moe.

  18. The Jakarta Post article https://www.thejakartapost.com/news/2018/06/10/wanted-6000-new-journals-to-publish-150000-papers.html: “The regulation requires academics to publish at least one scientific paper in three years in an international or accredited journal. Another regulation has also contributed to the surge in published papers. Three years ago, the government issued Ministerial Regulation No. 44/2015 on higher education quality, which required every graduate student to publish one piece in an accredited journal and a doctoral candidate to publish a piece in an international journal”.

  19. First introduced in the order of the Minister of Education and Science of the Republic of Kazakhstan dated 31 March 2011, No. 127.

  20. See http://web.archive.org/save/http://adilet.zan.kz/rus/archive/docs/V1100006951/31.03.2011 and https://academy-gp.kz/?page_id=71&lang=en.

References

  1. Ajuwon, G., & Ajuwon, A. (2018). Predatory publishing and the dilemma of the Nigerian academic. African Journal of Biomedical Research, 21(1), 1–5.

  2. Bagues, M., Sylos-Labini, M., & Zinovyeva, N. (2017). A walk on the wild side: ’predatory’ journals and information asymmetries in scientific evaluations. IZA Discussion Papers (11041).

  3. Balehegn, M. (2017). Increased publication in predatory journals by developing countries’ institutions: What it entails? and what can be done? International Information and Library Review, 49(2), 97–100. https://doi.org/10.1080/10572317.2016.1278188.


  4. Beall, J. (2009). Bentham open. The Charleston Advisor, 11(1), 29–32.


  5. Beall, J. (2010). “Predatory” open-access scholarly publishers. The Charleston Advisor, 11(4), 10–17.


  6. Beall, J. (2016a). Beall’s list: Potential, possible, or probable predatory scholarly open-access publishers. https://web.archive.org/web/20160801084124/https://scholarlyoa.com/publishers/ (archived ed. 2016-08-01)

  7. Beall, J. (2016b). List of standalone journals: Potential, possible, or probable predatory scholarly open-access journals. https://web.archive.org/web/20160721165856/https://scholarlyoa.com/individual-journals/ (archived ed. 2016-07-21)

  8. Beall, J. (2017). What I learned from predatory publishers. Biochemia Medica, 27(2), 273–279.


  9. Berger, M., & Cirasella, J. (2015). Beyond Beall’s list: Better understanding predatory publishers. College and Research Libraries News, 76(3), 132–135.

  10. Biagioli, M., & Lippman, A. (Eds.). (2020). Gaming the Metrics: Misconduct and Manipulation in Academic Research. The MIT Press.

  11. Biagioli, M., Kenney, M., Martin, B., & Walsh, J. (2019). Academic misconduct, misrepresentation and gaming: A reassessment. Research Policy, 48(2), 401–413.


  12. Bloudoff-Indelicato, M. (2015). Backlash after Frontiers journals added to list of questionable publishers. Nature, 526(7575), 613. https://doi.org/10.1038/526613f.

  13. Bohannon, J. (2013). Who’s afraid of peer review? Science, 342(6154), 60–65.


  14. Colwell, R., Blouw, M., Butler, L., Cozzens, S., Feller, I., Gingras, Y., & Makarow, M. (2012). Informing research choices: Indicators and judgment. The Expert Panel on Science Performance and Research Funding.

  15. Cortegiani, A., Ippolito, M., Ingoglia, G., Manca, A., Cugusi, L., Severin, A., Strinzel, M., Panzarella, V., Campisi, G., Manoj, L., et al. (2020). Citations and metrics of journals discontinued from Scopus for publication concerns: The GhoS(t)copus Project [version 2]. F1000Research, 9, 415. https://doi.org/10.12688/f1000research.23847.2.

  16. Crawford, W. (2014). Ethics and access 1: The sad case of Jeffrey Beall. Cites and Insights, 14(4), 1–14.


  17. Cyranoski, D. (2018). China awaits controversial blacklist of ‘poor quality’ journals. Nature, 562(7728), 471–472. https://doi.org/10.1038/d41586-018-07025-5.


  18. Dahler-Larsen, P. (2011). The Evaluation Society. Stanford: Stanford University Press.


  19. Davis, P. (2009). Open access publisher accepts nonsense manuscript for dollars. Retrieved from The Scholarly Kitchen: https://scholarlykitchen.sspnet.org/2009/06/10/nonsense-for-dollars/

  20. Eriksson, S., & Helgesson, G. (2016). Where to publish and not to publish in bioethics. Retrieved from The Ethics Blog: https://ethicsblog.crb.uu.se/2016/04/19/where-to-publish-and-not-to-publish-in-bioethics/

  21. Eriksson, S., & Helgesson, G. (2017). The false academy: Predatory publishing in science and bioethics. Medicine, Health Care and Philosophy, 20(2), 163–170. https://doi.org/10.1007/s11019-016-9740-3.


  22. Esposito, J. (2013). Parting company with Jeffrey Beall. Retrieved from The Scholarly Kitchen: https://scholarlykitchen.sspnet.org/2013/12/16/parting-company-with-jeffrey-beall/

  23. Gläser, J., Lange, S., Laudel, G., & Schimank, U. (2010). Informed authority? The limited use of research evaluation systems for managerial control in universities. In R. Whitley, J. Gläser, & L. Engwall (Eds.), Reconfiguring Knowledge Production: Changing Authority Relationships in the Sciences and Their Consequences for Intellectual Innovation (pp. 149–369). Oxford: Oxford University Press.

  24. Guerrero-Bote, V., & Moya-Anegón, F. (2012). A further step forward in measuring journals’ scientific prestige: The SJR2 indicator. Journal of Informetrics, 6(4), 674–688. https://doi.org/10.1016/j.joi.2012.07.001.

  25. Harzing, A. W., & Alakangas, S. (2016). Google Scholar, Scopus and the Web of Science: A longitudinal and cross-disciplinary comparison. Scientometrics, 106(2), 787–804. https://doi.org/10.1007/s11192-015-1798-9.

  26. Ibba, S., Pani, F., Stockton, J., Barabino, G., Marchesi, M., & Tigano, D. (2017). Incidence of predatory journals in computer science literature. Library Review, 66(6–7), 505–522. https://doi.org/10.1108/LR-12-2016-0108.


  27. Kruskal, W. H., & Wallis, W. A. (1952). Use of ranks in one-criterion variance analysis. Journal of the American Statistical Association, 47(260), 583–621. https://doi.org/10.1080/01621459.1952.10483441.


  28. Leydesdorff, L., Wouters, P., & Bornmann, L. (2016). Professional and citizen bibliometrics: Complementarities and ambivalences in the development and use of indicators – a state-of-the-art report. Scientometrics, 109(3), 2129–2150. https://doi.org/10.1007/s11192-016-2150-8.

  29. Lin, S., & Zhan, L. (2014). Trash journals in China. Learned Publishing, 27(2), 145–154. https://doi.org/10.1087/20140208.


  30. Machacek, V., & Srholec, M. (2017). Predatory journals in Scopus. Project report. Available at: http://idea-en.cerge-ei.cz/files/IDEA_Study_2_2017_Predatory_journals_in_Scopus/mobile/index.html#p=3

  31. Machacek, V., & Srholec, M. (2019). Predatory publications in Scopus: Evidence on cross-country differences.

  32. Moed, H., Bar-Ilan, J., & Halevi, G. (2016). A new methodology for comparing Google Scholar and Scopus. Journal of Informetrics, 10(2), 533–551. https://doi.org/10.1016/j.joi.2016.04.017.

  33. Moed, H., Markusova, V., & Akoev, M. (2018). Trends in Russian research output indexed in Scopus and Web of Science. Scientometrics, 116(2), 1153–1180. https://doi.org/10.1007/s11192-018-2769-8.

  34. Mongeon, P., & Paul-Hus, A. (2016). The journal coverage of Web of Science and Scopus: A comparative analysis. Scientometrics, 106(1), 213–218. https://doi.org/10.1007/s11192-015-1765-5.

  35. Mouton, J., & Valentine, A. (2017). The extent of South African authored articles in predatory journals. South African Journal of Science, 113(7–8), 1–9.


  36. Nwagwu, W. E., & Ojemeni, O. (2015). Penetration of Nigerian predatory biomedical open access journals 2007–2012: A bibliometric study. Learned Publishing, 28(1), 23–34.


  37. Önder, Ç., & Erdil, S. (2017). Opportunities and opportunism: Publication outlet selection under pressure to increase research productivity. Research Evaluation, 26(2), 66–77. https://doi.org/10.1093/reseval/rvx006.


  38. Rijcke, S., Wouters, P. F., Rushforth, A. D., Franssen, T. P., & Hammarfelt, B. (2016). Evaluation practices and effects of indicator use – a literature review. Research Evaluation, 25(2), 161–169. https://doi.org/10.1093/reseval/rvv038.

  39. Roland, M. C. (2007). Publish and perish. Hedging and fraud in scientific discourse. EMBO Reports, 8(5), 424–428. https://doi.org/10.1038/sj.embor.7400964.


  40. Savina, T., & Sterligov, I. (2016). Potentially predatory journals in Scopus: Descriptive statistics and country-level dynamics [NWB’2016 presentation slides]. In 21st Nordic Workshop on Bibliometrics and Research Policy. https://doi.org/10.6084/m9.figshare.4249394.v1.

  41. Shamseer, L., Moher, D., Maduekwe, O., Turner, L., Barbour, V., Burch, R., et al. (2017). Potential predatory and legitimate biomedical journals: Can you tell the difference? A cross-sectional comparison. BMC Medicine, 15, 28. https://doi.org/10.1186/s12916-017-0785-9.


  42. Shen, C., & Björk, B. C. (2015). “Predatory” open access: a longitudinal study of article volumes and market characteristics. BMC Medicine, 13, 230. https://doi.org/10.1186/s12916-.


  43. Steele, C., Butler, L., & Kingsley, D. (2006). The publishing imperative: the pervasive influence of publication metrics. Learned Publishing, 19(4), 277–290. https://doi.org/10.1087/095315106778690751.


  44. Sterligov, I. (2020). Why blacklists matter. In Corruption in Higher Education (pp. 49–56). Brill Sense. https://doi.org/10.1163/9789004433885_008.

  45. Sterligov, I., & Savina, T. (2016). Riding with the metric tide: “Predatory” journals in Scopus. Higher Education in Russia and Beyond, 1(7), 9–12.

  46. Waltman, L. (2016). A review of the literature on citation impact indicators. Journal of Informetrics, 10(2), 365–391. https://doi.org/10.1016/j.joi.2016.02.007.


  47. Waltman, L., Eck, N., Leeuwen, T., & Visser, M. (2013). Some modifications to the SNIP journal impact indicator. Journal of Informetrics, 7(2), 272–285. https://doi.org/10.1016/j.joi.2012.11.011.

  48. Weingart, P. (2005). Impact of bibliometrics upon the science system: Inadvertent consequences? Scientometrics, 62(1), 117–131. https://doi.org/10.1007/s11192-005-0007-7.


  49. Wilsdon, J., Allen, L., Belfiore, E., Campbell, P., Curry, S., Hill, S., Jones, R., Kain, R., Kerridge, S., Thelwall, M., Tinkler, J., Viney, I., Wouters, P., Hill, J., & Johnson, B. (2015). The metric tide: Report of the Independent Review of the Role of Metrics in Research Assessment and Management. https://doi.org/10.13140/RG.2.1.4929.1363.

  50. Wouters, P., Thelwall, M., Kousha, K., Waltman, L., de Rijcke, S., Rushforth, A., & Franssen, T. (2015). The metric tide: Literature review (Supplementary Report I to the Independent Review of the Role of Metrics in Research Assessment and Management). HEFCE. https://doi.org/10.13140/RG.2.1.5066.3520.

  51. Xia, J., Harmon, J., Connolly, K., Donnelly, R., Anderson, M., & Howard, H. (2015). Who publishes in “predatory” journals? Journal of American Society for Information Science and Technology, 66(7), 1406–1417. https://doi.org/10.1002/asi.23265.


  52. Xia, J., Li, Y., & Situ, P. (2017). An overview of predatory journal publishing in Asia. Journal of East Asian Libraries, 2017(165), 4. https://scholarsarchive.byu.edu/jeal/vol2017/iss165/4.


Acknowledgements

The authors would like to thank Dmitrii Marin (University of Waterloo, Canada) and Alexei Lutay (Russian Foundation for Basic Research, Russia) for helpful detailed feedback and stimulating discussions.

Author information


Corresponding author

Correspondence to Tatiana Marina.

Appendices

Appendix 1: Kruskal–Wallis test

We tested the hypothesis that the median values of the journal metrics are equal across the Publication Concerns, Active PPJ and Inactive PPJ groups, using the Kruskal–Wallis criterion (Kruskal and Wallis 1952). There were statistically significant differences in the journal metrics depending on the journal group; see Tables 6 and 7, followed by a minimal replication sketch.

Table 6 Statistics: Kruskal–Wallis test with grouping variable Journals
Table 7 Ranks: Kruskal–Wallis test
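For readers who wish to reproduce this kind of comparison, the following is a minimal sketch of a Kruskal–Wallis test on a single journal metric for the three groups, using SciPy. The metric values are made-up placeholders, not the data behind Tables 6 and 7.

```python
from scipy import stats

# Hypothetical SJR values for the three journal groups compared in Appendix 1;
# the real analysis uses Scopus journal metrics for the Publication Concerns,
# Active PPJ and Inactive PPJ groups.
publication_concerns = [0.12, 0.15, 0.10, 0.18, 0.14]
active_ppj = [0.20, 0.22, 0.19, 0.25, 0.21]
inactive_ppj = [0.30, 0.28, 0.35, 0.27, 0.33]

# H statistic and p-value; a small p-value indicates that at least one group's
# metric distribution differs from the others.
h_stat, p_value = stats.kruskal(publication_concerns, active_ppj, inactive_ppj)
print(f"H = {h_stat:.3f}, p = {p_value:.4f}")
```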

Appendix 2: Rules for awarding academic degrees

According to the Rules (footnote 19) for awarding academic degrees of the Ministry of Education and Science of the Republic of Kazakhstan, a dissertation is written under the guidance of domestic and foreign supervisors who hold academic degrees and are specialists in the doctoral student's field of research. The main findings of the dissertation must be published in at least 7 publications on the topic of the dissertation, including at least 3 in scientific publications recommended by the authorized body, 1 in an international scientific publication that has a non-zero impact factor in Web of Science or is indexed in Scopus, and 3 in the materials of international conferences, including 1 in the materials of foreign conferences (footnote 20).
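As an illustration only, the publication requirement can be expressed as a simple check. The encoding below (field names and publication categories) is a hypothetical sketch of the rule as paraphrased above, not an official implementation.

```python
# Hypothetical encoding of the doctoral publication rule described in Appendix 2:
# at least 7 publications in total, of which at least 3 in journals recommended by
# the authorized body, at least 1 in a journal with a non-zero Web of Science impact
# factor or indexed in Scopus, and at least 3 in international conference materials,
# of which at least 1 in foreign conference materials.
def meets_requirements(pubs):
    recommended = sum(p["type"] == "recommended_journal" for p in pubs)
    wos_scopus = sum(p["type"] == "wos_or_scopus_journal" for p in pubs)
    intl_conf = sum(p["type"] in ("international_conference", "foreign_conference") for p in pubs)
    foreign_conf = sum(p["type"] == "foreign_conference" for p in pubs)
    return (len(pubs) >= 7 and recommended >= 3 and wos_scopus >= 1
            and intl_conf >= 3 and foreign_conf >= 1)

# Example: a record with exactly the minimum mix of publication types.
record = ([{"type": "recommended_journal"}] * 3
          + [{"type": "wos_or_scopus_journal"}]
          + [{"type": "international_conference"}] * 2
          + [{"type": "foreign_conference"}])
print(meets_requirements(record))  # True
```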

Appendix 3: List of potentially predatory journals

See Table 8.

Table 8 List of potentially predatory journals


About this article


Cite this article

Marina, T., Sterligov, I. Prevalence of potentially predatory publishing in Scopus on the country level. Scientometrics 126, 5019–5077 (2021). https://doi.org/10.1007/s11192-021-03899-x


Keywords

  • Potentially predatory journals
  • Government publishing policy
  • Publication concerns
  • Scopus database
  • Bibliometric analysis