
Prevalence of potentially predatory publishing in Scopus on the country level

A Correction to this article was published on 17 May 2021



We present the results of a large-scale study of potentially predatory journals (PPJ) represented in the Scopus database, which is widely used for research evaluation. Both journal metrics and country/disciplinary data have been evaluated for different groups of PPJ: those listed by Jeffrey Beall and those discontinued by Scopus because of “publication concerns”. Our results show that even years after discontinuation, hundreds of active potentially predatory journals remain highly visible in the Scopus database. PPJ papers are continuously produced by all major countries, but with different prevalence. Most All Science Journal Classification (ASJC) subject areas are affected, with the largest numbers of PPJ papers in engineering and medicine. On average, PPJ have much lower citation metrics than other Scopus-indexed journals. We conclude with a case study of Russia and Kazakhstan, where in 2016 PPJ papers amounted to almost half of all Kazakhstan papers in Scopus. Our data suggest a relation between PPJ prevalence and national research evaluation policies. As such policies become more widespread, research on potentially predatory journals will become increasingly important.





  1. Promotional info at accessed on 20 June 2017.

  2. Promotional info at accessed on 21 June 2017.

  3. See accessed on 21 June 2017 and currently available only via the Internet Archive:

  4. While “Web of Science” usually clearly refers to the relevant database, “Scopus” may also refer to Mount Scopus, a historic mountain in northeast Jerusalem; to the Latin name of the bird species Hamerkop (Scopus umbretta); or to a specialist journal on East African ornithology.

  5. Open access scholarly literature is free of charge and often carries less restrictive copyright and licensing barriers than traditionally published works, for both users and authors.

  6. Oral evidence to the UK House of Commons Science & Technology Inquiry, 1 March 2004, Sir Crispin Davis (CEO, Reed Elsevier), see

  7.

  8.

  9. Elsevier’s site

  10. Accessed in September 2018.

  11.

  12. Elsevier’s site

  13. August 2016 version.

  14. The publication counts were accessed in November 2018, so the analysis for 2018 was based on preliminary data.

  15. See the order of the Minister of Education and Science of Kazakhstan No. 127 of 31 March 2011, Appendix 1.

  16.

  17.

  18. The Jakarta Post article states: “The regulation requires academics to publish at least one scientific paper in three years in an international or accredited journal. Another regulation has also contributed to the surge in published papers. Three years ago, the government issued Ministerial Regulation No. 44/2015 on higher education quality, which required every graduate student to publish one piece in an accredited journal and a doctoral candidate to publish a piece in an international journal.”

  19. First introduced in the order of the Minister of Education and Science of the Republic of Kazakhstan No. 127 of 31 March 2011.

  20. See and


  1. Ajuwon, G., & Ajuwon, A. (2018). Predatory publishing and the dilemma of the Nigerian academic. African Journal of Biomedical Research, 21(1), 1–5.

  2. Bagues, M., Sylos-Labini, M., & Zinovyeva, N. (2017). A walk on the wild side: ’Predatory’ journals and information asymmetries in scientific evaluations. IZA Discussion Papers (11041).

  3. Balehegn, M. (2017). Increased publication in predatory journals by developing countries’ institutions: What it entails? And what can be done? International Information and Library Review, 49(2), 97–100.

  4. Beall, J. (2009). Bentham Open. The Charleston Advisor, 11(1), 29–32.

  5. Beall, J. (2010). “Predatory” open-access scholarly publishers. The Charleston Advisor, 11(4), 10–17.

  6. Beall, J. (2016a). Beall’s list: Potential, possible, or probable predatory scholarly open-access publishers. (Archived version, 2016-08-1.)

  7. Beall, J. (2016b). List of standalone journals: Potential, possible, or probable predatory scholarly open-access journals. (Archived version, 2016-07-21.)

  8. Beall, J. (2017). What I learned from predatory publishers. Biochemia Medica, 27(2), 273–279.

  9. Berger, M., & Cirasella, J. (2015). Beyond Beall’s list: Better understanding predatory publishers. College and Research Libraries News, 76(3), 132–135.

  10. Biagioli, M., & Lippman, A. (Eds.). (2020). Gaming the metrics: Misconduct and manipulation in academic research. The MIT Press.

  11. Biagioli, M., Kenney, M., Martin, B., & Walsh, J. (2019). Academic misconduct, misrepresentation and gaming: A reassessment. Research Policy, 48(2), 401–413.

  12. Bloudoff-Indelicato, M. (2015). Backlash after Frontiers journals added to list of questionable publishers. Nature, 526(7575), 613.

  13. Bohannon, J. (2013). Who’s afraid of peer review? Science, 342(6154), 60–65.

  14. Colwell, R., Blouw, M., Butler, L., Cozzens, S., Feller, I., Gingras, Y., & Makarow, M. (2012). Informing research choices: Indicators and judgment. The Expert Panel on Science Performance and Research Funding.

  15. Cortegiani, A., Ippolito, M., Ingoglia, G., Manca, A., Cugusi, L., Severin, A., Strinzel, M., Panzarella, V., Campisi, G., Manoj, L., et al. (2020). Citations and metrics of journals discontinued from Scopus for publication concerns: The GhoS(t)copus Project [version 2]. F1000Research, 9, 415.

  16. Crawford, W. (2014). Ethics and access 1: The sad case of Jeffrey Beall. Cites and Insights, 14(4), 1–14.

  17. Cyranoski, D. (2018). China awaits controversial blacklist of ‘poor quality’ journals. Nature, 562(7728), 471–472.

  18. Dahler-Larsen, P. (2011). The evaluation society. Stanford: Stanford University Press.

  19. Davis, P. (2009). Open access publisher accepts nonsense manuscript for dollars. The Scholarly Kitchen.

  20. Eriksson, S., & Helgesson, G. (2016). Where to publish and not to publish in bioethics. The Ethics Blog.

  21. Eriksson, S., & Helgesson, G. (2017). The false academy: Predatory publishing in science and bioethics. Medicine, Health Care and Philosophy, 20(2), 163–170.

  22. Esposito, J. (2013). Parting company with Jeffrey Beall. The Scholarly Kitchen.

  23. Gläser, J., Lange, S., Laudel, G., & Schimank, U. (2010). Informed authority? The limited use of research evaluation systems for managerial control in universities. In R. Whitley, J. Gläser, & L. Engwall (Eds.), Reconfiguring knowledge production: Changing authority relationships in the sciences and their consequences for intellectual innovation (pp. 149–369). Oxford: Oxford University Press.

  24. Guerrero-Bote, V., & Moya-Anegón, F. (2012). A further step forward in measuring journals’ scientific prestige: The SJR2 indicator. Journal of Informetrics, 6(4), 674–688.

  25. Harzing, A. W., & Alakangas, S. (2016). Google Scholar, Scopus and the Web of Science: A longitudinal and cross-disciplinary comparison. Scientometrics, 106(2), 787–804.

  26. Ibba, S., Pani, F., Stockton, J., Barabino, G., Marchesi, M., & Tigano, D. (2017). Incidence of predatory journals in computer science literature. Library Review, 66(6–7), 505–522.

  27. Kruskal, W. H., & Wallis, W. A. (1952). Use of ranks in one-criterion variance analysis. Journal of the American Statistical Association, 47(260), 583–621.

  28. Leydesdorff, L., Wouters, P., & Bornmann, L. (2016). Professional and citizen bibliometrics: Complementarities and ambivalences in the development and use of indicators: A state-of-the-art report. Scientometrics, 109(3), 2129–2150.

  29. Lin, S., & Zhan, L. (2014). Trash journals in China. Learned Publishing, 27(2), 145–154.

  30. Machacek, V., & Srholec, M. (2017). Predatory journals in Scopus. Project report.

  31. Machacek, V., & Srholec, M. (2019). Predatory publications in Scopus: Evidence on cross-country differences.

  32. Moed, H., Bar-Ilan, J., & Halevi, G. (2016). A new methodology for comparing Google Scholar and Scopus. Journal of Informetrics, 10(2), 533–551.

  33. Moed, H., Markusova, V., & Akoev, M. (2018). Trends in Russian research output indexed in Scopus and Web of Science. Scientometrics, 116(2), 1153–1180.

  34. Mongeon, P., & Paul-Hus, A. (2016). The journal coverage of Web of Science and Scopus: A comparative analysis. Scientometrics, 106(1), 213–218.

  35. Mouton, J., & Valentine, A. (2017). The extent of South African authored articles in predatory journals. South African Journal of Science, 113(7–8), 1–9.

  36. Nwagwu, W. E., & Ojemeni, O. (2015). Penetration of Nigerian predatory biomedical open access journals 2007–2012: A bibliometric study. Learned Publishing, 28(1), 23–34.

  37. Önder, Ç., & Erdil, S. (2017). Opportunities and opportunism: Publication outlet selection under pressure to increase research productivity. Research Evaluation, 26(2), 66–77.

  38. Rijcke, S., Wouters, P. F., Rushforth, A. D., Franssen, T. P., & Hammarfelt, B. (2016). Evaluation practices and effects of indicator use: A literature review. Research Evaluation, 25(2), 161–169.

  39. Roland, M. C. (2007). Publish and perish: Hedging and fraud in scientific discourse. EMBO Reports, 8(5), 424–428.

  40. Savina, T., & Sterligov, I. (2016). Potentially predatory journals in Scopus: Descriptive statistics and country-level dynamics [NWB’2016 presentation slides]. In 21st Nordic Workshop on Bibliometrics and Research Policy.

  41. Shamseer, L., Moher, D., Maduekwe, O., Turner, L., Barbour, V., Burch, R., et al. (2017). Potential predatory and legitimate biomedical journals: Can you tell the difference? A cross-sectional comparison. BMC Medicine, 15, 28.

  42. Shen, C., & Björk, B. C. (2015). “Predatory” open access: A longitudinal study of article volumes and market characteristics. BMC Medicine, 13, 230.

  43. Steele, C., Butler, L., & Kingsley, D. (2006). The publishing imperative: The pervasive influence of publication metrics. Learned Publishing, 19(4), 277–290.

  44. Sterligov, I. (2020). Why blacklists matter. In Corruption in higher education (pp. 49–56). Brill Sense.

  45. Sterligov, I., & Savina, T. (2016). Riding with the metric tide: “Predatory” journals in Scopus. Higher Education in Russia and Beyond, 1(7), 9–12.

  46. Waltman, L. (2016). A review of the literature on citation impact indicators. Journal of Informetrics, 10(2), 365–391.

  47. Waltman, L., Eck, N., Leeuwen, T., & Visser, M. (2013). Some modifications to the SNIP journal impact indicator. Journal of Informetrics, 7(2), 272–285.

  48. Weingart, P. (2005). Impact of bibliometrics upon the science system: Inadvertent consequences? Scientometrics, 62(1), 117–131.

  49. Wilsdon, J., Allen, L., Belfiore, E., Campbell, P., Curry, S., Hill, S., Jones, R., Kain, R., Kerridge, S., Thelwall, M., Tinkler, J., Viney, I., Wouters, P., Hill, J., & Johnson, B. (2015). The metric tide: Report of the Independent Review of the Role of Metrics in Research Assessment and Management.

  50. Wouters, P., Thelwall, M., Kousha, K., Waltman, L., Rijcke de, S., Rushforth, A., & Franssen, T. (2015). The metric tide: Literature review (Supplementary Report I to the Independent Review of the Role of Metrics in Research Assessment and Management). HEFCE.

  51. Xia, J., Harmon, J., Connolly, K., Donnelly, R., Anderson, M., & Howard, H. (2015). Who publishes in “predatory” journals? Journal of the Association for Information Science and Technology, 66(7), 1406–1417.

  52. Xia, J., Li, Y., & Situ, P. (2017). An overview of predatory journal publishing in Asia. Journal of East Asian Libraries, 2017(165), 4.



Acknowledgements

The authors would like to thank Dmitrii Marin (University of Waterloo, Canada) and Alexei Lutay (Russian Foundation for Basic Research, Russia) for helpful detailed feedback and stimulating discussions.

Author information



Corresponding author

Correspondence to Tatiana Marina.


Appendix 1: Kruskal–Wallis test

We tested the hypothesis that the median values of the journal metrics are equal across the Publication Concerns, Active PPJ, and Inactive PPJ groups, using the Kruskal–Wallis test (Kruskal and Wallis 1952). There were statistically significant differences in the journal metrics depending on the journal group; see Tables 6 and 7.

Table 6 Statistics: Kruskal–Wallis test with grouping variable Journals
Table 7 Ranks: Kruskal–Wallis test
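For readers who wish to reproduce this kind of group comparison, the following is a minimal pure-Python sketch of the Kruskal–Wallis H statistic (with the standard tie correction). The three input groups here are invented toy data, not the journal metric values from the paper; in practice one would feed in, e.g., the citation metric values of the three journal groups and compare H against a chi-squared distribution with k − 1 degrees of freedom.

```python
from itertools import chain

def kruskal_h(*groups):
    """Kruskal-Wallis H statistic for k independent samples.

    Pools all observations, assigns average ranks to ties, and
    computes H = 12/(N(N+1)) * sum(R_i^2 / n_i) - 3(N+1),
    divided by the tie-correction factor when ties are present.
    """
    data = sorted(chain.from_iterable(groups))
    n = len(data)
    # Average rank for each distinct value (handles ties).
    ranks = {}
    i = 0
    while i < n:
        j = i
        while j < n and data[j] == data[i]:
            j += 1
        ranks[data[i]] = (i + 1 + j) / 2  # mean of 1-based ranks i+1 .. j
        i = j
    rank_sums = [sum(ranks[x] for x in g) for g in groups]
    h = 12 / (n * (n + 1)) * sum(
        r * r / len(g) for r, g in zip(rank_sums, groups)
    ) - 3 * (n + 1)
    # Tie correction: divide by 1 - sum(t^3 - t) / (N^3 - N).
    ties = sum(t ** 3 - t for t in (data.count(v) for v in set(data)))
    if ties:
        h /= 1 - ties / (n ** 3 - n)
    return h

# Toy example with three fully separated groups: H = 7.2.
print(kruskal_h([1, 2, 3], [4, 5, 6], [7, 8, 9]))
```

With three groups, H is compared against the chi-squared distribution with 2 degrees of freedom, so H = 7.2 exceeds the 5% critical value of about 5.99 and the equal-medians hypothesis would be rejected for this toy data.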

Appendix 2: Rules for awarding academic degrees

According to the RulesFootnote 19 for awarding academic degrees of the Ministry of Education and Science of the Republic of Kazakhstan, a dissertation is written under the guidance of domestic and foreign supervisors who hold academic degrees and are specialists in the doctoral student’s field of research. The main findings of the dissertation must be published in at least 7 publications on the dissertation topic, including: at least 3 in scientific publications recommended by the authorized body; 1 in an international scientific publication that has a non-zero impact factor in Web of Science or is indexed in Scopus; and 3 in the proceedings of international conferences, including 1 in the proceedings of foreign conferences.Footnote 20

Appendix 3: List of potentially predatory journals

See Table 8.

Table 8 List of potentially predatory journals


About this article


Cite this article

Marina, T., Sterligov, I. Prevalence of potentially predatory publishing in Scopus on the country level. Scientometrics 126, 5019–5077 (2021).

Keywords


  • Potentially predatory journals
  • Government publishing policy
  • Publication concerns
  • Scopus database
  • Bibliometric analysis