Volume 116, Issue 2, pp 1153–1180

Trends in Russian research output indexed in Scopus and Web of Science

  • Henk F. Moed
  • Valentina Markusova
  • Mark Akoev


Abstract

Trends are analysed in the annual number of documents published by Russian institutions and indexed in Scopus and Web of Science, with special attention to the period from 2013 onwards, when Project 5-100 was launched by the Russian Government. Counts are broken down by document type, publication language, source type, research discipline, country and source. It is concluded that Russian publication counts depend strongly on the database used and on changes in database coverage, and that caution is warranted when using indicators derived from WoS, and especially from Scopus, as tools for measuring the research performance and international orientation of the Russian science system.


Keywords: Scopus · WoS · Article · Review · Proceedings · Compound annual growth rate · Citation · Russia · Research discipline · Project 5-100 · Publication language
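One of the indicators listed above, the compound annual growth rate (CAGR), summarises a trend in annual publication counts as a single per-year growth percentage. As a reminder of how it is computed (the publication counts below are made-up illustrative numbers, not figures from the paper):

```python
def cagr(first_count: float, last_count: float, n_years: int) -> float:
    """Compound annual growth rate over n_years one-year intervals:
    the constant yearly growth factor that takes first_count to last_count."""
    return (last_count / first_count) ** (1.0 / n_years) - 1.0

# Hypothetical example: a count that doubles from 30,000 to 60,000
# publications over five years grows at roughly 14.9% per year.
growth = cagr(30000, 60000, 5)
print(round(growth, 4))  # 0.1487
```

Note that CAGR smooths over year-to-year fluctuations, so two very different trajectories with the same endpoints yield the same value.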



Acknowledgements

The authors wish to thank Dr Alexander Libkind of the All Russian Institute for Scientific and Technical Information of the Russian Academy of Sciences (VINITI) for his comments on an earlier version of this paper. They are also grateful to two anonymous referees for their valuable comments on an earlier version of the manuscript.


Funding

This paper was partly supported by the Russian Foundation for Basic Research (Grant 17-02-00157).



Copyright information

© Akadémiai Kiadó, Budapest, Hungary 2018

Authors and Affiliations

  • Henk F. Moed (1)
  • Valentina Markusova (2)
  • Mark Akoev (3)

  1. Sapienza University of Rome, Rome, Italy
  2. All Russian Institute for Scientific & Technical Information of the Russian Academy of Sciences (VINITI), Moscow, Russia
  3. Scimetrics Lab, Ural Federal University, Ekaterinburg, Russia
