Abstract
This article provides a comprehensive overview of publication activity dynamics in the USSR and the Russian Federation in the context of the evolution of national economic and political systems and science policies. A broad set of bibliometric indicators derived from the Web of Science Core Collection database and the InCites analytical tool was used to assess the scientific output of the Soviet and Russian research establishments. Various aspects of the path dependence of contemporary Russia’s patterns of publication activity on the institutional models of the R&D sector established earlier in the Soviet Union are considered. This path dependence may be clearly observed in the thematic structure of scientific publications (even more so in internationally co-authored papers), in the composition of partner countries for joint publications, and in citation indicators. The evolution of national science policies is tracked in the context of the historical development of policy instruments and government actions intended to stimulate and support the publication activity of Russian (and Soviet) academics and to maximize their potential effects upon the country’s key research performance indicators.
Notes
The variety of these indicators and their strengths and weaknesses are not examined here; these issues have been widely discussed in the specialized literature (Kostoff, 1997; Pendlebury, 2009; Gingras, 2014; Belter, 2015; Haustein and Larivière, 2015; Kosten, 2016; Rijcke et al., 2016).
For example, in China the criteria for assessing the scientific productivity of national scientists have been significantly revised over the last two to three years. Whereas in the 2000s the Chinese government directly stimulated the publication activity of scientists on the basis of quantitative bibliometric indicators, the emphasis is now placed upon the quality of publications, taking into account the national interests and needs of the country, increasing the availability of information on R&D results for the Chinese audience, etc. (Huang, 2020; Xu, 2020).
Detailed information on InCites is available at https://clarivate.com/products/incites.
These document types are included in the analysis in addition to the three so-called cited document types (articles, proceedings papers, and reviews) in order to take the WoS CC-indexed publications of the USSR into account as fully as possible. For the USSR, document types other than articles, proceedings papers, and reviews make up quite a significant share of the country’s corpus of publications in the WoS CC: 16.7% for 1973-1992.
The country “Russia” in the WoS CC had 1,500 publications (of all types) in 1990, 2,100 in 1991, 10,200 in 1992, and 27,400 in 1993.
The calculation of fractional counts was not directly possible in either Web of Science or InCites at the time the analysis was performed.
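To illustrate the distinction, the following minimal sketch contrasts whole counting (each co-authoring country receives full credit for a publication, the approach effectively used here) with fractional counting (credit split equally among co-authoring countries). The publication records are hypothetical and purely illustrative.

```python
# Whole vs. fractional counting of publications by country.
# The records below are hypothetical examples, not real data.
from collections import defaultdict

publications = [
    {"id": "p1", "countries": ["USSR"]},
    {"id": "p2", "countries": ["USSR", "USA"]},
    {"id": "p3", "countries": ["USSR", "USA", "Germany"]},
]

whole = defaultdict(int)        # each country gets full credit for a paper
fractional = defaultdict(float) # credit split equally among co-authoring countries

for pub in publications:
    share = 1.0 / len(pub["countries"])
    for country in pub["countries"]:
        whole[country] += 1
        fractional[country] += share

print(dict(whole))       # {'USSR': 3, 'USA': 2, 'Germany': 1}
print(dict(fractional))  # {'USSR': ~1.83, 'USA': ~0.83, 'Germany': ~0.33}
```

Note that whole counts sum to more than the number of distinct publications (6 vs. 3 here), whereas fractional counts always sum to the number of publications.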
For the methodology used to calculate this indicator, see http://help.prod-incites.com/inCites2Live/indicatorsGroup/aboutHandbook/usingCitationIndicatorsWisely/impactRelativeToWorld.html.
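In essence (a hedged summary of the linked handbook, in notation of our own), the indicator relates an entity’s citations per publication to the corresponding world average:

$$\text{Impact Relative to World} = \frac{C_{e}/P_{e}}{C_{w}/P_{w}},$$

where $C_{e}$ and $P_{e}$ are the citation and publication counts of the entity, and $C_{w}$ and $P_{w}$ those of the world; values above 1 indicate above-world-average citation impact.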
Multiple editions of the annual statistical data book “National Economy of the USSR” (e.g., the editions of 1965, 1972, 1977, 1981, and 1987) stated invariably, in a footnote to the table providing data on R&D personnel in the USSR, that “the number of researchers in the USSR is one quarter of all researchers in the world”. Such rough estimates were quite typical of Soviet statistical practice when comparing the USSR to the rest of the world or to the so-called capitalist (Western Bloc) countries.
Indicators (quantitative estimates) were calculated for five-year periods. In total, three programmes were prepared (for 1981-2000, 1986-2005, and 1991-2010).
In 1980-1989 and 1990-1999, the USSR (and later Russia) was a more important partner for China than China was for the USSR/Russia (IRIR of 3.23 and 1.83, respectively). For 2000-2009, the average IRIR for China decreased to 0.77.
For some partner countries of Russia with 100+ joint publications in 2019, the average IRIR for 2010-2019 was very high (20.9-55.6): Uzbekistan, Mongolia, Azerbaijan, Georgia, Armenia, Kazakhstan, Latvia, and Sri Lanka.
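For the reader’s convenience, a common form of such an index of the relative intensity of bilateral collaboration, in notation of our own (an assumption for illustration, not a reproduction of the authors’ formula), is:

$$\mathrm{IRIR}_{AB} = \frac{N_{AB}/N_{A}^{\mathrm{int}}}{N_{B}/N_{W}},$$

where $N_{AB}$ is the number of joint publications of countries $A$ and $B$, $N_{A}^{\mathrm{int}}$ the number of $A$’s internationally co-authored publications, $N_{B}$ country $B$’s total output, and $N_{W}$ the world total. Values above 1 indicate that $B$ is over-represented among $A$’s partners relative to $B$’s weight in world output.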
Since one publication may belong to several fields of science, the sum of the shares of publications across fields of science (at the level of the broad OECD FOS classification, as well as at the second level of this classification) exceeds 100%.
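A small numeric illustration (hypothetical figures): if a country has 100 publications, of which 60 are classified under field A, 60 under field B, and 20 under both, the field shares sum to more than 100% even though only 100 distinct publications exist:

$$\frac{60}{100} + \frac{60}{100} = 120\% > 100\%.$$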
Globally, “Humanities” (and, to a lesser extent, “Social sciences”) also shows much lower intensity of international collaboration than other fields of science. In 2010-2019, the annual share of ICPs in the global number of publications was 3.3-6.6% in “Humanities” and 10.9-19.8% in “Social sciences”, vs. 16.2-22.4% for all fields of science and 21.6-27.6% in “Natural sciences”.
According to the Journal Citation Reports (JCR) for 2019, most Russian journals in the SCIE and SSCI (86%) were in Q4. Of the country’s 150 journals in the SCIE and SSCI, only one was in Q1 and six were in Q2. For comparison, China had 59.9% of its domestic WoS CC-indexed journals in Q1 and Q2, and the USA 58.9%.
At the second level of the classification, the largest fields of science by number of publications in Q1 journals in 2019 were: ‘Physical sciences’ (25.0% of Russia’s total), ‘Clinical medicine’ (23.5%), ‘Chemical sciences’ (18.4%), ‘Biological sciences’ (12.8%), ‘Materials engineering’ (11.1%), and ‘Basic medical research’ (10.3%).
For an explanation of the extremely low number of publications in Q1 journals in ‘Humanities’, see the footnote to Fig. 5.
In some comparatively large second-level fields of science, the share of Q1 journal publications was even higher in 2019: 52.0% in ‘Clinical medicine’; 43.4% in ‘Agricultural biotechnology’.
In some second-level fields of science, this figure exceeded 75% in 2019: ‘Earth and related environmental science’ (78.6%); ‘Agriculture, forestry, fisheries’ (77.6%); ‘Other social sciences’ (77.4%).
The key quantitative goal of the 5-100 Project was to ensure that by 2020 at least five of the 21 universities participating in the project would be among the top 100 universities in at least one of three international university rankings: the Quacquarelli Symonds (QS) World University Rankings, the Times Higher Education (THE) World University Rankings, and the Academic Ranking of World Universities (ARWU).
The Russian Science Foundation was established in 2014 to support ambitious scientific projects and scientists performing research at the highest international level.
Some authors have examined publications in so-called potentially predatory journals. For example, Marina and Sterligov (2021) analyzed the Scopus database and showed that the share of Russia’s publications in potentially predatory journals increased from 0.24% in 2010 to a maximum of 8.41% in 2016, while in the USA, the UK, Germany, France, Canada, and Spain it remained below 1% throughout 2010-2018. Guskov et al. (2018) showed that some universities participating in the 5-100 Project had extremely high shares of publications in predatory journals.
Sterligov (2021) calculated the share of conference papers in the total number of so-called citable documents (conference papers, articles, and reviews) for Russia in the Web of Science (ESCI data excluded). The median value of this indicator for the top 55 countries by number of WoS CC-indexed citable documents was 9.5% in 2019.
For comparison, the share of conference papers in 2019 was 12.6% in Japan, 11.1% in Germany, 10.2% in France, 10.0% in China, 8.8% in the USA, and 6.8% in the UK.
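In notation of our own (an assumption about the exact operationalization), the indicator in the two preceding notes is:

$$s_{\mathrm{conf}} = \frac{N_{\mathrm{conf}}}{N_{\mathrm{conf}} + N_{\mathrm{art}} + N_{\mathrm{rev}}},$$

where $N_{\mathrm{conf}}$, $N_{\mathrm{art}}$, and $N_{\mathrm{rev}}$ are a country’s WoS CC-indexed conference papers, articles, and reviews, respectively.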
The volume of GERD in 2019, at constant 1989 prices, was 1.8 times higher than in 2000 but stood at only 61% of the 1989 level.
References
Almqvist, R., Grossi, G., van Helden, G. J., & Reichard, C. (2013). Public sector governance and accountability. Critical Perspectives on Accounting, 24(7–8), 479–487.
Amabile, T. M. (1997). Motivating creativity in organizations: on doing what you love and loving what you do. California Management Review, 40(1), 39–58.
Andersen, L. B., & Pallesen, T. (2008). “Not just for the money?” How financial incentives affect the number of publications at Danish research institutions. International Public Management Journal, 11(1), 28–47.
Anninos, L. N. (2014). Research performance evaluation: some critical thoughts on standard bibliometric indicators. Studies in Higher Education, 39(9), 1542–1561.
Archibugi, D., & Pianta, M. (1992). The technological specialization of advanced countries: A report to the EEC on international science and technology activities. NY: Springer.
Balassa, B. (1965). Trade Liberalisation and “Revealed” comparative advantage. The Manchester School, 33(2), 99–123.
Balzer, H. D. (1989). Soviet Science on the Edge of Reform. Westview Press, Inc.
Belter, C. W. (2015). Bibliometric indicators: Opportunities and limits. Journal of the Medical Library Association: JMLA, 103(4), 219–221.
Bernanke, B. S. (2011). Promoting research and development the government’s role. Issues in Science and Technology, 27(4), 37–41.
Besir Demir, S. (2018). A mixed-methods study of the ex post funding incentive policy for scholarly publications in Turkey. Journal of Scholarly Publishing, 49(4), 453–476.
Blair, P. (1997). The evolving role of government in science and technology. Engineering Evolving, 27(3), 31–36.
Bordons, M., Fernández, M., & Gómez, I. (2002). Advantages and limitations in the use of impact factor measures for the assessment of research performance. Scientometrics, 53(2), 195–206.
Bornmann, L. (2011). Peer review and bibliometrics: Potentials and problems. In J. C. Shin, R. K. Toutkoushian, & U. Teichler (Eds.), University rankings: Theoretical basis, methodology and impacts on global higher education. Dordrecht: Springer.
Butler, L. (2003a). Explaining Australia’s increased share of ISI publications—the effects of a funding formula based on publication counts. Research Policy, 32(1), 143–155.
Butler, L. (2003b). Modifying publication practices in response to funding formulas. Research Evaluation, 12(1), 39–46.
Butler, L. (2004). What happens when funding is linked to publication counts? In Handbook of quantitative science and technology research (pp. 389–405). Springer.
Chankseliani, M., Lovakov, A., & Pislyakov, V. (2021). A Big picture: Bibliometric study of academic publications from post-Soviet countries. Scientometrics, 126(10), 8701–8730.
Dill, D. D. (2018). Can Public Policy Promote Academic Quality? An Assessment of Policy Instruments for Instruction and Research. In Research Handbook on Quality, Performance and Accountability in Higher Education. Edward Elgar Publishing.
Drucker, P. F. (1964). Managing for results: Economic tasks and risk-taking decisions. Heinemann.
EC (2016a). Research and Innovation Futures 2030: Exploring the Future of Research. Trends and Drivers in Doing and Governing Research. European Commission. Brussels.
EC (2016b). Realising the European Open Science Cloud. Publications Office of the European Union. Luxembourg.
Flanagan, K., Uyarra, E., & Laranja, M. (2010). The 'Policy Mix' for Innovation: Rethinking Innovation Policy in a Multi-Level, Multi-Actor Context. Manchester Business School Working Paper Series, Working paper No 599.
Freeman, C. (1994). Critical survey: The economics of technical change. Cambridge Journal of Economics, 18(5), 463–514.
Gabrys, B. J., & Langdale, J. A. (2011). How to Succeed as a Scientist: From Postdoc to Professor. Cambridge University Press.
Garfield, E. (1955). Citation indexes for science. Science, 122(3159), 108–111.
Gilyarevskii, R. S., Libkind, A. N., & Markusova, V. A. (2019). Dynamics of Russia’s publication activity in 1993–2017 based on Web of Science data. Automatic Documentation and Mathematical Linguistics, 53(2), 51–63.
Gingras, Y. (2014). Criteria for evaluating indicators. In B. Cronin & C. R. Sugimoto (Eds.), Beyond bibliometrics: harnessing multidimensional indicators of scholarly impact (pp. 109–125). MIT Press.
Godin, B. (2009). The making of science, technology and innovation policy: Conceptual frameworks as narratives, 1945–2005. Centre Urbanisation Culture Société, Institut National de la Recherche Scientifique, Montréal (Québec).
Gokhberg, L. (1990). Scientific Potential of the USSR [Nauchnyj Potentsial SSSR]. All-Russian Institute for Scientific and Technical Information.
Gokhberg, L. (1997). Transformation of the Soviet R&D System. In L. Gokhberg, M. J. Peck, & J. Gacs (Eds.), Russian applied research and development: Its problems and its promise (pp. 9–33). Laxenburg.
Gokhberg, L. (1999). The transformation of R&D in the post-socialist countries: Patterns and trends. In Innovation and structural change in post-socialist countries: A quantitative approach (pp. 153–172). Springer, Netherlands.
Gokhberg, L. (2003). Statistics of science [Statistika Nauki]. TEIS.
Gokhberg, L., & Kuznetsova, T. (2015). Russian Federation. UNESCO science report: Towards 2030. UNESCO Publishing.
Gokhberg, L., & Kuznetsova, T. (2021). Russian Federation. In S. Schneegans, J. Lewis, & T. Straza (Eds.), UNESCO Science report: The race against time for smarter development (pp. 347–365). UNESCO.
Gokhberg, L., & Mindeli, L. (1996). Research and development in Russia: trends of the 1990s. Centre for Science Research and Statistics.
Gokhberg, L., & Sagieva, G. (2007). Russian Science: Bibliometric indicators [Rossiyskaya Nauka: Bibliometricheskie Indikatory]. Foresight-Russia, 1(1), 44–53.
Gorodnikova, N. (1997). Methodological Notes and Statistical Tables. In L. Gokhberg, M. J. Peck, & J. Gacs (Eds.), Russian applied research and development: Its problems and promise (pp. 161–188). Laxenburg.
Grančay, M., Vveinhardt, J., & Šumilo, Ē. (2017). Publish or Perish: How central and Eastern European economists have dealt with the ever-increasing academic publishing requirements 2000–2015. Scientometrics, 111(3), 1813–1837.
Guskov, A., Kosyakov, D., & Selivanova, I. (2017). Strategies to improve publication activities of the universities participating in project 5–100. Scientific and Technical Libraries, 12, 5–18.
Guskov, A. E., Kosyakov, D. V., & Selivanova, I. V. (2018). Boosting research productivity in top Russian universities: The circumstances of breakthrough. Scientometrics, 117(2), 1053–1080.
Haustein, S., & Larivière, V. (2015). The Use of Bibliometrics for Assessing Research: Possibilities, Limitations and Adverse Effects. In I. M. Welpe, J. Wollersheim, S. Ringelhan, & M. Osterloh (Eds.), Incentives and performance (pp. 121–139). Springer.
Henriksen, D., & Schneider, J. W. (2014). Is the Publication Behavior of Danish Researchers Affected by the National Danish Publication Indicator? A Preliminary Analysis. In Proceedings of the Science and Technology Indicators Conference (pp. 273–275).
Hicks, D., Wouters, P., Waltman, L., De Rijcke, S., & Rafols, I. (2015). Bibliometrics: The Leiden Manifesto for research metrics. Nature News, 520(7548), 429–431.
“Historical Materials” web-portal. Collection of various issues of statistical digests “The National Economy of the USSR” and “Scientific and Technical Progress of the USSR” issued by State Committee of Statistics of the USSR (Goskomstat). Available at: https://istmat.org/node/21341
HSE. (2020). Science and technology indicators in the Russian Federation: 2020: Data book. HSE.
HSE. (2021). Science and technology indicators in the Russian Federation: 2021: Data book. HSE.
Huang, F. (2020). China is Choosing its Own Path on Academic Evaluation. University World News, 26. URL: https://www.universityworldnews.com/post.php?story=20200226122508451
Ingwersen, P., & Larsen, B. (2014). Influence of a performance indicator on Danish research production and citation impact 2000–12. Scientometrics, 101(2), 1325–1344.
Jacobsen, C. B., & Andersen, L. B. (2014). Performance management for academic researchers: How publication command systems affect individual behavior. Review of Public Personnel Administration, 34(2), 84–107.
Joint Economic Committee. (1990). Measures of soviet gross national product in 1982 prices. US Government Printing Office.
Kallio, K. M., & Kallio, T. J. (2014). Management-by-results and performance measurement in universities-implications for work motivation. Studies in Higher Education, 39(4), 574–589.
Kim, D. H., & Bak, H. J. (2016). How do scientists respond to performance-based incentives? Evidence from South Korea. International Public Management Journal, 19(1), 31–52.
King, J. (1987). A review of bibliometric and other science indicators and their role in research evaluation. Journal of Information Science, 13(5), 261–276.
Kirchik, O., Gingras, Y., & Larivière, V. (2012). Changes in publication languages and citation practices and their effect on the scientific impact of Russian science (1993–2010). Journal of the American Society for Information Science and Technology, 63(7), 1411–1419.
Koenig, M. E. (1983). Bibliometric indicators versus expert opinion in assessing research performance. Journal of the American Society for Information Science, 34(2), 136–145.
Korytkowski, P., & Kulczycki, E. (2019). Examining how country-level science policy shapes publication patterns: The case of Poland. Scientometrics, 119(3), 1519–1543.
Kosten, J. (2016). A classification of the use of research indicators. Scientometrics, 108(1), 457–464.
Kostoff, R. N. (1997). Use and misuse of metrics in research evaluation. Science and Engineering Ethics, 3(2), 109–120.
Kosyakov, D., & Guskov, A. (2019). Impact of national science policy on academic migration and research productivity in Russia. Procedia Computer Science, 146, 60–71.
Kosyakov, D., & Guskov, A. (2022). Reasons and consequences of changes in Russian research assessment policies. Scientometrics, 127(8), 4609–4630.
Kotsemir, M. N. (2012). Publication activity of Russian researches in leading international scientific journals. Acta Naturae, 4(2), 14–34.
Leonelli, S., Spichtinger, D., & Prainsack, B. (2015). Sticks and carrots: Encouraging open science at its source. Geo: Geography and Environment, 2(1), 12–16.
Leydesdorff, L. (2008). Caveats for the Use of Citation Indicators in Research and Journal Evaluations. Journal of the American Society for Information Science and Technology, 59(2), 278–287.
Linton, J. D., Tierney, R., & Walsh, S. T. (2011). Publish or perish: how are research and reputation related? Serials Review, 37(4), 244–257.
Lovakov, A., Panova, A., Sterligov, I., & Yudkevich, M. (2021). Does government support of leading universities affect the entire higher education system? Evidence from the Russian University Excellence Initiative. Research Evaluation.
Marina, T., & Sterligov, I. (2021). Prevalence of potentially predatory publishing in Scopus on the country level. Scientometrics, 126(6), 5019–5077.
Markusova, V. A., Jansz, M., Libkind, A. N., Libkind, I., & Varshavsky, A. (2009). Trends in Russian research output in post-Soviet Era. Scientometrics, 79(2), 249–260.
Marnick, R. (2015). Four reasons why the government needs to keep spending money on science. https://news.cancerresearchuk.org/2015/09/08/four-reasons-why-the-government-needs-to-keep-spending-money-on-science
Martin, B. R. (2012). The evolution of science policy and innovation studies. Research Policy, 41(7), 1219–1239.
Martin, B. (2016). R&D policy instruments—a critical review of what we do and don’t know. Industry and Innovation, 23(2), 157–176.
Matveeva, N., & Ferligoj, A. (2020). Scientific collaboration in Russian universities before and after the excellence initiative project 5–100. Scientometrics, 124(3), 2383–2407.
Matveeva, N., Sterligov, I., & Lovakov, A. (2022). International scientific collaboration of post-soviet countries: a bibliometric analysis. Scientometrics, 127(3), 1583–1607.
Matveeva, N., Sterligov, I., & Yudkevich, M. (2021). The effect of Russian university excellence initiative on publications and collaboration patterns. Journal of Informetrics, 15(1), 101110.
Melo, A. I., Sarrico, C. S., & Radnor, Z. (2010). The influence of performance management systems on key actors in universities: The case of an English university. Public Management Review, 12(2), 233–254.
Mindeli, L. (Ed.). (1992). Science in the USSR: Analysis and Statistics [Nauka v SSSR: Analiz i Statistika]. Centre for Science Research and Statistics, Moscow. [in Russian].
Moed, H. F., Burger, W. J. M., Frankfort, J. G., & Van Raan, A. F. (1985). The use of bibliometric data for the measurement of university research performance. Research Policy, 14(3), 131–149.
Moed, H. F., Markusova, V., & Akoev, M. (2018). Trends in Russian research output indexed in Scopus and Web of Science. Scientometrics, 116(2), 1153–1180.
Narin, F., Olivastro, D., & Stevens, K. A. (1994). Bibliometrics/Theory. Practice and Problems. Evaluation Review, 18(1), 65–76.
OECD (1981). Proposed Standard Practice for Surveys of Research and Experimental Development: Frascati Manual 1980, The Measurement of Scientific and Technical Activities Series. OECD, Paris.
OECD. (1994). The OECD review of science, technology and innovation policies: Russian Federation. OECD.
OECD. (2010). OECD science, technology and industry outlook 2010. OECD Publishing.
OECD. (2015). Frascati Manual 2015: Guidelines for collecting and reporting data on research and experimental development. OECD Publishing.
OECD. (2016). OECD science, technology and innovation outlook 2016. OECD Publishing.
OECD. (2019). Perspectives on global development 2019: Rethinking development strategies. OECD Publishing.
OECD. (2021). OECD science, technology and innovation outlook 2021: Times of crisis and opportunity. OECD Publishing.
Paul-Hus, A., Bouvier, R. L., Ni, C., Sugimoto, C. R., Pislyakov, V., & Larivière, V. (2015). Forty years of gender disparities in Russian science: A historical bibliometric analysis. Scientometrics, 102, 1541–1553.
Pavitt, K. (1984). Sectoral patterns of technical change: towards a taxonomy and a theory. Research Policy, 13(6), 343–373.
Pendlebury, D. A. (2009). The use and misuse of journal metrics and other citation indicators. Archivum Immunologiae Et Therapiae Experimentalis, 57(1), 1–11.
Pouris, A. (2003). South Africa’s research publication record: the last ten years: science policy. South African Journal of Science, 99(9), 425–428.
Rijcke, S. D., Wouters, P. F., Rushforth, A. D., Franssen, T. P., & Hammarfelt, B. (2016). Evaluation practices and effects of indicator use—a literature review. Research Evaluation, 25(2), 161–169.
Shibayama, S., & Baba, Y. (2015). Impact-oriented science policies and scientific publication practices: The case of life sciences in Japan. Research Policy, 44(4), 936–950.
Snieder, R., & Larner, K. (2009). The art of being a scientist: A guide for graduate students and their mentors. Cambridge University Press.
Sousa, C. A., de Nijs, W. F., & Hendriks, P. H. (2010). Secrets of the Beehive: Performance management in university research organizations. Human Relations, 63(9), 1439–1460.
Sterligov, I. A. (2021). The Russian conference outbreak: Description, causes and possible policy measures. Science Management: Theory and Practice, 3(2), 222–251.
Tassey, G. (1997). The economics of R&D policy. Quorum Books.
Tassey, G. (2004). Policy issues for R&D investment in a knowledge-based economy. The Journal of Technology Transfer, 29(2), 153–185.
Taylor, J., & Taylor, R. (2003). Performance indicators in academia: An X-efficiency approach? Australian Journal of Public Administration, 62(2), 71–82.
Thomas, S. (1992). The evaluation of Plant Biomass Research: A case study of the problems inherent in bibliometric indicators. Scientometrics, 23(1), 149–167.
Todeschini, R., & Baccini, A. (2016). Handbook of bibliometric indicators: Quantitative tools for studying and evaluating research. Wiley.
Turko, T., Bakhturin, G., Bagan, V., Poloskov, S., & Gudym, D. (2016). Influence of the program “5-top 100” on the publication activity of Russian universities. Scientometrics, 109(2), 769–782.
Van Dalen, H. P., & Henkens, K. (2012). Intended and unintended consequences of a publish-or-perish culture: a worldwide survey. Journal of the American Society for Information Science and Technology, 63(7), 1282–1293.
Van Raan, A. F. (2005). Fatal attraction: Conceptual and methodological problems in the ranking of universities by bibliometric methods. Scientometrics, 62(1), 133–143.
Weingart, P. (2005). Impact of bibliometrics upon the science system: Inadvertent consequences? Scientometrics, 62(1), 117–131.
Westney, D. E. (1991). Country patterns in R&D organization: The United States and Japan. The MIT Japan program, Massachusetts Institute of Technology. Available at: http://hdl.handle.net/1721.1/17089.
Wien, C., Dorch, B. F., & Larsen, A. V. (2017). Contradicting incentives for research collaboration. Scientometrics, 112(2), 903–915.
Wildgaard, L., Schneider, J. W., & Larsen, B. (2014). A review of the characteristics of 108 author-level bibliometric indicators. Scientometrics, 101(1), 125–158.
Wilson, C. S., & Markusova, V. A. (2004). Changes in the Scientific Output of Russia from 1980 to 2000, as Reflected in the Science Citation Index, in relation to national politico-economic changes. Scientometrics, 59(3), 345–389.
Xu, J. (2020). Guest post — How china’s new policy may change researchers' publishing behavior. The scholarly kitchen: What’s hot and cooking in scholarly publishing. Available at https://scholarlykitchen.sspnet.org/2020/03/03/guest-post-how-chinas-new-policy-may-change-researchers-publishing-behavior
Zuin, A. A., & Bianchetti, L. (2015). Productivism in the age of the “publish, appear or perish”: A glance. Cadernos de Pesquisa, 45(158), 726–750.
Acknowledgements
This article is based on the study funded by the Basic Research Program of the HSE University.
Funding
National Research University Higher School of Economics
Ethics declarations
Conflicts of interest
We have no conflicts of interest or competing interests to disclose.
Additional information
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Gokhberg, L., Kuznetsova, T. & Kotsemir, M. From the Soviet Union to the Russian Federation: publication activity dynamics along the evolution of national science policies. Scientometrics 128, 6195–6246 (2023). https://doi.org/10.1007/s11192-023-04838-8