Scientific publication performance in post-communist countries: still lagging far behind


We present a bibliometric comparison of publication performance in 226 scientific disciplines in the Web of Science (WoS) for six post-communist EU member states relative to six EU-15 countries of comparable size. We compare not only overall country-level publication counts but also high-quality publication output, where publication quality is inferred from journal Article Influence Scores. As of 2010–2014, post-communist countries still lag far behind their EU counterparts, with the exception of a few scientific disciplines, mainly in Slovenia. Moreover, research in post-communist countries tends to focus relatively more on quantity than on quality. The relative publication performance of post-communist countries in the WoS is strongest in the natural sciences and engineering. Future research is needed to reveal the underlying causes of these performance differences, which may include funding and productivity gaps, the historical legacy of communist ideology, and differences in Web of Science coverage.




  1.

    See, e.g., Shleifer and Treisman (2014) for a broad assessment of the post-communist transition.

  2.

    Given that we focus on recent publications, we abstain from analyzing their citation impact. This is an important area for future research.

  3.

    A few studies have attempted to deal with this data constraint: Abramo and D’Angelo (2014b) approximate researchers’ pay levels within a country to convert the average citation impact per paper for each author into a person-specific productivity index. Bentley (2015) compares productivity per researcher across countries based on a pay-level survey. Boyle (2008) contrasts pay levels by field between two countries to explain publication output gaps. Bornmann et al. (2014) normalize citation impacts using GDP per capita.

  4.

    Mingers and Leydesdorff (2015) provide a comprehensive review of theory and practice in scientometrics, which highlights much recent progress in citation impact measurement.

  5.

    See, e.g., Abramo et al. (2011), Bornmann and Leydesdorff (2013), and Smith et al. (2014). The fact that publication intensity varies across fields has been well known since Garfield (1979) and Moed et al. (1985).

  6.

    There are several other studies that consider the CEE countries' publication and citation aggregates (Abbott and Schiermeier 2014; Must 2006; Vinkler 2008; Kozlowski et al. 1999; Vanecek 2008, 2014; Radosevic and Yoruk 2014) and collaboration between post-communist scientists and their EU-15 counterparts (Gorraiz et al. 2012; Kozak et al. 2015; Makkonen and Mitze 2016). Some studies of specific fields of science also include post-communist countries; see, e.g., Fiala and Willett (2015) for computer science and Pajić (2015) for the social sciences.

  7.

    Using a more sophisticated approach, Cimini et al. (2014) assess the competitiveness of nations using Scopus citation patterns across 26 disciplines.

  8.

    The AIS measures the average per-article influence of the papers published in a journal. Formally, it is defined as 0.01 × EigenFactor Score / X, where X is the journal's 5-year article count relative to the 5-year article count of all journals. The EigenFactor Score reflects the overall importance of a journal using an algorithm similar to Google’s PageRank. The AIS is thus similar to the more widely used Impact Factor (IF), but it has several important advantages. First, it puts more weight on citations from more prestigious journals, making the measure more informative than raw citation counts. Second, unlike the standard IF, the AIS uses a five-year citation window. Third, the AIS ignores citations to articles from the same journal, making it harder to manipulate. Clearly, neither the AIS nor the IF is particularly well suited to assessing the quality of an individual publication or researcher. However, the AIS becomes useful at higher levels of aggregation; this would be undermined only if some groups of researchers systematically published articles whose impact did not, on average, correspond to that of the journals in which they appeared.
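
    To make the definition concrete, here is a minimal sketch of the AIS formula in Python; all numbers are hypothetical for illustration, not actual WoS data.

    ```python
    def article_influence_score(eigenfactor, journal_articles_5y, all_articles_5y):
        """AIS = 0.01 * EigenFactor Score / X, where X is the journal's
        share of all articles published over the 5-year window."""
        x = journal_articles_5y / all_articles_5y
        return 0.01 * eigenfactor / x

    # Hypothetical journal: EigenFactor Score of 0.05, with 1,000 articles
    # out of 1,000,000 published across all journals over five years.
    ais = article_influence_score(0.05, 1_000, 1_000_000)
    print(round(ais, 3))  # 0.5
    ```

    An AIS above 1 would indicate that the journal's articles are, on average, more influential than the mean article in the database.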

  9.

    The RCI compares the average citation rate of articles published in scientific journals in a given discipline in a given country during a given year with the average citation rates of all articles published in that year and discipline worldwide.
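
    A minimal sketch of this ratio, with made-up citation counts standing in for a country-discipline-year cell and the corresponding world set:

    ```python
    def relative_citation_index(country_cites, world_cites):
        """Mean citations per article for a country's articles in a given
        discipline and year, relative to the world mean for that
        discipline and year. Values above 1 indicate above-world impact."""
        country_mean = sum(country_cites) / len(country_cites)
        world_mean = sum(world_cites) / len(world_cites)
        return country_mean / world_mean

    # Hypothetical: the country's articles received [4, 6] citations,
    # while articles worldwide received [2, 2, 2, 4].
    print(relative_citation_index([4, 6], [2, 2, 2, 4]))  # 2.0
    ```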

  10.

    Abramo and D’Angelo (2014a, b): “…a nation with 1000 publications in a field, each with 10 citations, would rank higher than a nation with 10,000 publications of which 9999 have 10 citations but the last one a mere nine”.

  11.

    More precisely, by population aged 15–64 in the year 2015.

  12.

    The computation of the means does not reflect countries' population differences; each country observation is counted with equal weight. We exclude the country in question from the computation of the mean across the other 11 countries, to ensure that the benchmark is not affected by that country.
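
    This leave-one-out benchmark can be sketched as follows; the indicator values are hypothetical:

    ```python
    def leave_one_out_mean(values, exclude_index):
        """Unweighted mean over all countries except the one at
        exclude_index, so each remaining country counts equally."""
        others = values[:exclude_index] + values[exclude_index + 1:]
        return sum(others) / len(others)

    # Hypothetical per-country indicator for 12 countries; the benchmark
    # for country 0 averages the remaining 11 with equal weight.
    indicator = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0,
                 7.0, 8.0, 9.0, 10.0, 11.0, 12.0]
    print(leave_one_out_mean(indicator, 0))  # 7.0
    ```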

  13.

    In producing our figures, we applied some basic WoS Category restrictions: we exclude 5 disciplines that do not belong to any of the broad research areas and another 5 very small disciplines that average fewer than 25 articles per country during our 5-year window. The charts also omit WoS Categories that fall outside their scale: 51 disciplines are omitted for the Netherlands, 18 for Sweden, 11 for Finland, 8 for Slovenia, and 3 each for Austria, Belgium, Croatia and Portugal. About half of these disciplines belong to the social sciences. We left these observations out of the charts to keep the figures readable, but they are still used to compute the statistics in Tables 1 and 2. Note that most of these omissions correspond to highly productive fields in Western countries; hence, the charts are somewhat biased in favor of the post-communist countries.

  14.

    It is plausible that our comparisons, which normalize for population only, systematically favor smaller countries that manage to maintain a minimum of viable scientific activity in each field of science. This is an interesting area for future research.


  1. Abbott, A., & Schiermeier, Q. (2014). Central Europe up close. Nature, 515(7525), 22–25.

  2. Abramo, G., & D’Angelo, C. A. (2014a). Assessing national strengths and weaknesses in research fields. Journal of Informetrics, 8(3), 766–775.

  3. Abramo, G., & D’Angelo, C. A. (2014b). How do you define and measure research productivity? Scientometrics, 101(2), 1129–1144.

  4. Abramo, G., D’Angelo, C. A., & Di Costa, F. (2008). Assessment of sectoral aggregation distortion in research productivity measurements. Research Evaluation, 17(2), 111–121.

  5. Abramo, G., D’Angelo, C. A., & Viel, F. (2011). The field-standardized average impact of national research systems compared to world average: The case of Italy. Scientometrics, 88(2), 599–615.

  6. Bentley, P. J. (2015). Cross-country differences in publishing productivity of academics in research universities. Scientometrics, 102(1), 865–883.

  7. Bornmann, L., & Leydesdorff, L. (2013). Macro-indicators of citation impacts of six prolific countries: InCites data and the statistical significance of trends. PLoS ONE. doi:10.1371/journal.pone.0056768.

  8. Bornmann, L., Leydesdorff, L., & Wang, J. (2014). How to improve the prediction based on citation impact percentiles for years shortly after the publication date? Journal of Informetrics, 8(1), 175–180.

  9. Boyle, G. (2008). Pay peanuts and get monkeys? Evidence from academia. The B.E. Journal of Economic Analysis & Policy, 8(1), 1–24.

  10. Cimini, G., Gabrielli, A., & Labini, F. S. (2014). The scientific competitiveness of nations. PLoS ONE, 9(12), e113470. doi:10.1371/journal.pone.0113470.

  11. Fiala, D., & Willett, P. (2015). Computer science in Eastern Europe 1989–2014: A bibliometric study. Aslib Journal of Information Management, 67(5), 526–541.

  12. Garfield, E. (1979). Is citation analysis a legitimate evaluation tool? Scientometrics, 1(4), 359–375.

  13. Gorraiz, J., Reimann, R., & Gumpenberger, C. (2012). Key factors and considerations in the assessment of international collaboration: A case study for Austria and six countries. Scientometrics, 91(2), 417–433.

  14. Jonkers, K., & Zacharewicz, T. (2016). Research performance based funding systems: A comparative assessment (No. JRC101043). Institute for Prospective Technological Studies, Joint Research Centre.

  15. King, D. A. (2004). The scientific impact of nations. Nature, 430(6997), 311–316.

  16. Kozak, M., Bornmann, L., & Leydesdorff, L. (2015). How have the Eastern European countries of the former Warsaw Pact developed since 1990? A bibliometric study. Scientometrics, 102(2), 1101–1117.

  17. Kozlowski, J., Radosevic, S., & Ircha, D. (1999). History matters: The inherited disciplinary structure of the post-communist science in countries of central and Eastern Europe and its restructuring. Scientometrics, 45(1), 137–166.

  18. Makkonen, T., & Mitze, T. (2016). Scientific collaboration between ‘old’ and ‘new’ member states: Did joining the European Union make a difference? Scientometrics, 106(3), 1193–1215.

  19. May, R. M. (1997). The scientific wealth of nations. Science, 275(5301), 793–796.

  20. Mingers, J., & Leydesdorff, L. (2015). A review of theory and practice in scientometrics. European Journal of Operational Research, 246(1), 1–19.

  21. Moed, H. F., Burger, W. J. M., Frankfort, J. G., & Van Raan, A. F. (1985). The use of bibliometric data for the measurement of university research performance. Research Policy, 14(3), 131–149.

  22. Must, Ü. (2006). “New” countries in Europe: Research, development and innovation strategies vs bibliometric data. Scientometrics, 66(2), 241–248.

  23. Pajić, D. (2015). Globalization of the social sciences in Eastern Europe: Genuine breakthrough or a slippery slope of the research evaluation practice? Scientometrics, 102(3), 2131–2150.

  24. Radosevic, S., & Yoruk, E. (2014). Are there global shifts in the world science base? Analysing the catching up and falling behind of world regions. Scientometrics, 101(3), 1897–1924.

  25. Shleifer, A., & Treisman, D. (2014). Normal countries: The East 25 years after communism. Foreign Affairs, 93(6), 92–103.

  26. Smith, M. J., Weinberger, C., Bruna, E. M., & Allesina, S. (2014). The scientific impact of nations: Journal placement and citation performance. PLoS ONE, 9(10), e109195.

  27. Vanecek, J. (2008). Bibliometric analysis of the Czech research publications from 1994 to 2005. Scientometrics, 77, 345–360.

  28. Vanecek, J. (2014). The effect of performance-based research funding on output of R&D results in the Czech Republic. Scientometrics, 98(1), 657–681.

  29. Vinkler, P. (2008). Correlation between the structure of scientific research, scientometric indicators and GDP in EU and non-EU countries. Scientometrics, 74(2), 237–254.



This study was completed with the support of the Czech Academy of Sciences, as part of its Strategy AV21. The authors wish to thank J. Kovařík for his help in obtaining information from the WoS, and two anonymous referees for valuable suggestions.

Author information



Corresponding author

Correspondence to Daniel Münich.



Cite this article

Jurajda, Š., Kozubek, S., Münich, D. et al. Scientific publication performance in post-communist countries: still lagging far behind. Scientometrics 112, 315–328 (2017).

Keywords


  • Bibliometrics
  • National comparison
  • Scientometric indicators
  • Article Influence Score
  • Web of Science
  • Post-communist Europe

Mathematics Subject Classification

  • 91F99

JEL Classification

  • I23
  • I29