Volume 119, Issue 2, pp 973–985

Examining national citation impact by comparing developments in a fixed and a dynamic journal set

  • Jesper W. Schneider
  • Thed van Leeuwen
  • Martijn Visser
  • Kaare Aagaard


In order to examine potential effects of methodological choices on developments in relative citation scores for countries, a fixed journal set comprising 3232 journals continuously indexed in the Web of Science from 1981 to 2014 is constructed. From this restricted set, a citation database depicting the citing relations between the journal publications is formed, and relative citation scores based on full and fractional counting are calculated for the whole period. Previous longitudinal studies of citation impact, based on a dynamic set of journals, show stable rankings between countries. To examine such findings for potential “database effects”, we compare them to our fixed set. We find that relative developments in impact scores, country profiles and rankings are both very stable and very similar within and between the two journal sets, as well as across counting methods. We do observe a small “inflation factor”: citation scores are generally somewhat lower for high-performing countries in the fixed set than in the dynamic set. Consequently, even though the fixed set constitutes an ever-decreasing share of the dynamic set, we are still able to accurately reproduce the developments in impact scores and the rankings between countries found in the dynamic set. Hence, potential effects of methodological choices seem to be of limited importance compared to the stability of citation networks.
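The distinction between full and fractional counting used in the study can be illustrated with a minimal sketch. The data and the helper function below are hypothetical and only demonstrate the counting principle (whole credit per affiliated country vs. credit split 1/n among n countries), not the study's actual field-normalised indicators:

```python
from collections import defaultdict

def country_scores(papers):
    """Mean citations per paper for each country under full and
    fractional counting.

    Each paper is a dict with 'countries' (distinct author-affiliation
    countries) and 'citations'. Full counting credits every affiliated
    country with the whole paper and all its citations; fractional
    counting splits both paper and citation credit 1/n among n countries.
    """
    full_cites = defaultdict(float)
    frac_cites = defaultdict(float)
    full_papers = defaultdict(float)
    frac_papers = defaultdict(float)
    for p in papers:
        share = 1.0 / len(p["countries"])
        for c in p["countries"]:
            full_cites[c] += p["citations"]
            full_papers[c] += 1
            frac_cites[c] += share * p["citations"]
            frac_papers[c] += share
    full_mean = {c: full_cites[c] / full_papers[c] for c in full_cites}
    frac_mean = {c: frac_cites[c] / frac_papers[c] for c in frac_cites}
    return full_mean, frac_mean

# Toy example: one DK-NL collaboration with 10 citations,
# one DK-only paper with 2 citations.
papers = [
    {"countries": ["DK", "NL"], "citations": 10},
    {"countries": ["DK"], "citations": 2},
]
full_mean, frac_mean = country_scores(papers)
# Full counting: DK = (10 + 2) / 2 = 6.0; fractional: DK = (5 + 2) / 1.5 ≈ 4.67
```

Note how the highly cited collaborative paper boosts DK more under full counting than under fractional counting, which is precisely why the choice of counting method can shift country-level impact scores and rankings.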


Keywords: Fixed journal set · Database effects · National citation impact · Longitudinal study



The research was funded by the Research Council of Norway, Grant No. 256223 (the R-QUEST centre).

Supplementary material

Supplementary material 1 (DOCX 655 kb)



Copyright information

© Akadémiai Kiadó, Budapest, Hungary 2019

Authors and Affiliations

  • Jesper W. Schneider (1)
  • Thed van Leeuwen (2)
  • Martijn Visser (2)
  • Kaare Aagaard (1)

  1. Department of Political Science, Centre for Studies in Research and Research Policy, Aarhus University, Aarhus, Denmark
  2. Centre for Science and Technology Studies (CWTS), Leiden University, Leiden, The Netherlands
