Scientometrics, Volume 119, Issue 2, pp 973–985

Examining national citation impact by comparing developments in a fixed and a dynamic journal set

  • Jesper W. Schneider
  • Thed van Leeuwen
  • Martijn Visser
  • Kaare Aagaard

Abstract

In order to examine potential effects of methodological choices on developments in relative citation scores for countries, a fixed journal set comprising 3232 journals continuously indexed in the Web of Science from 1981 to 2014 is constructed. From this restricted set, a citation database capturing the citing relations between the journal publications is formed, and relative citation scores based on full and fractional counting are calculated for the whole period. Previous longitudinal studies of citation impact show stable rankings between countries. To examine whether such findings, derived from a dynamic set of journals, are subject to potential “database effects”, we compare them to our fixed set. We find that relative developments in impact scores, country profiles and rankings are very stable and very similar within and between the two journal sets, as well as across counting methods. We do see a small “inflation factor”, as citation scores for high-performing countries are generally somewhat lower in the fixed set than in the dynamic set. Consequently, even though the fixed set constitutes an ever-decreasing share of the journals covered by the dynamic set, we are still able to accurately reproduce the developments in impact scores and the country rankings found in the dynamic set. Hence, potential effects of methodological choices seem to be of limited importance compared to the stability of citation networks.
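The relative citation scores mentioned above rest on two ways of assigning country credit: full counting, where every co-authoring country receives the whole publication and all of its citations, and fractional counting, where credit is split equally among the co-authoring countries. The following minimal Python sketch illustrates that difference; the record structure ("countries", "citations") and the simple normalization against the world average citations per publication are illustrative assumptions, not the paper's actual field- and year-normalized indicators.

    from collections import defaultdict

    # Illustrative toy records: each publication lists its author countries and
    # the citations it received within the journal set under study.
    publications = [
        {"countries": ["DK", "NL"], "citations": 12},
        {"countries": ["DK"],       "citations": 3},
        {"countries": ["NL", "US"], "citations": 8},
    ]

    def relative_impact(pubs, fractional=False):
        """Citations per publication by country, relative to the world average.

        Full counting: each co-authoring country gets the full publication and
        all of its citations. Fractional counting: the publication and its
        citations are split equally among the co-authoring countries.
        """
        cites = defaultdict(float)   # (weighted) citations per country
        share = defaultdict(float)   # (weighted) publication count per country
        total_cites = 0.0
        total_pubs = 0.0

        for pub in pubs:
            weight = 1.0 / len(pub["countries"]) if fractional else 1.0
            for country in pub["countries"]:
                cites[country] += weight * pub["citations"]
                share[country] += weight
            total_cites += pub["citations"]
            total_pubs += 1

        world_avg = total_cites / total_pubs  # average citations per publication
        return {c: (cites[c] / share[c]) / world_avg for c in share}

    print(relative_impact(publications, fractional=False))  # full counting
    print(relative_impact(publications, fractional=True))   # fractional counting

With fractional counting, internationally co-authored (and often highly cited) publications contribute less to each country's score, which is one reason the two counting methods can yield slightly different country scores and rankings.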

Keywords

Fixed journal set · Database effects · National citation impact · Longitudinal study


Acknowledgements

The research was funded by the Research Council of Norway, Grant No. 256223 (the R-QUEST centre).

Supplementary material

Supplementary material 1 (DOCX 655 kb)


Copyright information

© Akadémiai Kiadó, Budapest, Hungary 2019

Authors and Affiliations

  • Jesper W. Schneider (1, corresponding author)
  • Thed van Leeuwen (2)
  • Martijn Visser (2)
  • Kaare Aagaard (1)

  1. Department of Political Science, Centre for Studies in Research and Research Policy, Aarhus University, Aarhus, Denmark
  2. Centre for Science and Technology Studies (CWTS), Leiden University, Leiden, The Netherlands
