Scientometrics, Volume 77, Issue 1, pp 147–176

Comparisons of results of publication counting using different methods

  • Marianne Gauffriau
  • Peder Olesen Larsen
  • Isabelle Maye
  • Anne Roulin-Perriard
  • Markus von Ins

Abstract

Using a database of publications established at CEST and covering the period from 1981 to 2002, the differences in national scores obtained by different counting methods have been measured. The results are supported by an analysis of data from the literature. Special attention has been paid to the comparison between the EU and the USA. There are large differences between the scores obtained by different methods; in one instance the reduction in score in going from whole counting to complete-normalized (fractional) counting is 72 per cent. The literature often gives too little information about the methods used, and there is no sign of a clear and consistent terminology or of agreement on the properties of and results from the different methods. Whole counting in fact favours certain countries, especially countries with a high level of international cooperation. The problems are growing with time because of the ever-increasing national and international cooperation in research and the rising average number of authors per publication. The need for a common understanding and a joint effort to rectify the situation is stressed.
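To make the difference between the two counting methods concrete, the sketch below, in Python, contrasts whole counting (every country represented in a publication's author addresses receives one full credit) with complete-normalized, i.e. fractional, counting (the single credit for a publication is divided over its addresses). This is a minimal illustration under assumed conventions, not the counting procedure used in the paper; in particular, whether fractions are allocated per address or per author, and how multiple addresses from the same country are treated, are assumptions made here for the example.

```python
# Minimal sketch (not the authors' implementation) contrasting whole counting
# with complete-normalized (fractional) counting of publications by country.
# Assumption: each publication is given as the list of country codes from its
# author addresses, and fractions are distributed over those address entries.
from collections import defaultdict

def count_publications(publications):
    """Return (whole, fractional) national scores for a list of publications."""
    whole = defaultdict(float)       # one full credit per participating country
    fractional = defaultdict(float)  # the credits for each publication sum to 1
    for addresses in publications:
        for country in set(addresses):
            whole[country] += 1.0
        share = 1.0 / len(addresses)
        for country in addresses:
            fractional[country] += share
    return dict(whole), dict(fractional)

# Two hypothetical papers: one purely national, one international collaboration.
pubs = [["US", "US"], ["DE", "FR", "US"]]
whole, frac = count_publications(pubs)
print(whole)  # US: 2.0, DE: 1.0, FR: 1.0 -- whole scores sum to 4 for 2 papers
print(frac)   # US: ~1.33, DE: ~0.33, FR: ~0.33 -- fractional scores sum to 2
```

In this hypothetical example the whole scores add up to twice the number of papers, whereas the fractional scores still sum to the number of papers; this inflation for countries with a high level of international cooperation is the kind of gap, 72 per cent in one instance, that the paper quantifies.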

Copyright information

© Springer Science+Business Media B.V. 2008

Authors and Affiliations

  • Marianne Gauffriau (1)
  • Peder Olesen Larsen (2)
  • Isabelle Maye (3)
  • Anne Roulin-Perriard (3)
  • Markus von Ins (3, 4)

  1. Technical University of Denmark, Technical Knowledge Center of Denmark, D’ARC - DTU Analysis & Research Promotion Center, Lyngby, Denmark
  2. Marievej 10A, 2, Hellerup, Denmark
  3. Center for Science and Technology Studies (CEST), Bern, Switzerland
  4. Institute for Research Information and Quality Assurance, Bonn, Germany
