International comparisons of scientific performance revisited

Abstract

This paper presents a methodological analysis of the latest update of the CHI/NSF Science Literature Indicators Data-Base. The data-base contains a range of publication and citation indicators broken down by country and field or subfield, and now covers the period from 1973 to 1984. It can be used to draw comparisons of the changing output and impact of basic research in different countries. Earlier applications of the data-base have been constrained by various technical limitations and have been subject to certain criticisms. In this article, after some conceptual analysis of which aspects of scientific performance the different indicators relate to, we show that much of the criticism is misplaced. We also describe subsequent methodological improvements to the indicators and the effect these have on the policy use that can be made of them. Finally, we examine what the latest statistics reveal about the relative international standing of seven leading scientific nations.


Notes and references

  1. B. R. MARTIN, J. IRVINE, R. TURNER, The writing on the wall for British science, New Scientist, 104 (8 October 1984) 25–29.

  2. J. IRVINE, B. R. MARTIN, T. PEACOCK, R. TURNER, Charting the decline in British science, Nature, 316 (15 August 1985) 587–90.

  3. M. CALLON, L. LEYDESDORFF, La recherche française est-elle en bonne santé?, La Recherche, 18 (1987) 412–19; L. LEYDESDORFF, Problems with the “measurement” of national scientific performance, Science and Public Policy, 15 (1988) 149–52.

  4. P. WEINGART, R. SEHRINGER, M. WINTERHAGER, Bibliometric indicators for assessing strengths and weaknesses of West German science, in A. F. J. van RAAN (Ed.), Handbook of the Quantitative Studies of Science and Technology, Elsevier, Amsterdam, 1988.

  5. J. IRVINE, B. R. MARTIN, Foresight in Science: Picking the Winners, Pinter Publ., London, 1984.

  6. B. R. MARTIN, J. IRVINE, Research Foresight: Priority-setting in Science, Pinter Publ., London, and Columbia University Press, New York, 1989.

  7. F. NARIN, E. NOMA, Is technology becoming science?, Scientometrics, 7 (1985) 359–81.

  8. M. P. CARPENTER, M. COOPER, F. NARIN, Linkage between basic research literature and patents, Research Management, 13 (1980) 30–35.

  9. F. NARIN, M. P. CARPENTER, National publication and citation comparisons, Journal of the American Society for Information Science, 26 (1975) 80–93. Much of the material in this paragraph is drawn from their historical review.

  10. F. J. COLE, N. B. EALES, The history of comparative anatomy, Science Progress, 11 (1917) 578–96.

  11. E. W. HULME, Statistical Bibliography in Relation to the Growth of Modern Civilization, Grafton, London, 1923.

  12. P. L. K. GROSS, E. M. GROSS, College libraries and chemical education, Science, 66 (1927) 385–89.

  13. F. S. BOIG, P. W. HOWERTON, History and development of chemical periodicals in the field of organic chemistry, Science, 115 (1952) 25–31; History and development of chemical periodicals in the field of analytical chemistry, ibid., 115 (1952) 555–60.

  14. S. KEENAN, P. ATHERTON, The Journal Literature of Physics, American Institute of Physics, New York, 1964 (AIP/DRP PA1).

  15. D. De SOLLA PRICE, Measuring the size of science, Proceedings of the Israel Academy of Science and Humanities, 4 (1969) 98–111; reprinted in: D. De SOLLA PRICE, Little Science, Big Science... and Beyond, Columbia University Press, New York, 1986, pp. 135–54.

  16. D. De SOLLA PRICE, S. GURSEY, Some statistical results for the numbers of authors in the states of the United States and the nations of the world, preface to ISI's Who is Publishing in Science, 1975 Annual, Institute for Scientific Information, Philadelphia, 1975; reprinted in: D. De SOLLA PRICE, Little Science, Big Science... and Beyond, Columbia University Press, New York, 1986, pp. 180–205.

  17. I. S. SPIEGEL-ROSING, Journal authors as an indicator of scientific manpower: a methodological study using data from the two Germanies and Europe, Science Studies, 2 (1972) 337–59.

  18. H. INHABER, Distribution of world science, Geoforum, 6 (1975) 231–36; Changes in centralization of science, Research Policy, 6 (1977) 178–93.

  19. J. D. FRAME, F. NARIN, M. P. CARPENTER, The distribution of world science, Social Studies of Science, 7 (1977) 501–16; J. D. FRAME, F. NARIN, The international distribution of biomedical publications, Federation Proceedings, 36 (1977) 1790–95.

  20. J. D. FRAME, National economic resources and the production of research in lesser developed countries, Social Studies of Science, 9 (1979) 233–46; Measuring scientific activity in lesser developed countries, Scientometrics, 2 (1980) 133–45.

  21. See, for example, D. De SOLLA PRICE, Science indicators of quantity and quality for fine tuning of United States investment in research in major fields of science and technology, paper prepared for the Commission on Human Resources, National Research Council, Washington DC, 1980.

  22. J. RONAYNE, Australian Science and Technology Feasibility Study — Government and Private Non-Profit Sectors, Australian Science and Technology Indicators, Department of Science, Canberra, 1983. This and other studies have led to the CHI data now being included in the official Australian science indicators report, Measures of Science and Innovation, Department of Industry, Technology and Commerce, Canberra, 1987.

  23. op. cit. note 1.

  24. op. cit. note 2.

  25. D. C. SMITH, P. M. D. COLLINS, D. M. HICKS, S. WYATT, National performance in basic research, Nature, 323 (1986) 681–84.

  26. F. NARIN, D. OLIVASTRO, National trends in physics and technology, Czechoslovak Journal of Physics, B36 (1987) 101–106.

  27. op. cit. note 3.

  28. op. cit. note 4.

  29. J. MACAULAY, An Indicator of Excellence in Canadian Science, Statistics Canada, Ottawa, 1985; see also Science and Technology Indicators, 1985, Statistics Canada, Ottawa, 1986.

  30. CHI has also carried out several comparisons of its data-base with abstracting services; see op. cit. note 8; FRAME et al., op. cit. note 18; FRAME and NARIN, op. cit. note 18; FRAME (1979), op. cit. note 19.

  31. S. M. LAWANI, On the relationship between quantity and quality of a country's research productivity, Journal of Information Science, 5 (1982) 143–45.

  32. S. ARUNACHALAM, U. N. SINGH, Publication and citation patterns in the literature of a high metabolism area: the case of superconductivity in 1970, Journal of Information Science, 8 (1984) 93–102.

  33. For example, J. VLACHÝ, Publication output of world physics, Czechoslovak Journal of Physics, B29 (1979) 475–80; Publication output of European physics, ibid., B29 (1979) 237–44; World publication output in particle physics, ibid., B32 (1982) 1065–72; World publication output in cross-disciplinary physics, ibid., B33 (1983) 247–50; World publication output in condensed matter physics, ibid., B33 (1983) 117–20.

  34. A. MENDEZ, I. GOMEZ, The Spanish scientific productivity through eight international data-bases, Scientometrics, 10 (1986) 207–19.

  35. R. BARRÉ, La Position de la France dans la Competition Scientifique Internationale: Comparaison des ‘Profils Scientifiques’ de 11 Pays, mimeo, Ministry of Research and Technology, Paris, 1986.

  36. A. GRANBERG, A Bibliometric Survey of Fiber-Optics Research in Sweden, West Germany and Japan, Research Policy Institute, University of Lund, 1985 (Discussion Paper No. 171).

  37. A. GRANBERG, A Bibliometric Survey of Laser Research in Sweden, West Germany and Japan, Research Policy Institute, University of Lund, 1986 (Discussion Paper No. 171).

  38. R. STANKIEWICZ, Genetic engineering—international R&D trends, paper presented at the International Symposium on Japanese Technology, 4th Meeting of the Japanese-Sweden-German Crosscountry Study on Long-Term Technological R&D, Urawa, Saitama, Japan, 17–19 September 1986.

  39. op. cit. note 24.

  40. Besides comparing countries in terms of citation totals, another approach involves analyzing the production of highly-cited papers; see, for example, E. GARFIELD, The 1982 articles most cited in 1982 and 1983, Current Contents, 45 (1984) 3–15, and ibid., 48 (1984) 3–14.

  41. For example, op. cit. note 18; MARTIN et al., op. cit. note 1; C. HILL, The Nobel-Prize Awards in Science as a Measure of National Strength in Science, Background Report No. 3, Task Force on Science Policy, Committee on Science and Technology, U.S. House of Representatives, Ninety-Ninth Congress, Second Session, U.S. Government Printing Office, Washington D.C., 1986.

  42. R. M. FRIEDMAN, Nobel physics prize in perspective, Nature, 292 (1981) 793–98.

  43. See, for example, GRANBERG, op. cit. notes 35 and 36; STANKIEWICZ, op. cit. note 37; SMITH et al., op. cit. note 24.

  44. M. P. CARPENTER, F. NARIN, P. WOOLF, Citation rates to technologically important patents, World Patent Information, 3 (1981) 160–63.

  45. op. cit. note 6.

  46. H. G. SMALL, Co-citation in the scientific literature: a new measure of the relationship between two documents, Journal of the American Society for Information Science, 24 (1973) 265–69; H. G. SMALL, E. SWEENEY, Clustering the Science Citation Index using co-citations, Scientometrics, 7 (1985) 391–409, and ibid., 8 (1985) 321–40.

  47. See, for example, M. CALLON, J. LAW, A. RIP (Eds), Mapping the Dynamics of Science and Technology, Macmillan, London, 1986.

  48. W. L. GUISTI, L. GEORGHIOU, The use of co-nomination analysis in real-time evaluation, Scientometrics, 14 (1988) 265–82.

  49. A. van HEERINGEN, C. MOMBERS, R. van VENETIE, Science and Technology Indicators 1983, Netherlands Advisory Council for Science Policy (RAWB), The Hague, 1984.

  50. P. HEALEY, H. ROTHMAN, P. K. HOCH, Research Policy, 15 (1986) 323–52; D. PHILLIPS, J. TURNEY, Bibliometrics and UK science policy, Scientometrics, 14 (1988) 185–200.

  51. Academic researchers in the UK, however, have continued development work on the techniques in collaboration with overseas groups; see, for example, J. LAW, S. BAUIN, J.-P. COURTIAL, J. WHITTAKER, Policy and mapping of scientific change: a co-word analysis of research in environmental acidification, Scientometrics, 14 (1988) 251–64. This work has, though, tended to focus on analyzing individual specialties rather than on science as a whole.

  52. Evidence for this comes from the comments made by the Spanish delegate at the OECD Workshop on Science and Technology Indicators in the Higher Education Sector, 10–13 June 1985, OECD, Paris.

  53. J. J. FRANKLIN, R. JOHNSTON, Co-citation bibliometric modelling as a tool for S&T policy and R&D management: issues, applications and developments, in van RAAN, op. cit. note 4; WEINGART et al., op. cit. note 4. A review of the policy impact of both the WEINGART et al. and FRANKLIN and JOHNSTON studies is included in B. MARTIN, J. IRVINE, op. cit. note 5.

  54. The most recent is Science Indicators: the 1987 Report, National Science Board, Washington D.C., 1987.

  55. For example, a paper with two authors, one giving a French address and the other a German, would be credited 50% to each country (a minimal illustration of this counting rule is sketched at the end of these notes). For a description of the data-base, see Data-Users Guide to the National Science Foundation's Science Literature Indicators Data-Base, CHI Research, Haddon Heights, New Jersey, 1987.

  56. L. LEYDESDORFF, Increases in British and Dutch scientific performance, mimeo, Department of Science Dynamics, Amsterdam University, 1985; M. CALLON, L. LEYDESDORFF, op. cit. note 3; LEYDESDORFF, op. cit. note 3.

  57. F. NARIN, Response to Leydesdorff, mimeo, CHI Research, Haddon Heights, N.J., 1985.

  58. J. ANDERSON, P. M. D. COLLINS, J. IRVINE, P. A. ISARD, B. R. MARTIN, F. NARIN, K. STEVENS, On-line approaches to measuring national scientific performance — a cautionary tale, Science and Public Policy, 15 (1988) 153–61.

  59. In the version of this paper presented at the Amsterdam conference, we suggested that the apparent upturn in Britain's percentage share of world publications found by LEYDESDORFF (op. cit. note 3) might be partly related to the fact that papers from countries like the United States and the United Kingdom tend to enter the ISI data-base faster than those from Eastern Europe and the Third World. However, during discussions at that conference, it became clear that other more important factors were likely to be at play. These are analyzed in ANDERSON et al., ibid.

  60. Clearly, papers published in 1973 and 1974 would have earned relatively few citations by the end of 1974, making the statistics on national shares in that year less reliable. This is the reason why the time-series in Table 5 start at 1976.

  61. J. IRVINE, B. R. MARTIN, Basic research in the East and West: a comparison of the scientific performance of high-energy physics accelerators, Social Studies of Science, 15 (1985) 293–341.

  62. Since the number of citations gained by 1983 and 1984 publications up to the end of 1984 is comparatively small, the statistics on national shares for those years are unreliable, and hence the time-series in Table 6 stop at 1982.

  63. op. cit. note 4, especially footnotes 1 and 2.

  64. See footnote 1 in ibid.

  65. op. cit. note 1; IRVINE et al., op. cit. note 2.

  66. op. cit. note 5.

  67. K. PAVITT, The size and structure of British technology activities: what we do and do not know, Scientometrics, 14 (1988) 329–46.

  68. Ibid.

  69. M. CARPENTER, F. GIBB, M. HARRIS, J. IRVINE, B. R. MARTIN, F. NARIN, Bibliometric profiles for British academic institutions: an experiment to develop research output indicators, Scientometrics, 14 (1988) 213–34.

  70. For example, B. R. MARTIN, J. IRVINE, Assessing basic research: some partial indicators of scientific progress in radio astronomy, Research Policy, 12 (1983) 61–90; J. IRVINE, B. R. MARTIN, Evaluating big science: CERN's past performance and future prospects, Scientometrics, 7 (1985) 281–308.

  71. Several examples can be found in van RAAN, op. cit. note 4.
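
A note on the counting rule in note 55: the sketch below is an illustration only, not taken from the CHI data-base documentation. The function name and the assumption that a paper's single count is split equally across its author addresses are ours, generalizing the two-author example given in that note.

    from collections import defaultdict

    def fractional_country_counts(papers):
        """papers: iterable of per-paper lists of author-address countries."""
        credit = defaultdict(float)
        for addresses in papers:
            if not addresses:
                continue                   # skip papers with no usable address
            share = 1.0 / len(addresses)   # each address carries an equal share of one paper
            for country in addresses:
                credit[country] += share   # shares for the same country accumulate
        return dict(credit)

    # A two-author paper with one French and one German address is credited
    # 50% to each country; a single-address UK paper counts fully for the UK.
    print(fractional_country_counts([["France", "Germany"], ["UK"]]))
    # {'France': 0.5, 'Germany': 0.5, 'UK': 1.0}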


Irvine, J., Martin, B.R. International comparisons of scientific performance revisited. Scientometrics 15, 369–392 (1989). https://doi.org/10.1007/BF02017060
