
A comparison of the scientific performance of the U.S. and the European Union at the turn of the 21st century

  • Published in: Scientometrics

Abstract

In this paper, scientific performance is identified with the impact that journal articles have through the citations they receive. In 15 disciplines, as well as in all sciences as a whole, the EU share of total publications is greater than that of the U.S. However, as soon as the citations received by these publications are taken into account, the picture is completely reversed. Firstly, the EU share of total citations is still greater than that of the U.S. in only seven fields. Secondly, the mean citation rate in the U.S. is greater than in the EU in every one of the 22 fields studied. Thirdly, since standard indicators—such as normalized mean citation ratios—are silent about what takes place in different parts of the citation distribution, this paper compares the publication shares of the U.S. and the EU at every percentile of the world citation distribution in each field. It is found that in seven fields the initial gap between the U.S. and the EU widens as we advance towards the more cited articles, while in the remaining 15 fields—except for Agricultural Sciences—the U.S. surpasses the EU where it counts, namely at the upper tail of the citation distributions. Finally, for all sciences as a whole, the U.S. publication share becomes greater than that of the EU for the top 50% of the most highly cited articles. The data used refer to 3.6 million articles published in 1998–2002 and the more than 47 million citations they received in 1998–2007.
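
The percentile-by-percentile comparison described in the abstract can be sketched as follows. This is a minimal illustration with hypothetical region labels and citation counts, not the authors' actual computation over the Thomson Scientific data:

```python
def shares_by_percentile(articles, region, percentiles=(50, 90)):
    """For each percentile p, compute `region`'s publication share among
    the top (100 - p)% most cited articles in the world distribution.
    `articles` is a list of (region, citations) pairs."""
    ranked = sorted(articles, key=lambda a: a[1], reverse=True)
    n = len(ranked)
    shares = {}
    for p in percentiles:
        # Keep at least one article so the top slice is never empty.
        top = ranked[: max(1, round(n * (100 - p) / 100))]
        shares[p] = sum(1 for r, _ in top if r == region) / len(top)
    return shares

# Hypothetical data: ten articles tagged with a region and a citation count.
world = [("US", 100), ("EU", 90), ("US", 80), ("EU", 70), ("US", 60),
         ("EU", 50), ("EU", 40), ("EU", 30), ("US", 20), ("EU", 10)]
us = shares_by_percentile(world, "US")  # US share among top 50% and top 10%
```

With this toy distribution the U.S. share rises from 0.6 among the top half of articles to 1.0 at the top decile, which is the kind of widening gap the paper reports at the upper tail.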

Notes

  1. NAFTA includes the U.S., Canada, and Mexico.

  2. As a matter of fact, for later reference, highly cited papers as a percentage of the total number of scientific publications are 1.64 in the U.S. and 0.25 in the EU.

  3. These two facts are corroborated in Fig. 2 of the previous version of this paper in Albarrán et al. (2009).

  4. On the other hand, Katz (2000) adjusts relative citation impact indicators to take into account a strong, non-linear relationship between the number of citations a collection of papers receives and the collection size. As a consequence, there is a dramatic reversal of positions in many sub-fields between the U.S. and some European and non-European countries (see EC 2003a, pp. 443–444 for a large reversal between large and small countries). However, a discussion of Katz’s approach is beyond the scope of this paper.

  5. Archambault et al. (2006) have recently established that there is a 20–25% overrepresentation of English-language journals in TS’s databases compared to the list of journals in Ulrich’s International Periodicals Directory.

  6. Albarrán and Ruiz-Castillo (2009) contains a discussion of the characteristics shared by these two social sciences and the remaining broad scientific fields.

  7. It should be noted that when the 1998–2002 dataset is partitioned into the U.S., the EU, and a third geographical area consisting of the rest of the world, the total number of articles in such an extended count is 13.6% greater than in the standard count, in which each article is counted once. Similarly, the total number of citations in the extended sample is 20.2% greater than in the standard dataset. For further details, see Albarrán et al. (2009).

  8. This measurement issue might affect the important contribution by King (2004), whose Fig. 1 (p. 311) states: “The EU15 total contains some duplication because of papers jointly authored between countries in the EU group.”

  9. For the simultaneous measure of outputs and inputs to the scientific and innovation process, as well as a discussion of productivity indicators, see May (1997, 1998), EC (2003a), King (2004), and Shelton and Holdridge (2004). The latter also includes a review of qualitative methods for the measurement of science and technology consisting of studies of the international stature of research centers in the U.S. and the EU conducted by experts in the corresponding disciplines. For a general discussion of the evolution and shortcomings of science and technology indicators and their use in national policy, see Grupp and Mogee (2004).

  10. There are two types of average-based measures: the impact measures rebased against the world baseline, used inter alia in May (1997), Adams (1998), King (2004), EC (2003a), and Shelton and Holdridge (2004), and the relational charts in Glänzel et al. (2002) that use information—unavailable in our database—about the journals where each country’s articles are published.

  11. The Leiden group also constructs its average-based indicators using information about the journals where each country’s articles are published. This allows them to compare a research group’s observed mean citation rate with the expected behavior of the set of journals where the group is known to publish. The ratio of this expected behavior to the behavior of the journals in the entire field constitutes another interesting indicator.

  12. See also Batty (2003) for a study of the pattern of spatial concentration of highly cited scientists.

  13. The same idea can be found in the study of domestic versus internationally co-authored papers in Glänzel (2000, 2001).

  14. Since the total number of extended articles is greater than the actual number, the sum of the shares in (i) and (ii) over the partition of the world into geographical areas would add up to more than one.

  15. This is also the method followed in the construction of the top 1% of the most highly cited articles in the Web of Science’s Essential Science Indicators.

  16. As before, the sum of such shares at every percentile will not add up to one.

  17. As economists and/or members of Economics Departments, we believe that members of the European Economic Association and many other colleagues in Economics accept the information in the SCI and the SSCI as valid in our field.
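
The extended-count arithmetic in notes 7, 14, and 16 above can be illustrated with a toy sketch (hypothetical data): an internationally co-authored article is counted once in every geographical area involved, so area shares are computed against the standard count and can sum to more than one:

```python
def extended_shares(articles):
    """`articles` is a list of sets of region labels; a co-authored article
    carries every region involved. Shares are taken relative to the standard
    count (each article counted once), so they can sum to more than one."""
    total = len(articles)  # standard count
    regions = set().union(*articles)
    return {r: sum(1 for a in articles if r in a) / total for r in regions}

# Hypothetical: three articles, one co-authored between the U.S. and the EU.
shares = extended_shares([{"US"}, {"EU"}, {"US", "EU"}])
```

Here each area's share is 2/3, so the shares sum to 4/3: the co-authored article is counted in both areas, which is exactly why the extended article count exceeds the standard one.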

References

  • Adams, J. (1998). Benchmarking international research. Nature, 396, 615–618.

  • Adler, R., Ewing, J., & Taylor, P. (2008). Citation statistics. A report from the International Mathematical Union (IMU) in cooperation with the International Council on Industrial and Applied Mathematics (ICIAM) and the Institute of Mathematical Statistics (IMS).

  • Albarrán, P., & Ruiz-Castillo, J. (2009). References made and citations received by scientific articles. Working Paper 09-81, Economics Series 45, Universidad Carlos III.

  • Albarrán, P., Crespo, J., Ortuño, I., & Ruiz-Castillo, J. (2009). A Comparison of the Scientific Performance of the U.S. and Europe at the Turn of the XX Century. Working Paper 09-55, Economics Series 34, Universidad Carlos III (http://www.eco.uc3m.es/personal/cv/jrc.html).

  • Anderson, J., Collins, P., Irvine, J., Isard, P., Martin, B., Narin, F., et al. (1988). On-line approaches to measuring national scientific output: A cautionary tale. Science and Public Policy, 15, 153–161.

  • Archambault, E., Vignola-Gagne, E., Côté, G., Larivière, V., & Gingras, Y. (2006). Benchmarking scientific output in the social sciences and humanities: The limits of existing databases. Scientometrics, 68, 329–342.

  • Batty, M. (2003). The geography of scientific citation. Environment and Planning A, 35, 761–770.

  • Dosi, G., Llerena, P., & Sylos Labini, M. (2006). Science-technology-industry links and the ‘European Paradox’: Some notes on the dynamics of scientific and technological research in Europe. Research Policy, 35, 1450–1464.

  • EC. (1994). First European report on science and technology indicators. Luxembourg: Directorate-General XII, Science, Research, and Development, Office for Official Publications of the European Community.

  • EC. (1997). Second European report on science and technology indicators. Luxembourg: Directorate-General XII, Science, Research, and Development, Office for Official Publications of the European Community.

  • EC. (2002). Key Figures. Towards a European research area. Science, technology, and innovation. Luxembourg: Research Directorate General, Office for Official Publications of the European Community.

  • EC. (2003a). Third European report on science and technology indicators. Directorate-General for Research. Luxembourg: Office for Official Publications of the European Community. http://www.cordis.lu/rtd2002/indicators/home.html.

  • EC. (2003b). From ‘European Paradox’ to declining competitiveness? Snapshots, 4. In Key Figures 2003/2004, Directorate-General for Research. Luxembourg: Office for Official Publications of the European Community, http://cordis.europa.eu/indicators/publications.htm.

  • Glänzel, W. (2000). Science in Scandinavia: A bibliometric approach. Scientometrics, 48, 121–150.

  • Glänzel, W. (2001). National characteristics in international scientific co-authorship relations. Scientometrics, 51, 69–115.

  • Glänzel, W., Schubert, A., & Braun, T. (2002). A relational charting approach to the world of basic research in twelve science fields at the end of the second millennium. Scientometrics, 55, 335–348.

  • Grupp, H., & Mogee, M. E. (2004). Indicators for National Science and Technology Policy: How robust are composite indicators? Research Policy, 33, 1373–1384.

  • Katz, J. S. (2000). Scale-independent indicators and research evaluation. Science and Public Policy, 27, 23–36.

  • King, D. (2004). The scientific impact of nations. Nature, 430, 311–316.

  • Leydesdorff, L., & Wagner, C. (2009). Is the United States losing ground in science? A global perspective on the world science system. Scientometrics, 78, 23–36.

  • May, R. (1997). The scientific wealth of nations. Science, 275, 793–796.

  • May, R. (1998). The scientific investments of nations. Science, 281, 879–880.

  • Moed, H. F., & van Raan, A. F. J. (1988). Indicators of research performance. In A. F. J. van Raan (Ed.), Handbook of quantitative studies of science and technology (pp. 177–192). Amsterdam: North-Holland.

  • Moed, H. F., Burger, W. J., Frankfort, J. G., & van Raan, A. F. J. (1985). The use of bibliometric data for the measurement of university research performance. Research Policy, 14, 131–149.

  • Moed, H. F., De Bruin, R. E., & van Leeuwen, Th. N. (1995). New bibliometric tools for the assessment of national research performance: Database description, overview of indicators, and first applications. Scientometrics, 33, 381–422.

  • Shelton, R., & Holdridge, G. (2004). The US-EU race for leadership of science and technology: Qualitative and quantitative indicators. Scientometrics, 60, 353–363.

  • Tijssen, R., & van Leeuwen, T. (2003). Bibliometric analysis of world science. Extended technical annex to chapter V of EC (2003a).

  • Van Leeuwen, T., Moed, H., Tijssen, R., Visser, M., & van Raan, A. (2001). Language biases in the coverage of the Science Citation Index and its consequences for international comparisons of national research performance. Scientometrics, 51, 335–346.

  • Van Leeuwen, T., Visser, M., Moed, H., Nederhof, T., & van Raan, A. (2003). The holy grail of science policy: Exploring and combining bibliometric tools in search of scientific excellence. Scientometrics, 57, 257–280.

Acknowledgements

The authors acknowledge financial support from the Spanish MEC through grants SEJ2007-63098, SEJ2006-05710, SEJ2007-67135, and SEJ2007-67436. The database of Thomson Scientific (formerly Thomson-ISI; Institute for Scientific Information) has been acquired with funds from Santander Universities Global Division of Banco Santander. This paper is part of the SCIFI-GLOW Collaborative Project supported by the European Commission’s Seventh Research Framework Programme, Contract number SSH7-CT-2008-217436. Suggestions by a referee helped to improve a previous version of the paper.

Author information

Correspondence to Javier Ruiz-Castillo.

About this article

Cite this article

Albarrán, P., Crespo, J.A., Ortuño, I. et al. A comparison of the scientific performance of the U.S. and the European Union at the turn of the 21st century. Scientometrics 85, 329–344 (2010). https://doi.org/10.1007/s11192-010-0223-7
