
A discussion of measuring the top-1% most-highly cited publications: quality and impact of Chinese papers


The top-1% most-highly-cited articles are watched closely as the vanguard of the sciences. Using Web of Science data, one finds that China overtook the USA in relative participation in the top-1% (PP-top1%) in 2019, after outcompeting the EU on this indicator in 2015. This finding, however, contrasts with repeated reports from Western agencies that the quality of China’s scientific output lags behind that of other advanced nations, even though it has caught up in numbers of articles. The difference between the results presented here and the previous ones stems mainly from field normalizations, which classify source journals by discipline. Average citation rates of these subsets are commonly used as a baseline for comparison among disciplines. However, the expected value of the top-1% of a sample of N papers is N / 100, ceteris paribus. Using average citation rates as expected values introduces errors because (1) the mean of a highly skewed distribution is a poor summary statistic, and (2) the delineations of the subsets carry a specious precision. Classifications can be used for the decomposition, but not for the normalization. When the data are thus decomposed, the USA ranks ahead of China in biomedical fields such as virology. Although the number of papers is smaller, China outperforms the USA in the field of Business and Finance (in the Social Sciences Citation Index; p < .05). Using percentile ranks, subsets other than indexing-based classifications can be tested for the statistical significance of differences among them.
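The baseline invoked above can be made concrete: in a sample of N papers, the expected number in the top-1% is N/100, and the normal approximation to the binomial gives a rough significance check on an observed count. A minimal sketch with invented numbers (the function name and counts are illustrative, not data from the paper):

```python
import math

def top1_z(observed, n, p=0.01):
    """z-score of an observed top-1% count against the binomial
    expectation n * p, using the normal approximation."""
    expected = n * p                    # ceteris paribus: N / 100
    sd = math.sqrt(n * p * (1 - p))     # binomial standard deviation
    return (observed - expected) / sd

# Hypothetical: 130 top-1% papers observed in a set of 10,000
# (expected 100); z ≈ 3.0, well beyond the 1.96 threshold.
z = top1_z(130, 10_000)
```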



  1. The National Science Board of the United States establishes the policies of the National Science Foundation and serves as advisor to Congress and the President. NSB’s biennial report—Science and Engineering Indicators (SEI)—provides comprehensive information on the nation’s science and engineering (S&E) enterprise.

  2. The “Flagship Collection” of WoS includes the Science Citation Index-Expanded (SCIE), the Social Sciences Citation Index (SSCI), the Arts & Humanities Citation Index (A&HCI), and the Emerging Sources Citation Index (ESCI).

  3. Interestingly, since 2017, the differences between the EU with and without the UK have not been statistically significant (z ≤ 1.96; p > .05).

  4. The Institute for Scientific Information (ISI) in Philadelphia (PA) was the owner and producer of the Science Citation Indexes at the time.

  5. One can formalize I3 as follows:

    $$I3 = \sum\nolimits_{i = 1}^{C} {f(X_{i}) \times X_{i} } \quad (1)$$

    where Xi denotes the value of the i-th percentile-rank class and f(Xi) the frequency of papers in that class, with i = [1, C] running over the percentile-rank classes.
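Eq. (1) can be illustrated with a toy computation. The six percentile-rank classes and the frequencies below are hypothetical, chosen only to show the weighted sum; they are not data from the paper:

```python
# I3 = Σ f(X_i) · X_i over percentile-rank classes i = 1..C
ranks = [1, 50, 75, 90, 95, 99]    # X_i: percentile-rank class values
counts = [10, 40, 25, 12, 8, 5]    # f(X_i): number of papers per class
i3 = sum(f * x for f, x in zip(counts, ranks))
print(i3)  # → 6220: papers weighted by their percentile ranks
```

Unlike a citation average, I3 rewards both the number of papers and how highly they rank, which is why it remains meaningful for skewed citation distributions.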

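The significance test mentioned in footnote 3 (z ≤ 1.96 corresponding to p > .05, two-sided) is the standard pooled two-proportion z-test. A self-contained sketch with invented counts (the numbers are illustrative, not results from the paper):

```python
import math

def two_proportion_z(k1, n1, k2, n2):
    """Pooled two-proportion z-statistic, e.g. for comparing the
    top-1% shares (PP-top1%) of two publication sets."""
    p1, p2 = k1 / n1, k2 / n2
    p = (k1 + k2) / (n1 + n2)                        # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))  # standard error
    return (p1 - p2) / se

# Invented counts of top-1% papers and set sizes for two aggregates
z = two_proportion_z(1200, 100_000, 1150, 100_000)
significant = abs(z) > 1.96   # two-sided test at the .05 level
```

With these particular counts the shares (1.20% vs. 1.15%) do not differ significantly, even though the sets are large.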

  • Ahlgren, P., Persson, O., & Rousseau, R. (2014). An approach for efficient online identification of the top-k percent most cited documents in large sets of Web of Science documents. ISSI Newsletter, 10(4), 81–89.

  • Archambault, É., Beauchesne, O. H., & Caruso, J. (2011). Towards a multilingual, comprehensive and open scientific journal ontology. In Proceedings of the 13th international conference of the International Society for Scientometrics and Informetrics (pp. 66–77). Durban.

  • Bensman, S. J. (2007). Garfield and the impact factor. Annual Review of Information Science and Technology, 41(1), 93–155.

  • Bornmann, L., De Moya Anegón, F., & Leydesdorff, L. (2010). Do scientific advancements lean on the shoulders of giants? A bibliometric investigation of the Ortega hypothesis. PLoS ONE, 5(10).

  • Bornmann, L., & Mutz, R. (2011). Further steps towards an ideal method of measuring citation performance: The avoidance of citation (ratio) averages in field-normalization. Journal of Informetrics, 5(1), 228–230.

  • Boyack, K. W., Klavans, R., & Börner, K. (2005). Mapping the backbone of science. Scientometrics, 64(3), 351–374.

  • Braun, T., Glänzel, W., & Schubert, A. (1989). National publication patterns and citation impact in the multidisciplinary journals Nature and Science. Scientometrics, 17(1–2), 11–14.

  • Gauffriau, M., & Leino, Y. (2020). Understanding SciVal’s calculation of field-weighted percentile indicators. The Bibliomagician.

  • Glänzel, W., & Schubert, A. (2003). A new classification scheme of science fields and subfields designed for scientometric evaluation purposes. Scientometrics, 56(3), 357–367.

  • Griliches, Z. (1994). Productivity, R&D and the data constraint. American Economic Review, 84(1), 1–23.

  • Jin, B., & Rousseau, R. (2004). Evaluation of research performance and scientometric indicators in China. In H. F. Moed, W. Glänzel, & U. Schmoch (Eds.), Handbook of quantitative science and technology research (pp. 497–514). Kluwer Academic Publishers.

  • Jonkers, K., Fako, P., Goenaga, X., & Wagner, C. S. (2021). China overtakes the EU in high impact science. Publications Office of the European Union.

  • King, D. (2004). The scientific impact of nations. Nature, 430, 311–316.

  • Klavans, R., & Boyack, K. (2009). Towards a consensus map of science. Journal of the American Society for Information Science and Technology, 60(3), 455–476.

  • Leydesdorff, L. (1988). Problems with the ‘measurement’ of national scientific performance. Science and Public Policy, 15(3), 149–152.

  • Leydesdorff, L. (2006). Can scientific journals be classified in terms of aggregate journal-journal citation relations using the Journal Citation Reports? Journal of the American Society for Information Science and Technology, 57(5), 601–613.

  • Leydesdorff, L., & Bornmann, L. (2011). Integrated impact indicators compared with impact factors: An alternative research design with policy implications. Journal of the American Society for Information Science and Technology, 62(11), 2133–2146.

  • Leydesdorff, L., & Bornmann, L. (2012). Testing differences statistically with the Leiden ranking. Scientometrics, 92(3), 781–783.

  • Leydesdorff, L., & Bornmann, L. (2016). The operationalization of “fields” as WoS subject categories (WCs) in evaluative bibliometrics: The cases of “library and information science” and “science & technology studies.” Journal of the Association for Information Science and Technology, 67(3), 707–714.

  • Leydesdorff, L., Bornmann, L., & Adams, J. (2019). The integrated impact indicator revisited (I3*): A non-parametric alternative to the journal impact factor. Scientometrics, 119(3), 1669–1694.

  • Leydesdorff, L., Wagner, C. S., & Bornmann, L. (2014). The European Union, China, and the United States in the top-1% and top-10% layers of most-frequently cited publications: Competition and collaborations. Journal of Informetrics, 8(3), 606–617.

  • Milojević, S. (2020). Practical method to reclassify Web of Science articles into unique subject categories and broad disciplines. Quantitative Science Studies, 1(1), 183–206.

  • Moed, H. (2002). Measuring China’s research performance using the Science Citation Index. Scientometrics, 53(3), 281–296.

  • Narin, F. (1976). Evaluative bibliometrics: The use of publication and citation analysis in the evaluation of scientific activity. National Science Foundation.

  • National Science Board. (2020). Research and development: U.S. trends and international comparisons. Retrieved 6 March, 2021, from

  • Opthof, T., & Leydesdorff, L. (2010). Caveats for the journal and field normalizations in the CWTS (“Leiden”) evaluations of research performance. Journal of Informetrics, 4(3), 423–430.

  • Organization for Economic Cooperation and Development. (2015). Frascati Manual. Retrieved September 2021, from

  • Pudovkin, A. I., & Garfield, E. (2002). Algorithmic procedure for finding semantically related journals. Journal of the American Society for Information Science and Technology, 53(13), 1113–1119.

  • Rafols, I., & Leydesdorff, L. (2009). Content-based and algorithmic classifications of journals: Perspectives on the dynamics of scientific communication and indexer effects. Journal of the American Society for Information Science and Technology, 60(9), 1823–1835.

  • Robinson, W. S. (1950). Ecological correlations and the behavior of individuals. American Sociological Review, 15, 351–357.

  • Schubert, A., & Braun, T. (1986). Relative indicators and relational charts for comparative assessment of publication output and citation impact. Scientometrics, 9(5), 281–291.

  • Science-Metrix. (2021). Bibliometric indicators for the Science and Engineering Indicators 2022.

  • Scopus. (2021). What is field-weighted citation impact? Retrieved December 2021, from

  • Sivertsen, G., Rousseau, R., & Zhang, L. (2019). Measuring scientific contributions with modified fractional counting. Journal of Informetrics, 13(2), 679–694.

  • Tijssen, R. J., Visser, M. S., & van Leeuwen, T. N. (2002). Benchmarking international scientific excellence: Are highly cited research papers an appropriate frame of reference? Scientometrics, 54(3), 381–397.

  • van den Besselaar, P., & Sandström, U. (2016). What is the required level of data cleaning? A research evaluation case. Journal of Scientometric Research, 5(1), 7–12.

  • Veugelers, R. (2017). The challenge of China’s rise as a science and technology powerhouse (Bruegel Policy Contribution No. 2017/19).

  • Wagner, C. S. (2008). The new invisible college. Brookings Press.

  • Wagner, C. S., & Leydesdorff, L. (2012). An integrated impact indicator: A new definition of ‘impact’ with policy relevance. Research Evaluation, 21(3), 183–188.

  • Waltman, L., & van Eck, N. J. (2012). A new methodology for constructing a publication-level classification system of science. Journal of the American Society for Information Science and Technology, 63(12), 2378–2392.

  • Waltman, L., van Eck, N. J., van Leeuwen, T. N., Visser, M. S., & van Raan, A. F. J. (2011). Towards a new crown indicator: Some theoretical considerations. Journal of Informetrics, 5(1), 37–47.

  • Zhang, L., & Sivertsen, G. (2020). The new research assessment reform in China and its implementation. Scholarly Assessment Reports, 2(1), 3.

  • Zhou, P., & Leydesdorff, L. (2006). The emergence of China as a leading nation in science. Research Policy, 35(1), 83–104.



We thank Koen Jonkers, Xabier Goenaga, Ronald Rousseau and two anonymous referees for advice, critiques, and suggestions. Loet Leydesdorff is grateful to ISI/Clarivate for JCR data.

Author information



Corresponding author

Correspondence to Caroline S. Wagner.


About this article


Cite this article

Wagner, C.S., Zhang, L. & Leydesdorff, L. A discussion of measuring the top-1% most-highly cited publications: quality and impact of Chinese papers. Scientometrics 127, 1825–1839 (2022).

Keywords

  • Citation analysis
  • National standing
  • International cooperation
  • Science
  • Technology
  • United States
  • China