Scientometrics, Volume 57, Issue 2, pp 257–280

The Holy Grail of science policy: Exploring and combining bibliometric tools in search of scientific excellence

  • Thed N. Van Leeuwen
  • Martijn S. Visser
  • Henk F. Moed
  • Ton J. Nederhof
  • Anthony F. J. Van Raan

Abstract

Evaluation studies of scientific performance conducted during the past years increasingly focus on the identification of research of the 'highest quality', 'top' research, or 'scientific excellence'. This shift in focus has led to the development of new bibliometric methodologies and indicators. Technically, it meant a shift from bibliometric impact scores based on average values, such as the average impact of all papers published by the unit to be evaluated, towards indicators reflecting the top of the citation distribution, such as the number of 'highly cited' or 'top' articles. In this study we present a comparative analysis of a number of standard and new indicators of research performance or 'scientific excellence', using techniques applied in studies conducted by CWTS in recent years. It will be shown that each type of indicator reflects a particular dimension of the general concept of research performance. Consequently, the application of a single indicator may provide an incomplete picture of a unit's performance. It is argued that the various types of indicators need to be combined in order to offer policy makers and evaluators valid and useful assessment tools.
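The distinction drawn above can be illustrated with a small sketch. The code below is not from the paper; it is a hypothetical toy example contrasting a mean-based impact score with a "top of the citation distribution" indicator, where the threshold for a 'highly cited' paper is taken from a reference (e.g. worldwide) citation distribution. Function names, the 10% cutoff, and the data are illustrative assumptions.

```python
def mean_impact(citations):
    """Average citations per paper -- the classic mean-based indicator."""
    return sum(citations) / len(citations)

def top_paper_count(citations, reference_set, top_share=0.10):
    """Number of the unit's papers at or above the citation threshold
    marking the top `top_share` of a reference distribution."""
    ranked = sorted(reference_set, reverse=True)
    cutoff_index = max(0, int(len(ranked) * top_share) - 1)
    threshold = ranked[cutoff_index]
    return sum(1 for c in citations if c >= threshold)

# Skewed toy data: a single highly cited paper dominates the unit's mean.
world = [0, 0, 1, 1, 2, 2, 3, 4, 5, 8, 12, 20, 35, 60, 150]
unit = [0, 1, 2, 3, 150]

print(mean_impact(unit))            # mean pulled up by one outlier
print(top_paper_count(unit, world)) # only one paper is 'highly cited'
```

Because citation distributions are heavily skewed (cf. Seglen's "skewness of science"), the two indicators can tell quite different stories about the same unit, which is why the study argues for combining them.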



Copyright information

© Kluwer Academic Publishers/Akadémiai Kiadó 2003

Authors and Affiliations

  • Thed N. Van Leeuwen (1)
  • Martijn S. Visser (1)
  • Henk F. Moed (1)
  • Ton J. Nederhof (1)
  • Anthony F. J. Van Raan (1)

  1. Centre for Science and Technology Studies (CWTS), Leiden University, Leiden, The Netherlands
