Volume 62, Issue 1, pp 117–131

Impact of bibliometrics upon the science system: Inadvertent consequences?

  • Peter Weingart


Ranking of research institutions by bibliometric methods is an improper tool for research performance evaluation, even at the level of large institutions. The problem, however, is not ranking as such. The indicators used for ranking are often insufficiently advanced, and this is part of a broader problem: underdeveloped bibliometric indicators are applied by persons who lack competence and experience in the quantitative study of science. After a brief overview of the basic elements of bibliometric analysis, I discuss the major technical and methodological problems in applying publication and citation data in the context of evaluation. I then contend that the core of the problem does not necessarily lie on the side of the data producer. Quite often those responsible for research performance evaluation — scientists themselves in their role as heads of institutions and departments, science administrators at the government level, and other policy makers — show an attitude that encourages 'quick and dirty' bibliometric analyses even though better-quality analyses are available. Finally, the conditions necessary for the successful application of advanced bibliometric indicators as a support tool for peer review are discussed.





Copyright information

© Springer-Verlag/Akadémiai Kiadó 2005

Authors and Affiliations

  • Peter Weingart
    1. Institute for Science & Technology Studies, University of Bielefeld
