Objective assessment of scientific performances world-wide


To identify indicators meeting world-wide standards for assessing scientific performance, at the level of both individuals and institutions and normalized across disciplines, we carried out a comparative analysis of the relative scientific and technological level of individual scientists and individual scientific institutions competing internationally in given fields. We used alternative indicators, all based on the number of publications in international SCI journals and on the impact factors of those journals, properly weighted for author position, number of coauthors, and discipline using deciles. Contrary to some gloomy opinions, this study suggests that interesting conclusions can be drawn from the above indicators. The chosen indicators, tested world-wide in real situations, proved capable of effectively and objectively assessing institutions as well as individual university professors and researchers. They could therefore be used to provide computer-assisted evaluation criteria for maintaining or upgrading a given position, maintaining or closing public institutions, and filtering grant applications.
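The kind of composite indicator described above can be sketched in code: each publication contributes its journal impact factor, weighted for the scientist's author position and divided by the number of coauthors, and the resulting score is normalized within a discipline by decile rank. The specific position weights, the equal division among coauthors, and the decile formula below are illustrative assumptions, not the exact formula used in the paper.

```python
from dataclasses import dataclass

@dataclass
class Publication:
    impact_factor: float   # journal impact factor (e.g. from the JCR)
    author_position: int   # 1 = first author
    n_authors: int         # total number of authors on the paper

def position_weight(position: int, n_authors: int) -> float:
    """Assumed weighting: first and last authors count fully, others half."""
    return 1.0 if position in (1, n_authors) else 0.5

def raw_score(pubs: list[Publication]) -> float:
    """Impact-factor sum, weighted for position and shared among coauthors."""
    return sum(
        p.impact_factor * position_weight(p.author_position, p.n_authors) / p.n_authors
        for p in pubs
    )

def decile(score: float, discipline_scores: list[float]) -> int:
    """Decile rank (1-10) of a score within its discipline's distribution."""
    below = sum(1 for s in discipline_scores if s <= score)
    return min(10, below * 10 // len(discipline_scores) + 1)

# Example: two papers, then rank the scientist against a discipline sample.
pubs = [Publication(5.0, 1, 2), Publication(2.0, 2, 4)]
score = raw_score(pubs)   # 5.0 * 1.0 / 2 + 2.0 * 0.5 / 4 = 2.75
rank = decile(score, [0.5, 1.0, 2.0, 2.75, 3.0])
```

The decile step is what makes the indicator comparable across disciplines: a physicist and a historian are each ranked only against the score distribution of their own field.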




Author information



Corresponding author

Correspondence to Claudio Nicolini.



Cite this article

Nicolini, C., Nozza, F. Objective assessment of scientific performances world-wide. Scientometrics 76, 527–541 (2008).


Keywords

  • Impact Factor
  • Objective Assessment
  • Journal Citation Report
  • Bibliometric Indicator
  • Scientific Performance