Scientometrics, Volume 76, Issue 3, pp 527–541

Objective assessment of scientific performances world-wide

Abstract

To identify indicators with world-wide standards for assessing scientific performance, at the level of both individuals and institutions and normalized across disciplines, we carried out a comparative analysis of the relative scientific and technological level of individual scientists and individual scientific institutions competing internationally in given fields. We used alternative indicators, all based on the number of publications in international SCI journals and on those journals' impact factors, properly weighted for journal ranking, number of coauthors, and discipline using deciles. Contrary to some gloomy opinions, this study suggests that interesting conclusions can be drawn from these indicators. The chosen indicators, tested world-wide in real situations, appear capable of effectively and objectively assessing institutions as well as individual university professors and researchers, and prove quite significant. They could be used to provide computer-assisted evaluation criteria for maintaining or upgrading a given position, maintaining or closing public institutions, and filtering grant applications.
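The abstract does not reproduce the paper's exact weighting formula. As an illustration only, the sketch below shows one plausible shape for a publication-based indicator of the kind described: each paper is credited by its journal's impact factor, down-weighted by the journal's decile rank within its discipline (a simple cross-field normalization) and divided by the number of coauthors (fractional credit). The class name, field names, and the specific decile factor are assumptions, not the authors' method.

```python
from dataclasses import dataclass

@dataclass
class Publication:
    impact_factor: float   # journal impact factor (e.g. from Journal Citation Reports)
    n_authors: int         # total number of coauthors on the paper
    journal_decile: int    # journal's decile within its discipline: 1 (top) .. 10 (bottom)

def publication_weight(pub: Publication) -> float:
    """Credit one paper, normalized for discipline and coauthorship.

    The decile rank normalizes impact factors across disciplines with very
    different citation habits; dividing by the number of coauthors shares
    the credit fractionally among them.
    """
    decile_factor = (11 - pub.journal_decile) / 10  # 1.0 for top decile, 0.1 for bottom
    return pub.impact_factor * decile_factor / pub.n_authors

def performance_score(pubs: list[Publication]) -> float:
    """Aggregate indicator for a scientist or an institution: sum of paper credits."""
    return sum(publication_weight(p) for p in pubs)
```

For example, a two-author paper in a top-decile journal with impact factor 10 contributes 10 × 1.0 / 2 = 5.0, while a five-author paper in a bottom-decile journal with impact factor 5 contributes only 5 × 0.1 / 5 = 0.1; summing such credits over a publication list yields a single comparable score.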



Copyright information

© Springer Science+Business Media B.V. 2008

Authors and Affiliations

  1. Fondazione EL.B.A., Rome, Italy
  2. CIRSDNNOB-Nanoworld Institute and Eminent Chair of Biophysics, University of Genoa, Genoa, Italy