
Scientometrics, Volume 95, Issue 1, pp 311–324

National peer-review research assessment exercises for the hard sciences can be a complete waste of money: the Italian case

  • Giovanni Abramo
  • Tindaro Cicero
  • Ciriaco Andrea D’Angelo

Abstract

There has been ample demonstration that bibliometrics is superior to peer-review for national research assessment exercises in the hard sciences. In this paper we examine the Italian case, taking the 2001–2003 university performance ranking list based on bibliometrics as a benchmark. We compare the accuracy of the first national evaluation exercise, conducted entirely by peer-review, against other ranking lists prepared at zero cost, based on indicators indirectly linked to performance or freely available on the Internet. The results show that, for the hard sciences, the costs of conducting the Italian evaluation of research institutions could have been avoided entirely.

Keywords

Research evaluation · Bibliometrics · VTR · Ranking · Productivity · Universities


Copyright information

© Akadémiai Kiadó, Budapest, Hungary 2012

Authors and Affiliations

  • Giovanni Abramo (1, 2, 3)
  • Tindaro Cicero (2)
  • Ciriaco Andrea D’Angelo (2)

  1. Institute for System Analysis and Computer Science (IASI-CNR), National Research Council of Italy, Rome, Italy
  2. Laboratory for Studies of Research and Technology Transfer, Department of Management, School of Engineering, University of Rome “Tor Vergata”, Rome, Italy
  3. Laboratory for Studies of Research and Technology Transfer, Dipartimento di Ingegneria dell’Impresa, Università degli Studi di Roma “Tor Vergata”, Rome, Italy
