Scientometrics, Volume 108, Issue 2, pp 917–935

Research assessment using early citation information


Abstract

Peer-review-based research assessment, as implemented in Australia, the United Kingdom, and some other countries, is a very costly exercise. We show that university rankings in economics based on long-run citation counts can be easily predicted using early citations. This would allow a research assessment exercise to predict the relative long-run impact of articles published by a university immediately at the end of the evaluation period. We compare these citation-based university rankings with the rankings of the 2010 Excellence in Research for Australia (ERA) assessment and the 2008 Research Assessment Exercise in the United Kingdom. Rank correlations are quite strong, though there are some differences between the rankings. Nevertheless, if assessors are willing to use citation analysis to assess some disciplines, as is already the case for the natural sciences and psychology in Australia, it seems reasonable to include economics in that set as well.
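The comparison the abstract describes can be illustrated in a few lines. The following is a minimal sketch with invented data, not the authors' actual procedure: it ranks a set of hypothetical universities by early and by long-run citation counts and reports the Spearman rank correlation between the two implied rankings, the type of statistic the abstract summarizes as "quite strong".

```python
# A minimal sketch, with invented data, of the kind of comparison the
# abstract describes; it is NOT the paper's actual procedure.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(42)

# Hypothetical citation counts for eight universities' article portfolios:
# "early" = citations accrued by the end of the evaluation period,
# "long_run" = citations accrued over a much longer window. That long-run
# counts roughly scale up early counts (plus noise) is the premise the
# paper tests, not an established fact encoded here.
universities = [f"University {c}" for c in "ABCDEFGH"]
early = rng.poisson(lam=50, size=len(universities))
long_run = early * rng.uniform(4.0, 6.0, size=len(universities))

# Spearman's rho measures agreement between the two implied rankings
# (1.0 would mean the early ranking reproduces the long-run ranking exactly).
rho, p_value = spearmanr(early, long_run)
print(f"Spearman rank correlation: {rho:.2f} (p = {p_value:.3f})")
```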

Keywords

Citations · Research assessment · Bibliometrics

JEL Classification

A14 · H83 · I23

Notes

Acknowledgments

We thank Guido Bünstorf and an anonymous referee for valuable comments and Andreas Rehs and Immanuel Bachem for helpful research assistance.

Compliance with ethical standards

Conflict of interest

The authors declare that they have no conflict of interest.


Copyright information

© Akadémiai Kiadó, Budapest, Hungary 2016

Authors and Affiliations

  1. Meta-Research in Economics Group and INCHER, University of Kassel, Kassel, Germany
  2. Crawford School of Public Policy, The Australian National University, Acton, Australia
