
Research assessment using early citation information

Abstract

Peer-review-based research assessment, as implemented in Australia, the United Kingdom, and some other countries, is a very costly exercise. We show that university rankings in economics based on long-run citation counts can be easily predicted using early citations. This would allow a research assessment to predict, immediately at the end of the evaluation period, the relative long-run impact of the articles published by each university. We compare these citation-based university rankings with the rankings of the 2010 Excellence in Research for Australia (ERA) assessment and the 2008 Research Assessment Exercise in the United Kingdom. The rank correlations are quite strong, though there are some differences between the rankings. If assessors are willing to consider citation analysis to assess some disciplines, as is already the case for the natural sciences and psychology in Australia, it seems reasonable to include economics in that set as well.
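To make the exercise concrete, the following is a minimal sketch of this kind of prediction on synthetic data, assuming a simple log-linear model linking early to long-run citations; the paper's actual data, specification, and estimation details may differ.

```python
# Minimal sketch (synthetic data): predict long-run citation counts from
# early citations, aggregate to university level, and compare rankings.
# The model and all numbers here are illustrative assumptions.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)

n = 500
early = rng.poisson(5, n).astype(float)                  # citations in first two years
long_run = np.round(early * rng.lognormal(1.0, 0.5, n))  # hypothetical long-run counts
university = rng.integers(0, 20, n)                      # affiliation of each article

# Fit log(1 + long-run) on log(1 + early) by OLS, then predict per article.
X = np.column_stack([np.ones(n), np.log1p(early)])
beta, *_ = np.linalg.lstsq(X, np.log1p(long_run), rcond=None)
predicted = np.expm1(X @ beta)

# Sum predicted and actual long-run citations by university and compare ranks.
pred_totals = np.bincount(university, weights=predicted)
true_totals = np.bincount(university, weights=long_run)
rho, _ = spearmanr(pred_totals, true_totals)
print(f"Spearman rank correlation of university rankings: {rho:.2f}")
```

On data like these, the predicted and realized university rankings correlate strongly, which is the pattern the paper documents for actual early and long-run citation counts.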

Notes

  1. A variety of research assessment models are in place in different countries (Key Perspectives 2009). Research assessment exercises in other countries use different combinations of peer review and bibliometric analysis. For example, the Italian Evaluation of Research Quality must peer review at least half of the submitted research items (Bertocchi et al. 2015). In the U.S., the National Research Council carries out a periodic assessment of doctoral programs. The 2011 assessment (Ostriker et al. 2011) covered 5000 doctoral programs at 212 universities. The most recent assessment aggregated various quantitative metrics, including citation counts, using weights derived from a survey of faculty on the importance of the various metrics (see the sketch after these notes).

  2. In New Zealand, though, individual researchers are assessed (Anderson and Tressler 2014).

  3. Of course, this effect will also apply to many other ways of aggregating publications, including random samples of publications.

  4. The REF, and previously the RAE, assesses only four publications for each submitted researcher. The submitting university chooses both which researchers and which of their publications to submit.

  5. The vast majority of bibliometric research uses the Web of Science as its data source. One reason for this is that it allows researchers to easily download the results of searches as data files. These data include year-by-year citation counts for each article. Though Google Scholar covers a wider range of citing and cited sources, it is very noisy, with many misidentified publications and citations. Constructing a Google Scholar data set for a discipline in a country would be a very labor-intensive process. Scopus is also less user-friendly than the Web of Science; for example, one cannot search by discipline in Scopus.

  6. In ERA 2010 and 2012, publications assigned to four-digit fields of research (e.g. economic theory or econometrics) with fewer than fifty publications in total were not assessed. In ERA 2015, these were assessed as part of the two-digit field of research (e.g. economics), even though the four-digit field itself was not assessed. This seems to be a move to reduce gaming of the system by assigning weak publications to four-digit codes that would then not be assessed.

  7. RAE 2008 included publications published from 2001 to 2007 inclusive by researchers affiliated with eligible institutions on 31 October 2007 and included by their university in its submission. The 2010 ERA included publications published from 2003 to 2008 inclusive by researchers affiliated with eligible institutions on 31 March 2009.
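As a concrete illustration of the metric aggregation described in note 1, here is a minimal sketch that standardizes several program-level metrics and combines them with survey-derived weights; the metric names, values, and weights are hypothetical, not those of the actual NRC assessment.

```python
# Illustrative weighted aggregation of program-level metrics.
# All metric names, values, and weights below are hypothetical.
import numpy as np

metrics = {  # per-program values for three hypothetical metrics
    "citations_per_faculty": np.array([12.0, 30.0, 8.0]),
    "publications_per_faculty": np.array([2.1, 4.5, 1.2]),
    "phd_completion_rate": np.array([0.55, 0.70, 0.40]),
}
weights = {  # e.g., importance weights derived from a faculty survey
    "citations_per_faculty": 0.5,
    "publications_per_faculty": 0.3,
    "phd_completion_rate": 0.2,
}

def zscore(x):
    """Standardize a metric so that weights act on comparable scales."""
    return (x - x.mean()) / x.std()

# Weighted composite score for each program; higher = stronger on these metrics.
composite = sum(w * zscore(metrics[m]) for m, w in weights.items())
print(composite)
```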

References

  • Adams, J. (2005). Early citation counts correlate with accumulated impact. Scientometrics, 63(3), 567–581.

  • Anderson, D. L., & Tressler, J. (2014). The New Zealand performance based research fund and its impact on publication activity in economics. Research Evaluation, 23(1), 1–11.

  • Bertocchi, G., Gambardella, A., Jappelli, T., Nappi, C. A., & Peracchi, F. (2015). Bibliometric evaluation vs. informed peer review: Evidence from Italy. Research Policy, 44(2), 451–466.

  • Bornmann, L. (2011). Scientific peer review. Annual Review of Information Science and Technology, 45, 199–245.

  • Bornmann, L. (2015). How much does the expected number of citations for a publication change if it contains the address of a specific scientific institute? A new approach for the analysis of citation data on the institutional level based on regression models. Journal of the Association for Information Science and Technology. doi:10.1002/asi.23546.

  • Bornmann, L., & Leydesdorff, L. (2014). Scientometrics in a changing research landscape. EMBO Reports, 15(12), 1228–1232.

  • Clerides, S., Pashardes, P., & Polycarpou, A. (2011). Peer review vs metric-based assessment: Testing for bias in the RAE ratings of UK economics departments. Economica, 78(311), 565–583.

  • Colman, A. M., Dhillon, D., & Coulthard, B. (1995). A bibliometric evaluation of the research performance of British university politics departments: Publications in leading journals. Scientometrics, 32(1), 49–66.

  • Department for Business, Innovation & Skills and Johnson, J. (2015). Press release: Government launches review to improve university research funding. https://www.gov.uk/government/news/government-launches-review-to-improve-university-research-funding.

  • Farla, K., & Simmonds, P. (2015). REF accountability review: Costs, benefits and burden. Report by Technopolis to the four UK higher education funding bodies.

  • Gallo, S. A., Carpenter, A. S., Irwin, D., McPartland, C. D., Travis, J., Reynders, S., et al. (2014). The validation of peer review through research impact measures and the implications for funding strategies. PLoS ONE, 9(9), e106474.

  • HEFCE. (2015). The Metric Tide: Correlation analysis of REF2014 scores and metrics (Supplementary Report II to the Independent Review of the Role of Metrics in Research Assessment and Management). Higher Education Funding Council for England. doi:10.13140/RG.2.1.3362.4162.

  • Holgate, S. T. (2015). A comment on “Scientometrics in a changing research landscape”. EMBO Reports, 16(3), 261.

  • Hudson, J. (2013). Ranking journals. Economic Journal, 123, F202–F222.

  • Im, K. S., Pesaran, M. H., & Shin, Y. (2003). Testing for unit roots in heterogeneous panels. Journal of Econometrics, 115, 53–74.

  • Johnston, J., Reeves, A., & Talbot, S. (2014). Has economics become an elite subject for elite UK universities? Oxford Review of Education, 40(5), 590–609.

  • Kenna, R., & Berche, B. (2011). Critical mass and the dependency of research quality on group size. Scientometrics, 86(2), 527–540.

  • Key Perspectives Ltd. (2009). A comparative review of research assessment regimes in five countries and the role of libraries in the research assessment process: A pilot study. Dublin: OCLC Research.

  • Levitt, J. M., & Thelwall, M. (2011). A combined bibliometric indicator to predict article impact. Information Processing and Management, 47, 300–308.

  • Moed, H. (2010). Measuring contextual citation impact of scientific journals. Journal of Informetrics, 4(3), 265–277.

  • Mryglod, O., Kenna, R., Holovatch, Y., & Berche, B. (2013). Comparison of a citation-based indicator and peer review for absolute and specific measures of research-group excellence. Scientometrics, 97, 767–777.

  • Mryglod, O., Kenna, R., Holovatch, Y., & Berche, B. (2015). Predicting results of the Research Excellence Framework using departmental h-index. Scientometrics, 102(3), 2165–2180.

  • Neri, F., & Rodgers, J. (2015). The contribution of Australian academia to the world’s best economics research: 2001 to 2010. Economic Record, 91(292), 107–124.

  • Norris, M., & Oppenheim, C. (2003). Citation counts and the research assessment exercise V: Archaeology and the 2001 RAE. Journal of Documentation, 59(6), 709–730.

  • Oppenheim, C. (1996). Do citations count? Citation indexing and the research assessment exercise. Serials, 9, 155–161.

  • Ostriker, J. P., Kuh, C. V., & Voytuk, J. A. (Eds.). (2011). A data-based assessment of research-doctorate programs in the United States. Committee to Assess Research-Doctorate Programs, National Research Council.

  • Regibeau, P., & Rockett, K. E. (2014). A tale of two metrics: Research assessment vs. recognized excellence. University of Essex, Department of Economics, Discussion Paper Series 757.

  • Sayer, D. (2014). Rank hypocrisies: The insult of the REF. Thousand Oaks: Sage.

  • Sgroi, D., & Oswald, A. J. (2013). How should peer-review panels behave? Economic Journal, 123, F255–F278.

  • Stern, D. I. (2014). High-ranked social science journal articles can be identified from early citation information. PLoS ONE, 9(11), e112520.

  • Süssmuth, B., Steininger, M., & Ghio, S. (2006). Towards a European economics of economics: Monitoring a decade of top research and providing some explanation. Scientometrics, 66(3), 579–612.

  • Waltman, L., Calero-Medina, C., Kosten, J., Noyons, E. C. M., Tijssen, R. J. W., Van Eck, N. J., et al. (2012). The Leiden Ranking 2011/2012: Data collection, indicators, and interpretation. Journal of the American Society for Information Science and Technology, 63, 2419–2432.

  • Waltman, L., & van Eck, N. J. (2013). Source normalized indicators of citation impact: An overview of different approaches and an empirical comparison. Scientometrics, 96(3), 699–716.

  • Waltman, L., van Eck, N. J., van Leeuwen, T. N., Visser, M. S., & van Raan, A. F. J. (2011). Towards a new crown indicator: An empirical analysis. Scientometrics, 87, 467–481.

  • Wang, J. (2013). Citation time window choice for research impact evaluation. Scientometrics, 94(3), 851–872.

  • Wang, D., Song, C., & Barabási, A.-L. (2013). Quantifying long-term scientific impact. Science, 342, 127–132.

  • Wilsdon, J., et al. (2015). The metric tide: Report of the independent review of the role of metrics in research assessment and management. Higher Education Funding Council for England. doi:10.13140/RG.2.1.4929.1363.

  • Wooding, S., van Leeuwen, T. N., Parks, S., Kapur, S., & Grant, J. (2015). UK doubles its “world-leading” research in life sciences and medicine in six years: Testing the claim? PLoS ONE, 10(7), e0132990.

Acknowledgments

We thank Guido Bünstorf and an anonymous referee for valuable comments and Andreas Rehs and Immanuel Bachem for helpful research assistance.

Author information

Correspondence to Stephan B. Bruns.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

About this article

Cite this article

Bruns, S.B., Stern, D.I. Research assessment using early citation information. Scientometrics 108, 917–935 (2016). https://doi.org/10.1007/s11192-016-1979-1
