Abstract
Bibliometrics are often used as key indicators when evaluating academic groups and individual researchers in biomedical research. Citation metrics, when used as indicators of research performance, require accurate benchmarking against homogeneous groups. This study describes the research performance of academic departments in the University of Toronto’s Faculty of Medicine using article-level bibliometrics for scientific papers published from 2008 to 2012. Eligible publications of all academic faculty members were verified from each researcher’s curriculum vitae and Web of Science® (Thomson Reuters). For 3792 researchers, we identified 26,845 unique papers with 79,502 authors published from 2008 to 2012. The overall mean citations per paper for the faculty was 17.35. The academic departments with the highest levels of collaboration and interdisciplinary research activity also had the highest research impact. The citation window for biomedical scientific papers remained active 5 years after publication, longer than previously reported, which may limit the reliable use of bibliometrics when evaluating recent scientific publications in biomedical research.
Acknowledgments
The authors wish to acknowledge the contribution to this project of Cathy Grilo, Special Projects and Evaluation Coordinator, Office of the Vice Dean, Research and International Relations, Faculty of Medicine, University of Toronto.
Author contribution
James D. O’Leary, Mark W. Crawford, Eva Jurczyk, and Alison Buchan helped design the study, analyzed the data, and wrote the manuscript. James D. O’Leary conducted the study.
Ethics declarations
Conflict of interest
None.
Cite this article
O’Leary, J.D., Crawford, M.W., Jurczyk, E. et al. Benchmarking bibliometrics in biomedical research: research performance of the University of Toronto’s Faculty of Medicine, 2008–2012. Scientometrics 105, 311–321 (2015). https://doi.org/10.1007/s11192-015-1676-5