Evaluating the research performance of the Greek medical schools using bibliometrics
Quality in Higher Education Institutions is the subject of ongoing debate in the academic community worldwide, and various efforts have been made to quantify it. In this respect, bibliometrics is gaining significant ground as an effective tool for evaluating universities’ research output. In the present study, the research performance of the seven Greek medical schools is assessed by means of widely accepted and advanced bibliometric indices, such as total and average publications and citations, and average and median h- and g-index with and without self-citations, for all 1,803 academics; statistical analysis of the data was also performed to compare the observed differences in the mean values of the calculated indices. Considerable effort was made to overcome the inherent limitations of a bibliometric analysis through meticulous data collection. This large-scale work was conducted at both school and academic-rank level, yielding results on the scientific activity of the medical schools studied as units and of the various academic ranks separately, which can be partially explained by geographic and socioeconomic factors. Overall, the bibliometric indicators show a statistically significant difference in favour of the University of Crete medical school, while self-citations were found to have only a marginal effect on individual research profiles and on the average indices. Finally, the findings of the present study suggest that the methodology adopted is highly viable for assessing the research performance of Higher Education Institutions, even in a broader context.
Keywords: Research evaluation · Bibliometrics · h-index · g-index · Medical school · Self-citations · ANOVA
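To make the two central indices concrete, the following is a minimal sketch (not code from the study) of how the h-index and g-index are computed from a researcher's per-paper citation counts: the h-index is the largest h such that h papers each have at least h citations, and the g-index is the largest g such that the top g papers together have at least g² citations.

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    cites = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(cites, start=1):
        if c >= rank:
            h = rank      # the top `rank` papers all have >= `rank` citations
        else:
            break
    return h

def g_index(citations):
    """Largest g such that the top g papers have at least g**2 citations in total."""
    cites = sorted(citations, reverse=True)
    total, g = 0, 0
    for rank, c in enumerate(cites, start=1):
        total += c
        if total >= rank * rank:
            g = rank      # cumulative citations still cover rank**2
    return g

# A researcher with papers cited [10, 8, 5, 4, 3] times:
print(h_index([10, 8, 5, 4, 3]))  # -> 4
print(g_index([10, 8, 5, 4, 3]))  # -> 5
```

The g-index is always at least as large as the h-index, since it credits highly cited papers beyond the h-threshold; the "without self-citations" variants in the study correspond to running the same computation on citation counts with self-citations subtracted.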