Volume 119, Issue 2, pp 1207–1225

A new approach to the analysis and evaluation of the research output of countries and institutions

  • Domingo Docampo
  • Jean-Jacques Bessoule


Abstract

A plethora of bibliometric indicators is available nowadays to gauge research performance. The spectrum of bibliometric measures is very broad, ranging from purely size-dependent indicators (e.g. raw counts of scientific contributions and/or citations) to size-independent measures (e.g. citations per paper, or publications or citations per researcher), through a number of indicators that effectively combine quantitative and qualitative features (e.g. the h-index). In this paper we present a straightforward procedure for evaluating the scientific contribution of territories and institutions that combines size-dependent and scale-free measures, and we apply it to the scientific production of 189 countries in the period 2006–2015. Our approach enables effective global and field-related comparative analyses of the scientific production of countries and academic/research institutions. Furthermore, the procedure helps to identify the strengths and weaknesses of a given country or institution by tracking variations of performance ratios across research fields. Moreover, using a straightforward wealth index, we show that research performance measures are strongly associated with the wealth of countries and territories. Given the simplicity of the methods introduced in this paper and the fact that their results are easily understandable by non-specialists, we believe they could become a useful tool for the assessment of the research output of countries and institutions.
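To illustrate the indicator families contrasted above (this is a generic sketch, not the combined procedure developed in the article itself): a size-dependent measure such as the raw citation count grows with output volume, a size-independent measure such as citations per paper does not, and the h-index mixes both. A minimal Python computation over a hypothetical citation list:

```python
def h_index(citations):
    """Hirsch's h-index: the largest h such that h papers
    each have at least h citations."""
    h = 0
    for rank, cites in enumerate(sorted(citations, reverse=True), start=1):
        if cites >= rank:
            h = rank  # this paper still clears the threshold
        else:
            break
    return h

# Hypothetical citation counts for one unit's papers
citations = [25, 8, 5, 3, 3, 1, 0]

total = sum(citations)               # size-dependent: raw citation count
per_paper = total / len(citations)   # size-independent: citations per paper
h = h_index(citations)               # combines quantity and impact

print(total, round(per_paper, 2), h)  # prints: 45 6.43 3
```

Note how adding many uncited papers would leave `total` and `h` unchanged or growing while dragging `per_paper` down, which is precisely why a procedure combining both families is informative.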


Keywords

Research output · Bibliometric indicators · Countries · Institutions · Publications · Citations



Acknowledgements

We thank Dominique Dunon-Bluteau and Daniel Egret for initiating the human link between the authors. We are grateful to Paul Gouguet, Amélie Bernard and Pierre Madre for critical reading of the manuscript. The work of D. Docampo was supported by the European Regional Development Fund (ERDF) and the Galician Regional Government under an agreement for funding the Atlantic Research Center for Information and Communication Technologies (AtlantTIC).



Copyright information

© Akadémiai Kiadó, Budapest, Hungary 2019

Authors and Affiliations

  1. atlanTTic Research Center for Communications Technologies, University of Vigo, Vigo, Spain
  2. Laboratoire de Biogenèse Membranaire, UMR 5200, CNRS – Univ. Bordeaux, Villenave d’Ornon, France
