Volume 106, Issue 3, pp 1239–1264

Benchmarking scientific performance by decomposing leadership of Cuban and Latin American institutions in Public Health

  • Zaida Chinchilla-Rodríguez
  • Grisel Zacca-González
  • Benjamín Vargas-Quesada
  • Félix de Moya-Anegón


Comparative benchmarking with bibliometric indicators can aid decision-making in research management. This study characterizes scientific performance in a single domain (Public Health) by the institutions of a single country (Cuba), taking world output and regional output (other Latin American centers) as references for the period 2003–2012. A new approach is used to assess the extent to which the leadership of a specific institution can change its citation impact. Cuba was found to have a high level of specialization and scientific leadership that does not match the low international visibility of Cuban institutions. This leading output appears mainly in non-collaborative papers published in national journals; publication in English is very scarce, and the rate of international collaboration is very low. The Instituto de Medicina Tropical Pedro Kouri stands out alone as a national reference. At the regional level, the Latin American institutions deserving mention for their high autonomy in normalized citation include Universidad de Buenos Aires (ARG), Universidade Federal de Pelotas (BRA), Consejo Nacional de Investigaciones Científicas y Técnicas (ARG), Instituto Oswaldo Cruz (BRA) and the Centro de Pesquisas Rene Rachou (BRA). We identify a crucial aspect that can give rise to misinterpretations of the data: a high share of leadership cannot be considered positive for an institution when it is mainly associated with a high proportion of non-collaborative papers and a very low level of performance. Because leadership might be questionable in some cases, we propose future studies to ensure a better interpretation of findings.


Public Health · Latin America · Cuba · Scientific collaboration · Normalized citation · Leadership

Mathematics Subject Classification

94 Information and communication, circuits

JEL Classification

D8 Information, Knowledge, and Uncertainty



This work was made possible through financing from the scholarship funds for international mobility between Andalusian and Ibero-American universities and from the SCImago Group. The authors appreciate the comments of the reviewers, which improved the quality and clarity of the manuscript, as well as the support of Jean Sanders in translating and editing the manuscript.



Copyright information

© Akadémiai Kiadó, Budapest, Hungary 2016

Authors and Affiliations

  • Zaida Chinchilla-Rodríguez (1, 4)
  • Grisel Zacca-González (2)
  • Benjamín Vargas-Quesada (3, 4)
  • Félix de Moya-Anegón (1, 4)

  1. CSIC, Institute of Public Goods and Policies, Madrid, Spain
  2. Department of Teaching and Research, National Medical Sciences Information Centre-Infomed, Havana, Cuba
  3. Department of Information and Communication, University of Granada, Granada, Spain
  4. SCImago Research Group, Madrid, Spain
