Volume 97, Issue 2, pp 421–434

The research guarantors of scientific papers and the output counting: a promising new approach

  • Félix Moya-Anegón (corresponding author)
  • Vicente P. Guerrero-Bote
  • Lutz Bornmann
  • Henk F. Moed


We propose a method for selecting the research guarantor of co-authored papers, based simply on identifying the corresponding author. The method is applied here to global scientific output recorded in the Scopus database in order to build a new output distribution by country. This new distribution is then compared with previous country-level distributions based on whole or fractional counting, both for the total output and for the excellence output (papers belonging to the 10% most cited). The comparison allows one to examine the effect of the different methodological approaches on the scientific performance indicators assigned to countries. In some cases, there is a very large variation in scientific performance between the total output (whole counting) and the output as research guarantor. The research guarantor approach is especially interesting when applied to the excellence output, where the quantity of excellent papers is itself a quality indicator. The impact of excellent papers naturally shows less variability, since they are all top-cited papers.
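The three counting schemes compared above can be sketched in a few lines. The sketch below uses invented paper data (country codes and the convention that the corresponding author is listed first are assumptions for illustration, not taken from the article's dataset):

```python
from collections import defaultdict

# Hypothetical papers: each is the list of author countries, with the
# corresponding (guarantor) author's country listed first by convention.
papers = [
    ["ES", "DE", "NL"],  # guarantor affiliated in ES
    ["DE", "ES"],        # guarantor affiliated in DE
    ["ES"],              # single-author paper
]

whole = defaultdict(float)       # each country counts 1 per paper it appears in
fractional = defaultdict(float)  # each paper's credit of 1 is split among authors
guarantor = defaultdict(float)   # full credit to the guarantor's country only

for countries in papers:
    for c in set(countries):     # a country counts once per paper, not per author
        whole[c] += 1
    for c in countries:
        fractional[c] += 1 / len(countries)
    guarantor[countries[0]] += 1

print(sorted(whole.items()))      # [('DE', 2.0), ('ES', 3.0), ('NL', 1.0)]
print(sorted(guarantor.items()))  # [('DE', 1.0), ('ES', 2.0)]
```

Note that the three schemes sum differently: whole counting inflates the world total when collaboration rises, while fractional and guarantor counting both keep the total equal to the number of papers, which is why rank shifts between them isolate the effect of the counting method itself.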


Keywords: Corresponding author; Scientific excellence; Research guarantor; Output counting



Copyright information

© Akadémiai Kiadó, Budapest, Hungary 2013

Authors and Affiliations

  • Félix Moya-Anegón (1), corresponding author
  • Vicente P. Guerrero-Bote (2)
  • Lutz Bornmann (3)
  • Henk F. Moed (4)

  1. Scimago Group, CSIC, CCHS, IPP, Madrid, Spain
  2. Scimago Group, Department of Information and Communication, University of Extremadura, Badajoz, Spain
  3. Division for Science and Innovation Studies, Administrative Headquarters of the Max Planck Society, Munich, Germany
  4. Elsevier, Amsterdam, The Netherlands
