Scientometrics, Volume 101, Issue 2, pp 1179–1194

Measuring the scientific impact of e-research infrastructures: a citation based approach?

  • K. Jonkers
  • G. E. Derrick
  • C. Lopez-Illescas
  • P. Van den Besselaar


This micro-level study explores the extent to which citation analysis provides an accurate and representative assessment of the use and impact of a bioinformatics e-research infrastructure. The infrastructure studied offers common tools used by life scientists to analyse and interpret genetic and protein sequence information. These e-resources therefore provide an interesting case with which to explore how representative citations are as acknowledgements of knowledge use in the life sciences. The examples presented here suggest that there is a relation between the number of visits to these databases and the number of citations; a parallel finding, however, shows that citation analysis frequently underestimates the acknowledged use of the resources offered on this e-research infrastructure. The paper discusses the implications of these findings for various aspects of impact measurement and also considers how appropriate citation analysis is as a measure of knowledge claims.


Citation analysis · Research infrastructure · Evaluation · Bioinformatics



A shorter version of this paper was presented at the ISSI 2013 conference in Vienna (Jonkers et al. 2013) and at the IWBBIO 2013 conference in Granada. The Spanish Ministry of Economics and Competitiveness funded the project of which this paper forms part through grant CSO2011-23508. The first three researchers also received funding from the Ramón y Cajal programme (MINECO), the JAE-DOC programme (CSIC) and the Juan de la Cierva programme (MINECO) of the Spanish Ministry of Economics and Competitiveness and the Spanish Research Council (CSIC). The last author acknowledges the EC-funded ViBRANT project (Grant RI-261532). The SIB Swiss Institute of Bioinformatics allowed the use of the server web log data employed for part of this analysis. We would also like to thank Félix de Moya Anegón for introducing us to the NEXTBIO application “section search” and Isidro Aguillo for advice on the use of Quest’s Funnel Web software. Researchers at the Centre for Science and Technology Studies of Leiden University (NL) provided stimulating ideas in discussions during a research stay of one of the authors. The usual disclaimer applies with respect to those contributions.


The first author worked on this article at the CSIC Institute for Public Goods and Policies, but has since taken up work at the European Commission. The information and views set out in this article do not necessarily reflect the opinion of the first author’s current employer. The European Commission does not guarantee the accuracy of the data included in this study. Neither his current employer nor any person acting on its behalf may be held responsible for the use which may be made of the information contained herein.

Conflict of interest


References

  1. Appel, R. D., Bairoch, A., & Hochstrasser, D. F. (1994). A new generation of information retrieval tools for biologists: The example of the ExPASy WWW server. Trends in Biochemical Sciences, 19(6), 258–260.
  2. Bairoch, A. (2000). The ENZYME database in 2000. Nucleic Acids Research, 28, 304–305.
  3. Baldi, S. (1998). Normative vs. social constructivist processes in the allocation of citations: A network-analytic model. American Sociological Review, 63(6), 829–846.
  4. Ball, R., & Tunger, D. (2006). Science indicators revisited – Science Citation Index versus Scopus: A bibliometric comparison of both citation databases. Information Services and Use, 26, 293–301.
  5. Bornmann, L., & Daniel, H. D. (2008). What do citation counts measure? A review of studies on citing behavior. Journal of Documentation, 64(1), 45–80.
  6. Cozzens, S. E. (1989). What do citations count? The rhetoric-first model. Scientometrics, 15(5–6), 437–447.
  7. De Jong, S., van Arensbergen, P., Daemen, F., van der Meulen, B., & van den Besselaar, P. (2011). Evaluating research in its context: An approach and two cases. Research Evaluation, 20, 61–72.
  8. Duin, D., King, D., & van den Besselaar, P. (2012). Identifying audiences of e-infrastructures: Tools for measuring impact. PLoS One, 7(12), e50943. doi: 10.1371/journal.pone.0050943.
  9. ExPASy. (2012). Accessed June 2012.
  10. Fingerman, S. (2006). Web of Science and Scopus: Current features and capabilities. Issues in Science and Technology Librarianship. doi: 10.5062/F4G44N7B.
  11. Garfield, E. (1975). The obliteration phenomenon. Current Contents, 51(52), 5–7.
  12. Garfield, E. (1998). Random thoughts on citationology, its theory and practice. Scientometrics, 43(1), 69–76.
  13. Gilbert, N. (1977). Referencing as persuasion. Social Studies of Science, 7(1), 113–122.
  14. Gorraiz, J., & Schlögl, C. (2007). Comparison of two counting houses in the field of pharmacology and pharmacy. In Proceedings of the international conference of the international society for scientometrics and informetrics, 1 (pp. 854–855).
  15. Hoogland, C., Mostaguir, K., Sanchez, J. C., Hochstrasser, D. F., & Appel, R. D. (2004). SWISS-2DPAGE, ten years later. Proteomics, 4(8), 2352–2356.
  16. Jacso, P. (2006). Evaluation of citation enhanced scholarly databases. Journal of Information Processing and Management, 48(12), 763–774.
  17. Jonkers, K., De Moya Anegon, F., & Aguillo, F. (2012). Measuring the use of research infrastructures as an indicator of research activity. Journal of the American Society for Information Science and Technology, 63(7), 1374–1382.
  18. Jonkers, K., Derrick, G. E., Lopez-Illescas, C., & Van den Besselaar, P. (2013). Are citations a complete measure for the usage of e-research infrastructures? In J. Gorraiz, E. Schiebel, et al. (Eds.), Proceedings of ISSI 2013 (pp. 136–151). Vienna.
  19. Kaplan, N. (1965). The norms of citation behavior: Prolegomena to the footnote. American Documentation, 16(3), 179–184.
  20. Lima, T., Auchincloss, A. H., Coudert, E., Keller, G., Michoud, K., Rivoire, C., et al. (2009). HAMAP: A database of completely sequenced microbial proteome sets and manually curated microbial protein families in UniProtKB/Swiss-Prot. Nucleic Acids Research, 37(1), D471–D478. doi: 10.1093/nar/gkn661.
  21. Lokker, C., Haynes, R. B., Chu, R., McKibbon, K. A., Wilczynski, N. L., & Walter, S. D. (2012). How well are journal and clinical article characteristics associated with the journal impact factor? A retrospective cohort study. Journal of the Medical Library Association, 100(1), 28–33.
  22. López-Illescas, C., Moya-Anegón, F., & Moed, H. F. (2008). Coverage and citation impact of oncological journals in the Web of Science and Scopus. Journal of Informetrics, 2(4), 304–316.
  23. Lowry, O. H., Rosebrough, N. J., Farr, A. L., & Randall, R. J. (1951). Protein measurement with the Folin phenol reagent. Journal of Biological Chemistry, 193, 265–275.
  24. Martin, B. R., & Irvine, J. (1983). Assessing basic research: Some partial indicators of scientific progress in radio astronomy. Research Policy, 12(2), 61–90.
  25. Merton, R. K. (1965). On the shoulders of giants: A Shandean postscript (pp. 218–219). New York: Harcourt Brace and World.
  26. Merton, R. K. (1995). The Thomas Theorem and the Matthew effect. Social Forces, 74(2), 379–424.
  27. Moed, H. F., De Bruin, R. E., & Van Leeuwen, T. N. (1995). New bibliometric tools for the assessment of national research performance: Database description, overview of indicators and first applications. Scientometrics, 33(3), 381–422.
  28. Moya-Anegón, F., Chinchilla-Rodríguez, Z., Vargas-Quesada, B., Corera-Álvarez, E., Muñoz-Fernández, F. J., & González-Molina, A. (2007). Coverage analysis of Scopus: A journal metric approach. Scientometrics, 73(1), 53–78.
  29. NEXTBIO. (2012). Section search. Accessed June/October 2012.
  30. Peritz, B. C. (1983). Are methodological papers more cited than theoretical or empirical ones? The case of sociology. Scientometrics, 5(4), 211–218.
  31. Quest. (2010). Funnel Web Analyzer® – overview. Accessed June 2012.
  32. Rehn, C., & Kronman, U. (2008). Bibliometric handbook for Karolinska Institutet V1.05. Accessed June 2012.
  33. Thomson Reuters. (2012). Accessed June/October 2012.
  34. Scopus. (2012). Accessed June/October 2012.
  35. Senker, J. (1995). Tacit knowledge and models of innovation. Industrial and Corporate Change, 4, 425–447.
  36. Sigrist, C. J. A., de Castro, E., Cerutti, L., Cuche, B. A., Hulo, N., Bridge, A., et al. (2012). New and continuing developments at PROSITE. Nucleic Acids Research. doi: 10.1093/nar/gks1067.
  37. Weingart, P. (2005). Impact of bibliometrics upon the science system: Inadvertent consequences? Scientometrics, 62(1), 117–131.

Copyright information

© Akadémiai Kiadó, Budapest, Hungary 2014

Authors and Affiliations

  • K. Jonkers (1)
  • G. E. Derrick (1, 2)
  • C. Lopez-Illescas (3, 4)
  • P. Van den Besselaar (4)

  1. Department of Science and Innovation Dynamics, CSIC Institute for Public Goods and Policies, Madrid, Spain
  2. Health Economics Research Group (HERG), Brunel University, London, UK
  3. SCImago Group, Department of Information Science, University of Extremadura, Badajoz, Spain
  4. Department of Organization Science and Network Institute, Vrije Universiteit Amsterdam, Amsterdam, The Netherlands
