Measuring the scientific impact of e-research infrastructures: a citation based approach?

Abstract

This micro-level study explores the extent to which citation analysis provides an accurate and representative assessment of the use and impact of a bioinformatics e-research infrastructure. The infrastructure studied offers common tools used by life scientists to analyse and interpret genetic and protein sequence information. These e-resources therefore provide an interesting example with which to explore how representative citations are as acknowledgements of knowledge use in the life sciences. The examples presented here suggest a relation between the number of visits to these databases and the number of citations; a parallel finding, however, shows that citation analysis frequently underestimates acknowledged use of the resources offered on this e-research infrastructure. The paper discusses the implications of these findings for various aspects of impact measurement and also considers how appropriate citation analysis is as a measure of knowledge claims.


Notes

  1.

    Since both databases are available on the market, the number of papers comparing them from a scientometric perspective has been growing (e.g. López-Illescas et al. 2008; Gorraiz and Schlögl 2007; Jacso 2006). Scopus covers over 19,500 titles from more than 5,000 publishers worldwide. It includes 18,500 peer-reviewed journals, over 4.9 million conference papers, 400 trade publications and 350 book series, and provides 100 % coverage of Medline. On May 1, 2012, it contained about 47 million records, 70 % with abstracts, of which 26 million records go back to 1996 (Scopus 2012). Thomson Reuters' Web of Science covers over 12,000 research journals worldwide and provides access to "the Science Citation Index (1900-present), Social Sciences Citation Index (1956-present), Arts and Humanities Citation Index (1975-present), Index Chemicus (1993-present), and Current Chemical Reactions (1986-present), plus archives 1840–1985 from INPI" (Reuters 2012).

  2.

    Reviews are indexed in addition to articles, and for this reason they were also included in our citation analysis.

  3.

    For these analyses we did not include Quicmod, Findpept, Findmod, PeptideMass, T-Coffee, Swiss-PdbViewer, RAxML and Prosite.

  4.

    Bibliometric researchers, for example, often do not acknowledge the Web of Science or Scopus by including a URL in their publications, let alone a formal citation to the articles in which these databases were first introduced. Of the 518 SD publications found through NEXTBIO to mention use of the Scopus database in their full text, only 12 included the URL (though in some articles the URL may have appeared in the reference list, which was not analysed through NEXTBIO).

References

  1. Appel, R. D., Bairoch, A., & Hochstrasser, D. F. (1994). A new generation of information retrieval tools for biologists: the example of the ExPASy WWW server. Trends in Biochemical Sciences, 19(6), 258–260.

  2. Bairoch, A. (2000). The ENZYME database in 2000. Nucleic Acids Research, 28, 304–305.


  3. Baldi, S. (1998). Normative vs. social constructivist processes in the allocation of citations: A network-analytic model. American Sociological Review, 63(6), 829–846.


  4. Ball, R., & Tunger, D. (2006). Science indicators revisited – Science Citation Index versus Scopus: A bibliometric comparison of both citation databases. Information Services and Use, 26, 293–301.


  5. Bornmann, L., & Daniel, H. D. (2008). What do citation counts measure? A review of studies on citing behavior. Journal of Documentation, 64(1), 45–80.


  6. Cozzens, S. E. (1989). What do citations count? The rhetoric-first model. Scientometrics, 15(5–6), 437–447.


  7. De Jong, S., van Arensbergen, P., Daemen, F., van der Meulen, B., & van den Besselaar, P. (2011). Evaluating research in its context: An approach and two cases. Research Evaluation, 20, 61–72.


  8. Duin, D., King, D., & van den Besselaar, P. (2012). Identifying audiences of e-infrastructures: Tools for measuring impact. PLoS One, 7(12), e50943. doi:10.1371/journal.pone.0050943.


  9. ExPASy. (2012). http://www.expasy.org/proteomics. Accessed June 2012.

  10. Fingerman, S. (2006). Web of science and scopus: Current features and capabilities. Issues in Science and Technology Librarianship. doi:10.5062/F4G44N7B.

  11. Garfield, E. (1975). The Obliteration Phenomenon. Current Contents, 51(52), 5–7.


  12. Garfield, E. (1998). Random thoughts on citationology, its theory and practice. Scientometrics, 43(1), 69–76.


  13. Gilbert, N. (1977). Referencing as Persuasion. Social Studies of Science, 7(1), 113–122.


  14. Gorraiz, J., & Schlögl, C. (2007). Comparison of two counting houses in the field of pharmacology and pharmacy. In Proceedings of the international conference of the international society for scientometrics and informetrics, 1 (pp. 854–855).

  15. Hoogland, C., Mostaguir, K., Sanchez, J. C., Hochstrasser, D. F., & Appel, R. D. (2004). SWISS-2DPAGE, ten years later. Proteomics, 4(8), 2352–2356.


  16. Jacso, P. (2006). Evaluation of citation enhanced scholarly databases. Journal of Information Processing and Management, 48(12), 763–774.


  17. Jonkers, K., De Moya Anegon, F., & Aguillo, F. (2012). Measuring the use of research infrastructures as an indicator of research activity. Journal of the American Society for Information Science and Technology, 63(7), 1374–1382.


  18. Jonkers, K., Derrick, G. E., Lopez-Illescas, C., & Van den Besselaar, P. (2013). Are citations a complete measure for the usage of e-research infrastructures. In J. Gorraiz, E. Schiebel, et al. (Eds.), Proceedings ISSI 2013 (pp. 136–151). Vienna.

  19. Kaplan, N. (1965). The norms of citation behavior: Prolegomena to the footnote. American Documentation, 16(3), 179–184.


  20. Lima, T., Auchincloss, A. H., Coudert, E., Keller, G., Michoud, K., Rivoire, C., et al. (2009). HAMAP: a database of completely sequenced microbial proteome sets and manually curated microbial protein families in UniProtKB/Swiss-Prot. Nucleic Acids Research, 37(1), D471–D478. doi:10.1093/nar/gkn661.


  21. Lokker, C., Haynes, R. B., Chu, R., McKibbon, K. A., Wilczynski, N. L., & Walter, S. D. (2012). How well are journal and clinical article characteristics associated with the journal impact factor? A retrospective cohort study. Journal of the Medical Library Association, 100(1), 28–33.


  22. López-Illescas, C., Moya-Anegón, F., & Moed, H. F. (2008). Coverage and citation impact of oncological journals in the Web of Science and Scopus. Journal of Informetrics, 2(4), 304–316.


  23. Lowry, O. H., Rosebrough, N. J., Farr, A. L., & Randall, R. J. (1951). Protein measurement with the folin phenol reagent. Journal of Biological Chemistry, 193, 265–275.


  24. Martin, B. R., & Irvine, J. (1983). Assessing basic research: Some partial indicators of scientific progress in radio astronomy. Research Policy, 12(2), 61–90.


  25. Merton, R. K. (1965). On the shoulders of giants: a Shandean postscript (pp. 218–219). New York: Harcourt Brace and World.

  26. Merton, R. K. (1995). The Thomas Theorem and the Matthew effect. Social Forces, 74(2), 379–424.


  27. Moed, H. F., De Bruin, R. E., & Van Leeuwen, T. N. (1995). New bibliometric tools for the assessment of national research performance: Database description, overview of indicators and first applications. Scientometrics, 33(3), 381–422.


  28. Moya-Anegon, F., Chinchilla-Rodríguez, Z., Vargas-Quesada, B., Corera-Álvarez, E., Muñoz-Fernández, F. J., & González-Molina, A. (2007). Coverage analysis of Scopus: A journal metric approach. Scientometrics, 73(1), 53–78.


  29. NEXTBIO. (2012). Section search. http://www.applications.sciverse.com/action/appDetail/293416. Accessed June/October 2012.

  30. Peritz, B. C. (1983). Are methodological papers more cited than theoretical or empirical ones? The case of sociology. Scientometrics, 5(4), 211–218.

  31. Quest. (2010). Funnel Web Analyzer®—overview. http://www.quest.com/funnel-web-analyzer/index.asp. Accessed June 2012.

  32. Rehn, C., & Kronman, U. (2008). Bibliometric handbook for Karolinska Institutet V1.05 http://ki.se/content/1/c6/01/79/31/bibliometric_handbook_karolinska_institutet_v_1.05.pdf. Accessed June 2012.

  33. Reuters, T. (2012). http://thomsonreuters.com. Accessed June/October 2012.

  34. Scopus. (2012). http://www.scopus.com. Accessed June/October 2012.

  35. Senker, J. (1995). Tacit knowledge and Models of Innovation. Industrial and Corporate Change, 4, 425–447.


  36. Sigrist, C. J. A., de Castro, E., Cerutti, L., Cuche, B. A., Hulo, N., Bridge, A., et al. (2012). New and continuing developments at PROSITE. Nucleic Acids Research. doi:10.1093/nar/gks1067.

  37. Weingart, P. (2005). Impact of bibliometrics upon the science system: Inadvertent consequences? Scientometrics, 62(1), 117–131.



Acknowledgments

A shorter version of this paper was presented at the ISSI 2013 conference in Vienna (Jonkers et al. 2013) and at the IWBBIO 2013 conference in Granada. The Spanish Ministry of Economics and Competitiveness funded the project of which this paper forms part through the grant CSO2011-23508. The first three researchers also received funding from the Ramón y Cajal programme (MINECO), the JAE-DOC programme (CSIC), and the Juan de la Cierva programme (MINECO) of the Spanish Ministry of Economics and Competitiveness and the Spanish Research Council (CSIC). The last author acknowledges the EC funded ViBRANT project (Grant RI-261532). The SIB Swiss Institute of Bioinformatics allowed for the use of the server web log data used for part of this analysis. We would also like to thank Felix de Moya Anegón for introducing us to the NEXTBIO application "section search" and Isidro Aguillo for advice on the use of Quest's Funnel Web software. Researchers at the Centre for Science and Technology Studies of Leiden University (NL) provided stimulating ideas in discussions during a research stay of one of the authors. The usual disclaimer applies with respect to those contributions.

Disclaimer

The first author worked on this article at the CSIC Institute for Public Goods and Policies, but has since taken up work at the European Commission. The information and views set out in this article do not necessarily reflect the opinion of the first author's current employer. The European Commission does not guarantee the accuracy of the data included in this study. Neither his current employer nor any person acting on its behalf may be held responsible for the use which may be made of the information contained herein.

Conflict of interest

None.

Author information


Corresponding author

Correspondence to K. Jonkers.


About this article


Cite this article

Jonkers, K., Derrick, G. E., Lopez-Illescas, C. et al. Measuring the scientific impact of e-research infrastructures: a citation based approach? Scientometrics 101, 1179–1194 (2014). https://doi.org/10.1007/s11192-014-1411-7


Keywords

  • Citation analysis
  • Research infrastructure
  • Evaluation
  • Bioinformatics