Scientometrics, Volume 102, Issue 3, pp 2059–2071

What is the best database for computer science journal articles?

Abstract

We compared general and specialized databases by searching for bibliographic information on journal articles in the computer science field and by evaluating their bibliographic coverage and the quality of the bibliographic records retrieved. We selected a sample of computer science articles from an Italian university repository (AIR) to carry out our comparison. The databases selected were INSPEC, Scopus, Web of Science (WoS), and DBLP. We found that DBLP and Scopus indexed the highest number of unique articles (4.14 and 4.05 % respectively), that each of the four databases indexed a set of unique articles, that 12.95 % of the sampled articles were not indexed in any of the selected databases, that Scopus was better than WoS at identifying computer science publications, and that DBLP indexed a greater share of unique articles (19.03 %) than INSPEC (11.28 %). We also measured the quality of a set of bibliographic records by comparing five databases: Scopus, WoS, INSPEC, DBLP, and Google Scholar (GS). We found that WoS, INSPEC, and Scopus provided better-quality indexing and more accurate, controlled, and granular bibliographic records than GS and DBLP. WoS and Scopus also provided more sophisticated tools for measuring trends in scholarly publications.
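To make the coverage figures concrete, the following is a minimal sketch (not the authors' actual procedure) of how coverage and unique-article percentages of this kind could be computed once each database's subset of the sampled articles is known. The identifiers and per-database sets below are hypothetical placeholders, not data from the study.

```python
# Minimal sketch: compute the share of sampled articles not indexed anywhere,
# plus each database's coverage and its share of "unique" articles
# (articles it indexes that no other database in the comparison indexes).

sample = {f"doi:{i}" for i in range(1, 21)}           # hypothetical sampled articles

indexed_by = {                                        # hypothetical coverage per database
    "DBLP":   {f"doi:{i}" for i in (1, 2, 3, 4, 5, 6, 7)},
    "Scopus": {f"doi:{i}" for i in (3, 4, 5, 6, 7, 8, 9)},
    "WoS":    {f"doi:{i}" for i in (4, 5, 6, 7, 8)},
    "INSPEC": {f"doi:{i}" for i in (5, 6, 7, 8, 9, 10)},
}

# Articles covered by at least one database, and the non-indexed remainder.
covered = set().union(*indexed_by.values())
not_indexed_pct = 100 * len(sample - covered) / len(sample)
print(f"Not indexed in any database: {not_indexed_pct:.2f} %")

for name, articles in indexed_by.items():
    # Union of what all the other databases index.
    others = set().union(*(a for n, a in indexed_by.items() if n != name))
    unique = articles - others
    print(f"{name}: coverage {100 * len(articles) / len(sample):.2f} %, "
          f"unique {100 * len(unique) / len(sample):.2f} %")
```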

Keywords

Web of Science, Scopus, DBLP, INSPEC, Google Scholar

Copyright information

© Akadémiai Kiadó, Budapest, Hungary 2014

Authors and Affiliations

Computer Science Department, University of Milan, Milan, Italy