Volume 94, Issue 3, pp 1057–1075

A preliminary test of Google Scholar as a source for citation data: a longitudinal study of Nobel prize winners

  • Anne-Wil Harzing


Most governmental research assessment exercises do not use citation data for the Social Sciences and Humanities, as Web of Science and Scopus coverage in these disciplines is considered insufficient. We therefore assess to what extent Google Scholar can be used as an alternative source of citation data. To provide a credible alternative, Google Scholar needs to be stable over time, display comprehensive coverage, and provide unbiased comparisons across disciplines. This article assesses these conditions through a longitudinal study of 20 Nobel Prize winners in Chemistry, Economics, Medicine and Physics. Our results indicate that Google Scholar displays considerable stability over time, while coverage of disciplines that have traditionally been poorly represented in Google Scholar (Chemistry and Physics) is increasing rapidly. Google Scholar's coverage is also comprehensive: all of the 800 most cited publications by our Nobelists can be located in Google Scholar, although in four cases there are some problems with the results. Finally, we argue that Google Scholar might provide a less biased comparison across disciplines than the Web of Science. The use of Google Scholar might therefore redress the traditionally disadvantaged position of the Social Sciences in citation analysis.


Keywords: Google Scholar · Web of Science · Social Sciences · Citation analysis



Copyright information

© Akadémiai Kiadó, Budapest, Hungary 2012

Authors and Affiliations

  1. Department of Management and Marketing, University of Melbourne, Victoria, Australia
