Volume 98, Issue 2, pp 797–806

Substance without citation: evaluating the online impact of grey literature

  • David Wilkinson
  • Pardeep Sud
  • Mike Thelwall


Individuals and organisations that produce information or knowledge for others sometimes need to provide evidence of the value of their work, in the same way that scientists may use journal impact factors and citations to indicate the value of their papers. In many cases, however, organisations are charged with producing reports but have no practical way of measuring their impact: the reports may be distributed free of charge, attract no academic citations, and have no trackable sales. Here, the web impact report (WIRe) is proposed as a novel solution to this problem. A WIRe consists of a range of web-derived statistics about the frequency and geographic location of online mentions of an organisation’s reports, typically gathered from commercial search engines. This article defines the component parts of a WIRe and describes how to collect and analyse the necessary data. The process is illustrated with a comparison of the web impact of the reports of a large UK organisation. Although no formal evaluation was conducted, the results suggest that WIRes can distinguish different levels of web impact between reports and can reveal the type of online impact that the reports have.
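The geographic component of a WIRe described above can be approximated by grouping the URLs of mentioning pages by their top-level domain. The following is a minimal illustrative sketch, not the authors' actual method: the function name and the sample URLs are hypothetical, and in practice the URL list would come from commercial search-engine queries for a report's title.

```python
from collections import Counter
from urllib.parse import urlparse

def tld_breakdown(result_urls):
    """Tally mentioning sites by top-level domain (e.g. uk, org, com)
    as a rough indicator of geographic or sector spread."""
    counts = Counter()
    for url in result_urls:
        host = urlparse(url).netloc.lower()
        if host:
            counts[host.rsplit(".", 1)[-1]] += 1
    return counts

# Hypothetical search-engine results for one report title
urls = [
    "http://example.ac.uk/library/report-review",
    "http://blog.example.org/post/123",
    "http://news.example.com/story",
]
print(tld_breakdown(urls))
```

A fuller analysis would map country-code domains to countries and treat generic domains (com, org, net) separately, since they carry no geographic signal.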


Keywords: Grey literature · Web impact · Webometrics



This research is part of the FP7 EU-funded project ACUMEN on assessing web indicators in research evaluation.



Copyright information

© Akadémiai Kiadó, Budapest, Hungary 2013

Authors and Affiliations

  1. Statistical Cybermetrics Research Group, School of Technology, University of Wolverhampton, Wolverhampton, UK
