Experimental Economics, Volume 17, Issue 4, pp 512–536

A day without a search engine: an experimental study of online and offline searches

  • Yan Chen
  • Grace YoungJoo Jeon
  • Yong-Mi Kim
Original Paper

Abstract


With the evolution of the Web and development of web-based search engines, online searching has become a common method for obtaining information. Given this popularity, the question arises as to how much time people save by using search engines for their information needs compared to offline sources, as well as how online searching affects both search experiences and search outcomes. Using a random sample of queries from a major search engine and a sample of reference questions from the Internet Public Library (IPL), we conduct a real-effort experiment to compare online and offline search experiences and outcomes. We find that participants are significantly more likely to find an answer on the Web (100 %), compared to offline searching (between 87 % and 90 %). Restricting our analysis to the set of questions in which participants find answers in both treatments, a Web search takes on average 7 (9) minutes, whereas the corresponding offline search takes 22 (19) minutes for a search-engine (IPL) question. Furthermore, while raters judge library sources to be significantly more trustworthy and authoritative than the corresponding Web sources, they judge Web sources as significantly more relevant. Balancing all factors, we find that the overall source quality is not significantly different between the two treatments for the set of search-engine questions. However, for IPL questions, we find that non-Web sources are judged to have significantly higher overall quality than the corresponding Web sources. In comparison, for factual questions, Web search results are significantly more likely to be correct (66 % vs. 43 %). Lastly, post-search questionnaires reveal that participants find online searching more enjoyable than offline searching.


Keywords

Search · Productivity · Experiment

JEL Classification

C93 · H41



Acknowledgments

We would like to thank David Campbell, Jacob Goeree, Nancy Kotzian, Jeffrey MacKie-Mason, Karen Markey, Betsy Masiello, Soo Young Rieh and Hal Varian for helpful comments and discussions, Donna Hayward and her colleagues at the University of Michigan Hatcher Graduate Library for facilitating the non-Web treatment, and Ashlee Stratakis and Dan Stuart for excellent research assistance. Two anonymous referees provided insightful comments which significantly improved the paper. Financial support from the National Science Foundation through grants No. SES-0079001 and SES-0962492, and a Google Research Award is gratefully acknowledged.

Supplementary material

10683_2013_9381_MOESM1_ESM.pdf (PDF, 375 kB)



Copyright information

© Economic Science Association 2013

Authors and Affiliations

School of Information, University of Michigan, Ann Arbor, USA
