WHOSE – A Tool for Whole-Session Analysis in IIR

  • Conference paper
Advances in Information Retrieval (ECIR 2015)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 9022)

Abstract

One of the main challenges in Interactive Information Retrieval (IIR) evaluation is the development and application of reusable tools that allow researchers to analyze the search behavior of real users in different environments and domains, with comparable results. Furthermore, IIR research has recently shifted towards the analysis of whole sessions, which includes all user interactions carried out within a single session, but also across several sessions by the same user. Some frameworks have been proposed for the evaluation of controlled IIR experiments, but no framework is yet available for the interactive evaluation of search behavior from real-world information retrieval (IR) systems with real users. In this paper we present a framework for whole-session evaluation that can also utilize such uncontrolled data sets. Its logging component can easily be integrated into real-world IR systems to generate and analyze new log data, and a supplementary mapping makes it possible to analyze existing log data as well. Different actions and filters can be defined for every IR system, which allows system operators and researchers to use the framework to analyze the user search behavior in their IR systems and to compare it with others. Through a graphical user interface they can interactively explore a data set, from a broad overview down to individual sessions.
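The WHOSE implementation itself is not shown on this page. As a rough illustration of the two ideas the abstract mentions — mapping system-specific log events onto a shared action vocabulary, and segmenting a user's event stream into sessions — the following is a minimal sketch. All names (the event types, the `ACTION_MAP` vocabulary) and the 30-minute inactivity timeout are assumptions for illustration, not details taken from the paper.

```python
# Hypothetical sketch: normalize raw log events into shared actions and
# group them into sessions by inactivity gap. Not the WHOSE implementation.
from dataclasses import dataclass

# System-specific event names mapped onto a shared action vocabulary,
# so logs from different IR systems become comparable.
ACTION_MAP = {
    "search_submitted": "QUERY",
    "doc_clicked": "VIEW_DOCUMENT",
    "facet_selected": "FILTER",
}

@dataclass
class Event:
    user: str
    timestamp: float  # seconds since epoch
    raw_type: str

def to_sessions(events, timeout=30 * 60):
    """Group one user's chronologically sorted events into sessions,
    starting a new session after `timeout` seconds of inactivity."""
    sessions, current, last_ts = [], [], None
    for ev in events:
        action = ACTION_MAP.get(ev.raw_type)
        if action is None:  # events with no mapping are skipped
            continue
        if last_ts is not None and ev.timestamp - last_ts > timeout:
            sessions.append(current)
            current = []
        current.append((ev.timestamp, action))
        last_ts = ev.timestamp
    if current:
        sessions.append(current)
    return sessions

log = [
    Event("u1", 0, "search_submitted"),
    Event("u1", 40, "doc_clicked"),
    Event("u1", 40 + 2 * 60 * 60, "search_submitted"),  # two hours later
]
sessions = to_sessions(log)
print(len(sessions))  # the two-hour gap starts a second session
```

Defining the action vocabulary per system, as in `ACTION_MAP` above, is what would let one analysis front end compare behavior across otherwise incompatible IR system logs.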


Copyright information

© 2015 Springer International Publishing Switzerland

About this paper

Cite this paper

Hienert, D., van Hoek, W., Weber, A., Kern, D. (2015). WHOSE – A Tool for Whole-Session Analysis in IIR. In: Hanbury, A., Kazai, G., Rauber, A., Fuhr, N. (eds) Advances in Information Retrieval. ECIR 2015. Lecture Notes in Computer Science, vol 9022. Springer, Cham. https://doi.org/10.1007/978-3-319-16354-3_18

Download citation

  • DOI: https://doi.org/10.1007/978-3-319-16354-3_18

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-16353-6

  • Online ISBN: 978-3-319-16354-3

  • eBook Packages: Computer Science (R0)
