Abstract
One of the main challenges in Interactive Information Retrieval (IIR) evaluation is the development and application of reusable tools that allow researchers to analyze the search behavior of real users in different environments and domains, with comparable results. Furthermore, IIR research has recently focused more on the analysis of whole sessions, including all user interactions carried out within a single session as well as across several sessions by the same user. Some frameworks have been proposed for evaluating controlled experiments in IIR, but no framework is yet available for the interactive evaluation of search behavior from real-world information retrieval (IR) systems with real users. In this paper we present a framework for whole-session evaluation that can also utilize such uncontrolled data sets. The logging component can easily be integrated into real-world IR systems to generate and analyze new log data, and a supplementary mapping makes it possible to analyze existing log data as well. Different actions and filters can be defined for every IR system. This allows system operators and researchers to use the framework to analyze user search behavior in their IR systems and to compare it with that of others. Through a graphical user interface they can interactively explore the data set, from a broad overview down to individual sessions.
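As a minimal illustration of the mapping idea described above (this is a hypothetical sketch, not the actual WHOSE API; all names and the 30-minute inactivity threshold are assumptions), system-specific log records can be normalized into a common action schema and then grouped into sessions:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical normalized log event; the framework's actual schema may differ.
@dataclass
class Action:
    user_id: str
    timestamp: datetime
    action_type: str  # e.g. "query", "view_record"
    payload: dict

def map_raw_record(raw: dict) -> Action:
    """Map one system-specific log record onto the common schema."""
    return Action(
        user_id=raw["uid"],
        timestamp=datetime.fromisoformat(raw["ts"]),
        # Per-system action mapping: system codes -> common action names.
        action_type={"q": "query", "v": "view_record"}.get(raw["ev"], "other"),
        payload={k: v for k, v in raw.items() if k not in {"uid", "ts", "ev"}},
    )

def sessionize(actions: list[Action],
               gap: timedelta = timedelta(minutes=30)) -> list[list[Action]]:
    """Group one user's actions into sessions, split at inactivity gaps."""
    sessions: list[list[Action]] = []
    for a in sorted(actions, key=lambda x: x.timestamp):
        if sessions and a.timestamp - sessions[-1][-1].timestamp <= gap:
            sessions[-1].append(a)
        else:
            sessions.append([a])
    return sessions
```

A per-system filter would then simply select which normalized `action_type` values enter the analysis.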
Copyright information
© 2015 Springer International Publishing Switzerland
Cite this paper
Hienert, D., van Hoek, W., Weber, A., Kern, D. (2015). WHOSE – A Tool for Whole-Session Analysis in IIR. In: Hanbury, A., Kazai, G., Rauber, A., Fuhr, N. (eds) Advances in Information Retrieval. ECIR 2015. Lecture Notes in Computer Science, vol 9022. Springer, Cham. https://doi.org/10.1007/978-3-319-16354-3_18
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-16353-6
Online ISBN: 978-3-319-16354-3