Abstract
We investigate whether a clipboard, as an extension to standard image search, improves user interaction and experience. In a task-based summative evaluation with 32 participants, we compare plain Google Image Search against two extensions that use a clipboard. In one variant, the clipboard is filled with images based on DCG ranking. In the other variant, the clipboard is filled based on gaze information provided by an eyetracker. We hypothesized that the eyetracking-based clipboard would significantly outperform the other conditions due to its human-centered filtering of the images. To our surprise, the results show that the eyetracking-based clipboard performed worse in almost all tasks with respect to user satisfaction. In addition, no significant differences in effectiveness and efficiency between the three conditions could be observed.
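The DCG ranking mentioned above refers to Discounted Cumulated Gain (Järvelin and Kekäläinen, 2002), which scores a ranked list by summing graded relevance values discounted logarithmically by rank position. As a minimal sketch, using the common log2(i+1) discount formulation (the function name and example relevance scores are illustrative, not from the paper):

```python
from math import log2

def dcg(relevances):
    """Discounted Cumulated Gain of a ranked result list.

    `relevances` holds graded relevance scores in rank order;
    each gain is divided by log2(rank + 1), so highly relevant
    items contribute less the further down the list they appear.
    """
    return sum(rel / log2(rank + 1)
               for rank, rel in enumerate(relevances, start=1))

# A ranking that places relevant images first scores higher:
good_order = dcg([3, 2, 0])  # relevant items at the top
bad_order = dcg([0, 2, 3])   # relevant items pushed down
```

A clipboard variant could thus pre-fill itself with the top-ranked images of such a relevance-ordered list, which is the role DCG ranking plays in one of the two conditions.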
Copyright information
© 2013 Springer-Verlag Berlin Heidelberg
Cite this paper
Kastler, L., Scherp, A. (2013). Can a Clipboard Improve User Interaction and User Experience in Web-Based Image Search? In: Yamamoto, S. (ed.) Human Interface and the Management of Information. Information and Interaction Design. HIMI 2013. Lecture Notes in Computer Science, vol. 8016. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-39209-2_24
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-39208-5
Online ISBN: 978-3-642-39209-2