Preliminary Studies on Personalized Preference Prediction from Gaze in Comparing Visualizations
This paper presents a pilot study on recognizing user preference, manifested as the choice between items, from eye movements. Recent empirical studies have demonstrated that a user's task can be decoded from eye movements. Such studies promote the eye movement signal as a carrier of the user's cognitive state rather than a simple interaction utility, supporting the use of eye movements as an implicit, unobtrusively obtained cue in demanding cognitive tasks. Although eye movements have already been employed in human-computer interaction (HCI) for a variety of tasks, to the best of our knowledge they have not been evaluated for personalized preference recognition during visualization comparison. In summary, we investigate: “How well do eye movements disclose the user’s preference?” To this end, we design a pilot experiment that imposes a high cognitive load on the users and record their eye movements together with their explicitly stated preference choices. We then employ Gaussian processes, along with other classifiers, to predict the users’ choices from their eye movements. Our results support further investigation of observer preference prediction from eye movements.
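The classification step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the gaze features (fixation duration, fixation count, pupil diameter), their values, and the use of scikit-learn's `GaussianProcessClassifier` with an RBF kernel are all assumptions made for the example; the synthetic data merely stands in for per-trial recordings.

```python
# Hedged sketch: binary preference prediction from hand-crafted gaze features
# with a Gaussian process classifier. Feature choice and data are illustrative.
import numpy as np
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

# Hypothetical per-trial features for one item: [total fixation duration (s),
# number of fixations, mean pupil diameter (mm)]. Preferred items are assumed
# to attract longer and more frequent fixations.
n = 80
preferred = rng.normal([2.5, 9.0, 3.6], 0.4, size=(n, 3))
rejected = rng.normal([1.5, 6.0, 3.4], 0.4, size=(n, 3))
X = np.vstack([preferred, rejected])
y = np.array([1] * n + [0] * n)  # 1 = item was chosen by the user

clf = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=1.0),
                                random_state=0)
clf.fit(X, y)

# Probability that a new trial's gaze pattern signals preference for the item.
proba = clf.predict_proba([[2.4, 8.5, 3.6]])[0, 1]
```

A practical advantage of the Gaussian process here is that `predict_proba` yields calibrated-looking preference probabilities rather than hard labels, which suits a pilot setting where per-user decision thresholds are still unknown.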
Keywords: Fixation Duration · Fixation Location · Pupil Diameter · Query Term · Image Quality Assessment
The authors would like to acknowledge the support of the Finnish Center of Excellence in Computational Inference Research (COIN), the Revolution of Knowledge Work 2 project, and the Academy of Finland (decision 295694).