Preliminary Studies on Personalized Preference Prediction from Gaze in Comparing Visualizations

  • Hamed R.-Tavakoli
  • Hanieh Poostchi
  • Jaakko Peltonen
  • Jorma Laaksonen
  • Samuel Kaski
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10073)

Abstract

This paper presents a pilot study on recognizing user preference, manifested as the choice between items, from eye movements. Recent empirical studies have demonstrated that a user's task can be decoded from eye movements. Such studies establish the eye-movement signal as a carrier of the user's cognitive state rather than a mere interaction utility, supporting its use as an implicit, unobtrusively obtained cue in demanding cognitive tasks. Although eye movements have already been employed in human-computer interaction (HCI) for a variety of tasks, to the best of our knowledge they have not been evaluated for personalized preference recognition during visualization comparison. In short, we investigate the question: “How well do eye movements disclose the user’s preference?” To this end, we design a pilot experiment that imposes a high cognitive load on the users and record their eye movements together with their explicitly stated preference choices. We then employ Gaussian processes, along with other classifiers, to predict the users’ choices from the eye movements. Our study supports further investigation of observer preference prediction from eye movements.
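
As a hypothetical illustration of this prediction step, the sketch below trains a Gaussian process classifier (alongside an SVM baseline) to predict a binary preference choice from per-trial gaze features. It uses scikit-learn on synthetic data; the feature set, labels, and library choice are assumptions made for illustration only, not the authors' actual pipeline.

```python
# Minimal sketch (assumptions only, not the paper's pipeline): predicting an
# observer's binary preference in a two-item visualization comparison from
# summary gaze features, with a Gaussian process classifier and an SVM baseline.
import numpy as np
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical per-trial features: [fixation-count ratio, dwell-time ratio,
# mean fixation-duration ratio, mean pupil-diameter difference] (left vs. right item).
X = rng.normal(size=(60, 4))
# Synthetic label: 1 if the observer preferred the left item, 0 otherwise.
y = (X[:, 1] + 0.5 * X[:, 3] + 0.3 * rng.normal(size=60) > 0).astype(int)

# GP classifier with an RBF kernel (scikit-learn uses a Laplace approximation).
gpc = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=1.0), random_state=0)
svm = SVC(kernel="rbf", gamma="scale")  # conventional baseline for comparison

for name, clf in [("GP classifier", gpc), ("SVM baseline", svm)]:
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"{name}: mean 5-fold CV accuracy = {scores.mean():.2f}")
```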

Acknowledgement

The authors would like to acknowledge the support of the Finnish Center of Excellence in Computational Inference Research (COIN), the Revolution of Knowledge Work 2 project, and the Academy of Finland (decision 295694).


Copyright information

© Springer International Publishing AG 2016

Authors and Affiliations

  • Hamed R.-Tavakoli (1)
  • Hanieh Poostchi (1)
  • Jaakko Peltonen (1, 2)
  • Jorma Laaksonen (1)
  • Samuel Kaski (1)
  1. Department of Computer Science, Aalto University, Espoo, Finland
  2. School of Information Sciences, University of Tampere, Tampere, Finland
