International Conference on User Modeling, Adaptation, and Personalization

UMAP 2015: User Modeling, Adaptation and Personalization, pp. 277–288

Quiet Eye Affects Action Detection from Gaze More Than Context Length

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9146)

Abstract

Every purposive interactive action begins with an intention to interact. For intelligent adaptive systems, behavioral signals linked to upcoming actions are of great importance; yet while humans are good at such predictions, interactive systems still fall behind. We explored mouse interaction and the related eye-movement data from interactive problem-solving situations and isolated sequences with a high probability of interactive action. To establish whether the interactive action can be predicted from gaze, we 1) analyzed gaze data using sliding fixation sequences of increasing length and 2) considered sequences several fixations prior to the action, either containing the last fixation before the action (i.e., the quiet eye fixation) or not. Each fixation sequence was characterized by 54 gaze features and evaluated by an SVM classifier with an RBF kernel. The systematic evaluation revealed the importance of the quiet eye fixation and showed that it differs statistically from the other fixations preceding the action.
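To make the pipeline described above concrete, the following is a minimal Python sketch of the general idea: sliding fixation windows are cut from a recording, each window is reduced to a feature vector, and an SVM with an RBF kernel classifies whether the window precedes an interactive action. The fixation format, the four features, and all parameter values are illustrative assumptions; the paper uses 54 gaze features, which the abstract does not enumerate.

```python
# Sketch of the abstract's pipeline: sliding fixation windows,
# per-window gaze features, and an SVM-RBF classifier.
# Assumption (not from the paper): fixations arrive as (x, y, duration_ms)
# tuples, and only 4 illustrative features are computed here.
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

def window_features(fixations):
    """Reduce one fixation window to a small, illustrative feature vector."""
    fx = np.asarray(fixations, dtype=float)           # shape: (n, 3)
    durations = fx[:, 2]
    # Saccade amplitudes between consecutive fixation positions.
    amplitudes = np.linalg.norm(np.diff(fx[:, :2], axis=0), axis=1)
    return np.array([
        durations.mean(),                             # mean fixation duration
        durations[-1],                                # last fixation (quiet eye candidate)
        amplitudes.mean() if len(amplitudes) else 0,  # mean saccade amplitude
        len(fixations),                               # window length
    ])

def windows_before_action(fixations, length, include_quiet_eye=True):
    """Yield fixation sequences of a given length from the span before an
    action; with include_quiet_eye=False the last fixation before the
    action is excluded, mirroring the paper's two window conditions."""
    end = len(fixations) if include_quiet_eye else len(fixations) - 1
    for start in range(0, end - length + 1):
        yield fixations[start:start + length]

# Hypothetical training data: one feature vector per window in X, and a
# label in y marking whether the window immediately preceded a mouse action.
# X, y = ..., ...
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
# clf.fit(X, y)

# Tiny synthetic demo of the feature extraction:
demo = [(100, 200, 180), (130, 220, 240), (400, 210, 300)]
print(window_features(demo))
```

Standardizing features before the RBF kernel (the StandardScaler step) is a common choice when feature scales differ, as gaze features typically do; it is a design assumption here, not a detail stated in the abstract.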

Keywords

Action intentions · Prediction · Eye-tracking · SVM · Mouse interaction · Problem solving



Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

University of Eastern Finland, Joensuu, Finland
