
Quiet Eye Affects Action Detection from Gaze More Than Context Length

  • Conference paper

User Modeling, Adaptation and Personalization (UMAP 2015)

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 9146)

Abstract

Every purposive interactive action begins with an intention to interact. In the domain of intelligent adaptive systems, behavioral signals linked to such actions are of great importance; yet even though humans are good at predicting them, interactive systems still fall behind. We explored mouse interaction and related eye-movement data from interactive problem-solving situations and isolated sequences with a high probability of interactive action. To establish whether the interactive action can be predicted from gaze, we 1) analyzed gaze data using sliding fixation sequences of increasing length and 2) considered sequences several fixations prior to the action, either containing the last fixation before the action (i.e., the quiet eye fixation) or not. Each fixation sequence was characterized by 54 gaze features and evaluated with an SVM-RBF classifier. The systematic evaluation revealed the importance of the quiet eye fixation and statistically significant differences between the quiet eye fixation and the other fixations preceding the action.
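
The pipeline sketched in the abstract (sliding fixation windows, per-window gaze features, SVM-RBF classification) can be illustrated in a few lines. The following is a minimal sketch assuming scikit-learn and synthetic fixation data; the handful of summary features and every helper name below (sequence_features, window_before_action, evaluate) are illustrative stand-ins, not the paper's 54-feature set or the authors' implementation.

import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def sequence_features(fixations):
    # Summarize one fixation window with a few illustrative gaze features;
    # `fixations` is an (n, 3) array of [x, y, duration_ms] rows.
    durations = fixations[:, 2]
    # Saccade amplitudes between consecutive fixation centers.
    amplitudes = np.linalg.norm(np.diff(fixations[:, :2], axis=0), axis=1)
    return np.array([
        durations.mean(), durations.std(), durations.max(),
        durations[-1],  # last fixation in the window: the quiet eye when offset == 0
        amplitudes.mean() if amplitudes.size else 0.0,
        float(len(fixations)),
    ])

def window_before_action(fixations, length, offset):
    # The `length` fixations ending `offset` fixations before the action;
    # offset == 0 means the window contains the quiet eye fixation.
    end = len(fixations) - offset
    start = end - length
    return fixations[start:end] if start >= 0 else None

def evaluate(trials, labels, length, offset):
    # One feature vector per trial, then a cross-validated SVM-RBF score,
    # mirroring the length/offset comparison described in the abstract.
    X, y = [], []
    for fixations, label in zip(trials, labels):
        window = window_before_action(fixations, length, offset)
        if window is not None:
            X.append(sequence_features(window))
            y.append(label)
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
    return cross_val_score(clf, np.array(X), np.array(y), cv=5).mean()

Comparing, say, evaluate(trials, labels, length=5, offset=0) against the same call with offset=1 contrasts windows that contain the quiet eye fixation with windows shifted one fixation earlier, which is the manipulation the abstract reports as decisive.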




Author information

Correspondence to Hana Vrzakova.


Copyright information

© 2015 Springer International Publishing Switzerland

About this paper

Cite this paper

Vrzakova, H., Bednarik, R. (2015). Quiet Eye Affects Action Detection from Gaze More Than Context Length. In: Ricci, F., Bontcheva, K., Conlan, O., Lawless, S. (eds) User Modeling, Adaptation and Personalization. UMAP 2015. Lecture Notes in Computer Science, vol 9146. Springer, Cham. https://doi.org/10.1007/978-3-319-20267-9_23


  • DOI: https://doi.org/10.1007/978-3-319-20267-9_23

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-20266-2

  • Online ISBN: 978-3-319-20267-9

  • eBook Packages: Computer Science, Computer Science (R0)
