Abstract
Every purposive interactive action begins with an intention to interact. In the domain of intelligent adaptive systems, behavioral signals linked to such actions are of great importance; although humans are good at making such predictions, interactive systems still fall behind. We explored mouse interaction and related eye-movement data from interactive problem-solving situations and isolated sequences with a high probability of interactive action. To establish whether the interactive action can be predicted from gaze, we 1) analyzed gaze data using sliding fixation sequences of increasing length and 2) considered sequences several fixations prior to the action, either containing the last fixation before the action (i.e., the quiet eye fixation) or not. Each fixation sequence was characterized by 54 gaze features and evaluated with an SVM-RBF classifier. The results of this systematic evaluation revealed the importance of the quiet eye fixation and showed statistical differences between the quiet eye fixation and other fixations prior to the action.
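To make the evaluation setup concrete, the following is a minimal sketch of the general idea described above: sliding fixation windows of increasing length, window-level gaze features, and an SVM with an RBF kernel. The synthetic fixation stream, the small feature set (a stand-in for the paper's 54 features), the window lengths, and all names below are illustrative assumptions, not the authors' pipeline.

```python
# Minimal sketch (not the authors' pipeline): sliding fixation windows of
# increasing length, simple per-window gaze features, and an SVM-RBF classifier.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic fixation stream: each fixation = (duration_ms, x, y); label 1 marks
# the last fixation before a mouse action (the "quiet eye" fixation in the paper).
n_fix = 2000
fixations = np.column_stack([
    rng.gamma(2.0, 120.0, n_fix),          # fixation duration (ms)
    rng.uniform(0, 1920, n_fix),           # x position (px)
    rng.uniform(0, 1080, n_fix),           # y position (px)
])
action_labels = (rng.random(n_fix) < 0.1).astype(int)

def window_features(win):
    """Illustrative window-level gaze features (a small stand-in for the 54)."""
    durations = win[:, 0]
    dx = np.diff(win[:, 1])
    dy = np.diff(win[:, 2])
    saccade_amp = np.hypot(dx, dy) if len(win) > 1 else np.zeros(1)
    return [durations.mean(), durations.std(), durations.max(),
            saccade_amp.mean(), saccade_amp.sum()]

def build_dataset(fixations, labels, window_len):
    """Slide a window of `window_len` fixations over the stream; the label is
    whether the window's last fixation immediately precedes an action."""
    X, y = [], []
    for end in range(window_len, len(fixations) + 1):
        win = fixations[end - window_len:end]
        X.append(window_features(win))
        y.append(labels[end - 1])
    return np.array(X), np.array(y)

for window_len in (2, 4, 8):               # increasing context lengths
    X, y = build_dataset(fixations, action_labels, window_len)
    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y, test_size=0.3, stratify=y, random_state=0)
    clf = make_pipeline(StandardScaler(),
                        SVC(kernel="rbf", C=1.0, gamma="scale",
                            class_weight="balanced"))
    clf.fit(X_tr, y_tr)
    print(f"window={window_len:2d}  accuracy={clf.score(X_te, y_te):.3f}")
```

In this sketch, comparing windows that end at the fixation immediately before the action against windows shifted earlier would correspond to the paper's contrast between sequences that do and do not contain the quiet eye fixation.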
Copyright information
© 2015 Springer International Publishing Switzerland
About this paper
Cite this paper
Vrzakova, H., Bednarik, R. (2015). Quiet Eye Affects Action Detection from Gaze More Than Context Length. In: Ricci, F., Bontcheva, K., Conlan, O., Lawless, S. (eds) User Modeling, Adaptation and Personalization. UMAP 2015. Lecture Notes in Computer Science, vol. 9146. Springer, Cham. https://doi.org/10.1007/978-3-319-20267-9_23
DOI: https://doi.org/10.1007/978-3-319-20267-9_23
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-20266-2
Online ISBN: 978-3-319-20267-9
eBook Packages: Computer Science (R0)