A Computational Approach for Prediction of Problem-Solving Behavior Using Support Vector Machines and Eye-Tracking Data

Abstract

Inferring high-level cognitive states during interaction is a fundamental task in building proactive intelligent systems that could effectively offload mental operations to a computational architecture. We introduce an improved machine-learning pipeline that predicts users' interactive behavior and performance from real-time eye tracking. Inference is carried out by a support vector machine (SVM) trained on a large set of features computed from eye-movement data and linked to concurrent high-level behavioral codes derived from think-aloud protocols. Differences between cognitive states can be inferred from overt visual-attention patterns with above-chance accuracy, although overall accuracy remains low. The system can also classify and predict the performance of problem-solving users with up to 79% accuracy. We propose this prediction model as a general approach to understanding gaze in complex strategic behavior. The findings confirm that eye-movement data carry important information about problem-solving processes, and that proactive systems can benefit from real-time monitoring of visual attention.
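The pipeline described above (SVM classification of performance level from eye-movement features) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the three gaze features, their distributions, and the synthetic labels are all hypothetical, chosen only to show the standardize-then-classify structure typical of SVM pipelines; scikit-learn's SVC (built on LibSVM) stands in for the original toolchain.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Hypothetical per-trial gaze features:
# [mean fixation duration (ms), fixation count, mean saccade amplitude (deg)]
n = 200
X_high = rng.normal(loc=[260, 18, 3.5], scale=[40, 4, 0.8], size=(n // 2, 3))
X_low = rng.normal(loc=[220, 24, 4.5], scale=[40, 4, 0.8], size=(n // 2, 3))
X = np.vstack([X_high, X_low])
y = np.array([1] * (n // 2) + [0] * (n // 2))  # 1 = high performer

# Standardize features before the RBF-kernel SVM, as SVMs are
# sensitive to feature scale; then estimate accuracy by cross-validation.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean CV accuracy: {scores.mean():.2f}")
```

Cross-validated accuracy, rather than a single train/test split, is the appropriate way to report the kind of classification performance the abstract cites.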


Copyright information

© Springer-Verlag London 2013

Authors and Affiliations

University of Eastern Finland, Joensuu, Finland