Gaze-Based Attention-Aware Cyberlearning Technologies

Sidney K. D’Mello
Chapter
Part of the Educational Communications and Technology: Issues and Innovations book series (ECTII)

Abstract

Over a century of cognitive psychology has taught us that attention plays a central role in cognition, especially in learning. Accordingly, the central thesis of this chapter is that next-generation learning technologies should include mechanisms to model and respond to learners’ attentional states. As a step in this direction, this chapter proposes a macro-theoretic framework that encompasses various forms of overt and covert states of attention (e.g., alternating vs. divided attention) and inattention (e.g., zone outs vs. tune outs). It then provides examples of three attention-aware cyberlearning technologies that use eye tracking as a window into learners’ attentional states. The first of these is GazeTutor, which uses eye movements to detect overt attentional lapses and attempts to redirect attention with a set of gaze-reactive dialogue moves. The second system addresses more covert forms of inattention by using eye movements to detect instances of mind wandering and responding with interpolated questions, self-explanations, and re-reading opportunities. The third example attempts to graduate such technologies from the lab into real-world classrooms by using consumer off-the-shelf eye trackers as entire classes of students individually interact with a cyberlearning technology. The chapter concludes by suggesting key next steps for the field of attention-aware cyberlearning.
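To make the gaze-reactive pattern shared by these systems concrete, the sketch below shows (a) a simple rule that flags an overt attentional lapse when gaze has been off the instructional content for several seconds and returns a GazeTutor-style redirecting utterance, and (b) a handful of illustrative window-level gaze features of the kind that supervised detectors of covert mind wandering might consume. This is a minimal illustration only: the `GazeSample` structure, the content region, the five-second threshold, the sample utterance, and the feature set are assumptions for exposition, not the implementations described in the chapter.

```python
from dataclasses import dataclass
from statistics import mean
from typing import List, Optional

@dataclass
class GazeSample:
    """Hypothetical gaze sample from an eye tracker (names are illustrative)."""
    t: float      # timestamp in seconds
    x: float      # horizontal gaze position, normalized 0-1
    y: float      # vertical gaze position, normalized 0-1
    valid: bool   # False when the tracker loses the eyes

def on_content(s: GazeSample, box=(0.0, 0.0, 1.0, 1.0)) -> bool:
    """True if the sample is valid and falls inside the content region."""
    x0, y0, x1, y1 = box
    return s.valid and x0 <= s.x <= x1 and y0 <= s.y <= y1

def trailing_off_content_seconds(samples: List[GazeSample]) -> float:
    """Length of the most recent run of gaze that is off the content area
    (or lost entirely) -- a crude proxy for an overt lapse of attention."""
    run_start: Optional[float] = None
    for s in samples:
        if on_content(s):
            run_start = None       # attention back on content; reset the run
        elif run_start is None:
            run_start = s.t        # a possible lapse begins here
    if not samples or run_start is None:
        return 0.0
    return samples[-1].t - run_start

def gaze_reactive_move(samples: List[GazeSample], threshold_s: float = 5.0) -> Optional[str]:
    """Return an attention-redirecting utterance once the lapse exceeds the
    threshold, in the spirit of GazeTutor's gaze-reactive dialogue moves."""
    if trailing_off_content_seconds(samples) >= threshold_s:
        return "Let's refocus -- this next part is important."
    return None

def window_features(samples: List[GazeSample]) -> dict:
    """Illustrative global gaze features over a window; detectors of covert
    mind wandering typically feed features like these to a supervised model."""
    xs = [s.x for s in samples if s.valid]
    ys = [s.y for s in samples if s.valid]
    return {
        "prop_invalid": 1 - len(xs) / max(len(samples), 1),
        "prop_off_content": mean(0.0 if on_content(s) else 1.0 for s in samples) if samples else 0.0,
        "gaze_dispersion_x": (max(xs) - min(xs)) if xs else 0.0,
        "gaze_dispersion_y": (max(ys) - min(ys)) if ys else 0.0,
    }
```

In use, a tutoring system of this kind would call something like `gaze_reactive_move` on the last few seconds of samples whenever it selects its next dialogue move, and would pass windowed features such as those above to a trained mind-wandering classifier rather than a fixed rule.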

Keywords

Attentional computing · Attention-aware · Cyberlearning · Mind wandering · Eye tracking

Acknowledgements

This research was supported by the National Science Foundation (NSF) (DRL 1235958 and IIS 1523091). Any opinions, findings, and conclusions or recommendations expressed in this chapter are those of the author and do not necessarily reflect the views of the NSF.

Copyright information

© Association for Educational Communications and Technology 2019

Authors and Affiliations

Institute of Cognitive Science, University of Colorado Boulder, Boulder, CO, USA