
Interactive Visualization for Understanding of Attention Patterns

  • Truong-Huy D. Nguyen
  • Magy Seif El-Nasr
  • Derek M. Isaacowitz
Conference paper
Part of the Mathematics and Visualization book series (MATHVISUAL)

Abstract

Discovering users’ behavior via eye-tracking data analysis is a common task with important implications in many domains, including marketing, design, behavioral research, and psychology. In our project, we are interested in analyzing eye-tracking data to investigate differences between age groups in emotion regulation using visual attention. To achieve this goal, we adopted a general-purpose interactive visualization method, namely Glyph, to conduct temporal analysis on participants’ fixation data. Glyph facilitates comparison of abstract data sequences to reveal group and individual patterns. In this article, we show how a visualization system adopting the Glyph method can be constructed, allowing us to understand how users shift their fixations and dwell times given different stimuli, and how user groups differ in terms of these temporal eye-tracking patterns. The discussion demonstrates the utility of Glyph not only for our project, but also for other eye-tracking data analyses that require exploration of the space of temporal patterns.
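
As a rough sketch of the kind of temporal comparison involved, the snippet below aligns two viewers' fixation sequences with dynamic time warping, one of the sequence-comparison techniques named in the keywords; the area-of-interest (AOI) labels, time-window encoding, and unit substitution cost are illustrative assumptions, not the paper's actual implementation.

```python
# Sketch: comparing two viewers' fixation sequences with dynamic time warping (DTW).
# Assumptions (not from the paper): fixations are already mapped to area-of-interest
# (AOI) labels per time window, and mismatched AOIs incur a unit substitution cost.

def dtw_distance(seq_a, seq_b):
    """Classic DTW over symbolic sequences with 0/1 local cost."""
    n, m = len(seq_a), len(seq_b)
    inf = float("inf")
    # dp[i][j] = minimal alignment cost of seq_a[:i] against seq_b[:j]
    dp = [[inf] * (m + 1) for _ in range(n + 1)]
    dp[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = 0.0 if seq_a[i - 1] == seq_b[j - 1] else 1.0
            dp[i][j] = cost + min(dp[i - 1][j],      # insertion
                                  dp[i][j - 1],      # deletion
                                  dp[i - 1][j - 1])  # match / substitution
    return dp[n][m]

# Hypothetical fixation sequences: AOI visited in successive windows of a video clip.
young_viewer = ["face", "face", "text", "background", "face"]
older_viewer = ["face", "background", "background", "face", "face"]

print(dtw_distance(young_viewer, older_viewer))  # smaller = more similar scanpaths
```

A distance of this kind can then feed a clustering or group-level comparison of temporal attention patterns, which is the type of exploration the Glyph visualization is meant to support.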

Keywords

Emotion regulation · Video clip · State graph · Dynamic time warping · Emotion regulation strategy

Notes

Acknowledgements

This work was supported in part by NIA grant R21 AG044961.


Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  • Truong-Huy D. Nguyen (1)
  • Magy Seif El-Nasr (2)
  • Derek M. Isaacowitz (2)
  1. Texas A&M University-Commerce, Commerce, USA
  2. Northeastern University, Boston, USA
