Eye Fixation Metrics for Large Scale Evaluation and Comparison of Information Visualizations

  • Zoya Bylinskii
  • Michelle A. Borkin
  • Nam Wook Kim
  • Hanspeter Pfister
  • Aude Oliva
Conference paper
Part of the Mathematics and Visualization book series (MATHVISUAL)


An observer’s eye movements are often informative about how the observer interacts with and processes a visual stimulus. Here, we are specifically interested in what eye movements reveal about how the content of information visualizations is processed. Conversely, by pooling eye movements across many observers, what can we learn about the general effectiveness of different visualizations and the underlying design principles employed? The contribution of this manuscript is to consider these questions at a large data scale, with thousands of eye fixations on hundreds of diverse information visualizations. We survey existing methods and metrics for collective eye movement analysis, and consider what each can tell us about the overall effectiveness of different information visualizations and designs at this scale.


Fixation duration · Information visualization · Textual element · Visualization design · Observer attention
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.



This work was partly funded by awards from Google and Xerox to A.O., NSERC Postgraduate Doctoral Scholarship (PGS-D) to Z.B., NSF Graduate Research Fellowship Program and NSERC Discovery grant to M.B., and a Kwanjeong Educational Foundation grant to N.K.



Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  • Zoya Bylinskii (1)
  • Michelle A. Borkin (2)
  • Nam Wook Kim (3)
  • Hanspeter Pfister (3)
  • Aude Oliva (1)
  1. Computer Science and Artificial Intelligence Laboratory, Massachusetts Institute of Technology, Cambridge, USA
  2. College of Computer and Information Science, Northeastern University, Boston, USA
  3. School of Engineering & Applied Sciences, Harvard University, Cambridge, USA