Gaze Visualization for Immersive Video

  • Thomas Löwe
  • Michael Stengel
  • Emmy-Charlotte Förster
  • Steve Grogorick
  • Marcus Magnor
Conference paper
Part of the Mathematics and Visualization book series (MATHVISUAL)


In contrast to traditional video, immersive video allows viewers to interactively control their field of view in a 360° panoramic scene. However, established methods for the comparative evaluation of gaze data for video require that all participants observe the same viewing area. We therefore propose new specialized visualizations and a novel visual analytics framework for the combined analysis of head movement and gaze data. A novel View Similarity visualization highlights viewing areas branching and joining over time, while three additional visualizations provide global and spatial context. These new visualizations, along with established gaze evaluation techniques, allow analysts to investigate the storytelling of immersive videos. We demonstrate the usefulness of our approach using head movement and gaze data recorded for both amateur panoramic videos and professionally composited immersive videos.
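The core difficulty named in the abstract is that, unlike with a fixed-frame video, two participants watching the same immersive video may look at entirely different parts of the panorama, so their gaze samples are only comparable when their viewing areas overlap. The paper does not specify its View Similarity metric here; the following is a minimal, hypothetical sketch of one plausible formulation, assuming head orientations given as yaw/pitch angles and a nominal 90° field of view (both assumptions, not taken from the paper):

```python
import math

def view_direction(yaw_deg, pitch_deg):
    """Convert a (yaw, pitch) head orientation in degrees to a 3D unit vector."""
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    return (math.cos(pitch) * math.cos(yaw),
            math.cos(pitch) * math.sin(yaw),
            math.sin(pitch))

def angular_distance(a, b):
    """Angle in degrees between two viewing directions (clamped for safety)."""
    dot = max(-1.0, min(1.0, sum(x * y for x, y in zip(a, b))))
    return math.degrees(math.acos(dot))

def view_similarity(yaw_a, pitch_a, yaw_b, pitch_b, fov=90.0):
    """Similarity in [0, 1]: 1 when two participants' viewing directions
    coincide, falling to 0 once they differ by the field of view or more,
    i.e. the viewing areas no longer overlap."""
    d = angular_distance(view_direction(yaw_a, pitch_a),
                         view_direction(yaw_b, pitch_b))
    return max(0.0, 1.0 - d / fov)
```

Evaluating such a pairwise score per frame would let an analysis tool detect the "branching and joining" of viewing areas over time that the View Similarity visualization is described as highlighting.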





The authors thank Laura Saenger, Flávio Bezerra and Eduard Tucholke for permission to use the short film “UM MENINO”. The authors gratefully acknowledge funding by the German Science Foundation from project DFG MA2555/6-2 within the strategic research initiative on Scalable Visual Analytics and funding from the European Union’s Seventh Framework Programme FP7/2007-2013 under grant agreement no. 256941, Reality CG.



Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  • Thomas Löwe (Email author)
  • Michael Stengel
  • Emmy-Charlotte Förster
  • Steve Grogorick
  • Marcus Magnor

All authors: Computer Graphics Lab, TU Braunschweig, Braunschweig, Germany
