Visualizing Dynamic Ambient/Focal Attention with Coefficient \(K\)

  • A. T. Duchowski
  • K. Krejtz
Conference paper
Part of the Mathematics and Visualization book series (MATHVISUAL)


Using coefficient \(\mathcal{K}\), defined on a parametric scale and derived from a traditionally eye-tracked time course of eye movements, we propose a straightforward method of visualizing ambient/focal fixations in both scanpath and heatmap visualizations. The \(\mathcal{K}\) coefficient expresses the difference between a fixation's duration and the amplitude of the following saccade in standard deviation units, facilitating parametric statistical testing. Positive and negative values of \(\mathcal{K}\) indicate focal and ambient fixations, respectively, and are colored by luminance variation depicting the relative intensity of focal fixation.
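The abstract's definition of \(\mathcal{K}\) can be sketched directly: z-score the fixation durations and the following saccade amplitudes, then subtract per fixation. The helper name and sample values below are illustrative, not from the paper; only the z-score difference itself follows the definition above.

```python
import statistics

def ambient_focal_k(durations, amplitudes):
    """Per-fixation coefficient K: z-scored fixation duration minus
    z-scored amplitude of the saccade that follows the fixation.
    durations[i] pairs with amplitudes[i], the i-th following saccade.
    Positive K suggests a focal fixation; negative K an ambient one."""
    mu_d, sd_d = statistics.mean(durations), statistics.stdev(durations)
    mu_a, sd_a = statistics.mean(amplitudes), statistics.stdev(amplitudes)
    return [(d - mu_d) / sd_d - (a - mu_a) / sd_a
            for d, a in zip(durations, amplitudes)]

# Illustrative data: durations in ms, saccade amplitudes in degrees.
ks = ambient_focal_k([200, 400, 300], [8.0, 2.0, 5.0])
```

Because each term is in standard deviation units, a long fixation followed by a short saccade yields a large positive \(\mathcal{K}\) (focal), while a short fixation followed by a long saccade yields a large negative \(\mathcal{K}\) (ambient).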


Gaussian Mixture Model, Fixation Duration, Saccade Amplitude, Parallel Search, Serial Search



We thank Dr. Helena Duchowska (MD, retired) for her help in reading the CXR images and pinpointing the anomalies contained therein.

This work was partially supported by a 2015 research grant “Influence of affect on visual attention dynamics during visual search” from the SWPS University of Social Sciences and Humanities.



Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  1. Clemson University, Clemson, USA
  2. Department of Psychology, SWPS University of Social Sciences and Humanities, Warsaw, Poland
