Behavior Research Methods, Volume 50, Issue 5, pp 2004–2015

Gaze3DFix: Detecting 3D fixations with an ellipsoidal bounding volume

  • Sascha Weber
  • Rebekka S. Schubert
  • Stefan Vogt
  • Boris M. Velichkovsky
  • Sebastian Pannasch

Abstract

Nowadays, the use of eyetracking to determine 2D gaze positions is common practice, and several approaches to the detection of 2D fixations exist, but ready-to-use algorithms to determine eye movements in three dimensions are still missing. Here we present a dispersion-based algorithm with an ellipsoidal bounding volume that estimates 3D fixations. To this end, 3D gaze points are obtained using a vector-based approach and are further processed with our algorithm. To evaluate the accuracy of our method, we performed experimental studies with real and virtual stimuli. We obtained good congruence between stimulus position and both the 3D gaze points and the 3D fixation locations within the tested range of 200–600 mm. The mean deviation of the 3D fixations from the stimulus positions was 17 mm for both the real and the virtual stimuli, with larger variances at increasing stimulus distances. The described algorithms are implemented in two dynamic-link libraries (Gaze3D.dll and Fixation3D.dll), and we provide a graphical user interface (Gaze3DFixGUI.exe) designed for importing binocular 2D eyetracking data and calculating both 3D gaze points and 3D fixations using the libraries. The Gaze3DFix toolkit, including both libraries and the graphical user interface, is available as open-source software at https://github.com/applied-cognition-research/Gaze3DFix.
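The abstract compresses two distinct processing steps: triangulating 3D gaze points from binocular data via a vector-based approach, and grouping those points into 3D fixations with a dispersion criterion bounded by an ellipsoid. The following Python sketch illustrates both steps under simplifying assumptions (a head-fixed coordinate frame with known eye positions and unit gaze-direction vectors); the function names and semi-axis values are illustrative only, not the actual API or default parameters of Gaze3D.dll and Fixation3D.dll.

```python
import numpy as np

def gaze_point_3d(left_eye, left_dir, right_eye, right_dir):
    """Triangulate a 3D gaze point from the two gaze rays.

    Each ray starts at an eye position and points along a unit gaze
    direction.  Because the rays rarely intersect exactly, the gaze
    point is taken as the midpoint of the shortest segment between
    them.  Returns None for (near-)parallel rays, i.e., no usable
    vergence signal.
    """
    p1, d1 = np.asarray(left_eye, float), np.asarray(left_dir, float)
    p2, d2 = np.asarray(right_eye, float), np.asarray(right_dir, float)
    n = np.cross(d1, d2)              # common normal of the two rays
    n2 = n.dot(n)
    if n2 < 1e-12:                    # parallel rays: depth undefined
        return None
    w = p2 - p1
    t = np.cross(w, d2).dot(n) / n2   # ray parameter, left eye
    s = np.cross(w, d1).dot(n) / n2   # ray parameter, right eye
    return 0.5 * ((p1 + t * d1) + (p2 + s * d2))

def within_fixation(gaze, centroid, rx=15.0, ry=15.0, rz=45.0):
    """Ellipsoidal dispersion criterion for 3D fixation detection.

    True if `gaze` lies inside the axis-aligned ellipsoid around the
    running fixation `centroid`.  The depth semi-axis rz is set longer
    than rx/ry because triangulated depth is noisier than lateral
    position.  Semi-axes in mm; the values are illustrative, not the
    toolkit's defaults.
    """
    d = (np.asarray(gaze, float) - np.asarray(centroid, float)) \
        / np.array([rx, ry, rz])
    return d.dot(d) <= 1.0

# Toy check: both eyes (interpupillary distance 65 mm) fixate a target
# 400 mm straight ahead -- inside the paper's tested 200-600 mm range.
left, right = np.array([-32.5, 0.0, 0.0]), np.array([32.5, 0.0, 0.0])
target = np.array([0.0, 0.0, 400.0])
g = gaze_point_3d(left, (target - left) / np.linalg.norm(target - left),
                  right, (target - right) / np.linalg.norm(target - right))
print(g, within_fixation(g, target))  # -> [0. 0. 400.] True
```

The ellipsoidal rather than spherical volume accommodates anisotropic noise: vergence-based gaze points scatter more along the depth axis than laterally, which is consistent with the larger variances the abstract reports at greater stimulus distances.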

Keywords

Binocular 3D eye tracking · 3D gaze points · 3D fixations · Eye movement analysis · Methodology · Open-source software

Supplementary material

ESM 1: 13428_2017_969_MOESM1_ESM.pdf (PDF 439 kb)

Copyright information

© Psychonomic Society, Inc. 2017

Authors and Affiliations

  • Sascha Weber (1)
  • Rebekka S. Schubert (1)
  • Stefan Vogt (1)
  • Boris M. Velichkovsky (1, 2, 3)
  • Sebastian Pannasch (1)

  1. Faculty of Psychology, Technische Universität Dresden, Dresden, Germany
  2. Institute of Cognitive Studies, Kurchatov Research Center, Moscow, Russian Federation
  3. Moscow Institute for Physics and Technology, Moscow, Russian Federation
