Eye and Head Tracking for Focus of Attention Control in the Cockpit

  • Mohammad Mehdi Moniri
  • Michael Feld
Chapter
Part of the Human–Computer Interaction Series (HCIS)

Abstract

The driver’s focus of attention is a key factor in building novel, intuitive user interaction concepts and in enhancing current infotainment and safety applications in the vehicle. In this chapter we present several topics related to the development of applications and systems that incorporate the user’s visual focus of attention. In the presented real-life experiments, 3D representations of both the vehicle’s interior and the outside environment are used, and the object in the driver’s visual focus within these environments is determined in real time. We describe the functionality and accuracy of the presented systems, which are integrated in a fully functional vehicle in an actual traffic setting. In addition, we present several analyses concerning the accuracy of off-the-shelf eye trackers with respect to peripheral vision and direct interaction with urban objects.
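To make the core mechanism concrete, the sketch below shows one standard way to resolve which 3D scene object lies in the driver's visual focus: casting the gaze ray against axis-aligned bounding boxes using the slab intersection test. This is a minimal illustration, not the authors' implementation; the AABB scene representation, coordinate frame, object names, and all numeric values are assumptions made for the example.

```python
# Minimal sketch (illustrative, not the chapter's actual system): resolve
# which 3D scene object the driver's gaze ray hits first. Scene objects
# are approximated as axis-aligned bounding boxes (AABBs) in a vehicle-
# centered coordinate frame; all names and values below are hypothetical.
import math
from dataclasses import dataclass

@dataclass
class AABB:
    name: str    # e.g. a traffic sign or an in-vehicle display
    lo: tuple    # (x, y, z) minimum corner
    hi: tuple    # (x, y, z) maximum corner

def ray_hits_aabb(origin, direction, box, eps=1e-9):
    """Slab test: distance along the ray to the box, or None if missed."""
    t_near, t_far = -math.inf, math.inf
    for o, d, lo, hi in zip(origin, direction, box.lo, box.hi):
        if abs(d) < eps:            # ray parallel to this pair of slabs
            if o < lo or o > hi:
                return None
            continue
        t1, t2 = (lo - o) / d, (hi - o) / d
        t_near = max(t_near, min(t1, t2))
        t_far = min(t_far, max(t1, t2))
        if t_near > t_far:          # slab intervals no longer overlap
            return None
    return max(t_near, 0.0) if t_far >= 0 else None

def object_in_focus(gaze_origin, gaze_direction, scene):
    """Name of the nearest object intersected by the gaze ray, or None."""
    hits = [(d, box.name) for box in scene
            if (d := ray_hits_aabb(gaze_origin, gaze_direction, box)) is not None]
    return min(hits)[1] if hits else None

# Example: driver's eye point at (0, 0, 1.2), gazing ahead and slightly right.
scene = [AABB("speed_limit_sign", (9.5, -2.0, 1.5), (10.5, -1.0, 2.5)),
         AABB("infotainment_screen", (0.4, -0.5, 0.0), (0.6, 0.0, 0.3))]
print(object_in_focus((0, 0, 1.2), (1.0, -0.15, 0.1), scene))  # speed_limit_sign
```

In a real system such as the one the chapter describes, the gaze origin and direction would come from the eye- and head-tracking hardware, and the boxes would be derived from the 3D models of the vehicle interior and from dynamic map data for outside objects.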

Keywords

Traffic Sign · Advanced Driver Assistance System · Head Tracking · Visualization Module · Reference Resolution

Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  1. German Research Center for Artificial Intelligence, Saarbrücken, Germany