Abstract
This contribution presents our work towards a system that autonomously guides the driver's visual attention to important information (e.g., the traffic situation or an in-car system status signal) in error-prone situations while driving. To this end, we use a highly accurate head-mounted eye-tracking system to estimate the driver's current focus of visual attention. Based on this data, we present our strategies for guiding the driver's attention to where it should be focused. These strategies use both graphical animations, in the form of a guiding point on the graphical user interface, and auditory animations presented via headphones using a Virtual Acoustics system. At the end of this contribution, we present the results of a usability study.
Copyright information
© 2009 Springer-Verlag Berlin Heidelberg
Cite this paper
Poitschke, T., Laquai, F., Rigoll, G. (2009). Guiding a Driver's Visual Attention Using Graphical and Auditory Animations. In: Harris, D. (ed.) Engineering Psychology and Cognitive Ergonomics. EPCE 2009. Lecture Notes in Computer Science, vol. 5639. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-02728-4_45
Print ISBN: 978-3-642-02727-7
Online ISBN: 978-3-642-02728-4