Abstract
Autonomous vehicles are increasingly becoming a reality, yet how visual information should be allocated to the driver is still not well understood, and driver distraction and inattention remain leading causes of accidents. This paper presents experimental results from an eye tracker recorded under real driving conditions and analyses them from a design point of view. Clustering the eye-tracking data for each stage of the manoeuvre produced a pattern classification, along with the visual and psychological characteristics of each driver type, from which common gaze practices could be summarized. The improved concept is grounded in drivers' needs and preferences and can serve as a basis for view-management concepts for future HUDs.
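As a purely illustrative sketch of the clustering step mentioned above (the paper does not specify its algorithm), gaze fixations can be grouped into regions of interest with a minimal k-means. The coordinates, cluster count, and initialisation below are assumptions for the example, not the authors' actual pipeline.

```python
import math

def kmeans(points, k, iters=50):
    """Minimal k-means over 2D gaze fixation points (screen x, y)."""
    # Deterministic init: spread initial centroids evenly across the list.
    step = max(1, len(points) // k)
    centroids = points[::step][:k]
    for _ in range(iters):
        # Assign each fixation to its nearest centroid.
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda c: math.dist(p, centroids[c]))
            clusters[i].append(p)
        # Recompute each centroid as the mean of its cluster.
        new = [
            (sum(x for x, _ in c) / len(c), sum(y for _, y in c) / len(c))
            if c else centroids[i]
            for i, c in enumerate(clusters)
        ]
        if new == centroids:  # converged
            break
        centroids = new
    return centroids, clusters

# Synthetic fixations around two made-up gaze regions
# (e.g. road ahead vs. right mirror during a right turn).
gaze = [(100 + dx, 200 + dy) for dx in (0, 5, -5) for dy in (0, 4)] + \
       [(400 + dx, 350 + dy) for dx in (0, 6, -6) for dy in (0, 3)]
centroids, clusters = kmeans(gaze, k=2)
```

Each resulting cluster approximates one attention region; cluster size and dwell duration per region are the kinds of features from which gaze patterns can then be classified.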
Copyright information
© 2018 Springer International Publishing AG
About this paper
Cite this paper
Wang, X., Wang, Y., Zhang, X., Liu, C. (2018). Research on Driver’s Visual and Psychological Characteristics Under Right-Turning Scenario for Car Head-Up Displays. In: Nunes, I. (eds) Advances in Human Factors and Systems Interaction. AHFE 2017. Advances in Intelligent Systems and Computing, vol 592. Springer, Cham. https://doi.org/10.1007/978-3-319-60366-7_21
DOI: https://doi.org/10.1007/978-3-319-60366-7_21
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-60365-0
Online ISBN: 978-3-319-60366-7
eBook Packages: Engineering (R0)