Analyzing Gaze Behavior Prior to Interacting with a Multimedia Interface in a Car

  • Bastian Hinterleitner
  • Thomas Hammer
  • Stefan Mayer
  • Frederik Naujoks
  • Nadja Schömig
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10903)

Abstract

With the increasing number of functionalities of In-Vehicle Information Systems (IVIS), their complexity is rising. In addition, new input modalities are used to control the car: many manufacturers are switching from haptic input devices to touchscreens. At the same time, more and more sensors are available in cars that could support drivers in their interaction with an IVIS by analyzing their behavior. In this study, gaze behavior prior to an interaction with an IVIS was analyzed. To this end, 32 participants completed different tasks with an IVIS in a high-fidelity driving simulator. While driving, the participants were asked to navigate to a specific entry in a multi-level menu. Task and driving difficulty were varied, as was the input modality: one experimental group used a touchscreen, the other a rotary knob. Between the tasks, short status information was presented to the participants either auditorily or on a display. In 60.19% of the interactions, at least two preparatory glances were made. When using the rotary knob, drivers made significantly fewer fixations on the touchscreen, but those fixations lasted significantly longer. The difficulty of the driving task had no effect on the number of glances. When information was presented visually, the first glance was directed to the display in which it appeared. When only sound and no visual information was used, the first glance was mostly directed to the center display rather than the cluster display. These findings should be considered in the future design of IVIS and can help develop more natural user interfaces.

Keywords

Gesture and eye–gaze based interaction · Metrics for HCI · Multimodal interface

Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  • Bastian Hinterleitner (1)
  • Thomas Hammer (2)
  • Stefan Mayer (1)
  • Frederik Naujoks (2)
  • Nadja Schömig (2)

  1. Audi Electronics Venture GmbH, Gaimersheim, Germany
  2. Würzburger Institut für Verkehrswissenschaften GmbH, Veitshöchheim, Germany