Design and evaluation of auditory-supported air gesture controls in vehicles

  • Jason Sterkenburg
  • Steven Landry
  • Myounghoon Jeon
Original Paper


Using touchscreens while driving introduces competition for visual attention that increases crash risk. To address this issue, we developed an auditory-supported air gesture system. We conducted two driving simulator experiments to investigate the system's influence on driving performance, eye glance behavior, secondary task performance, and driver workload. In Experiment 1, we investigated the impact of menu layout and auditory displays with 23 participants. In Experiment 2, we compared the best-performing systems from Experiment 1 with equivalent touchscreen systems with 24 participants. Results from Experiment 1 showed that menus arranged in 2 × 2 grids outperformed 4 × 4 grids across all measures, and demonstrated that auditory displays can reduce the visual demands of in-vehicle controls. In Experiment 2, auditory-supported air gestures allowed drivers to look at the road more, showed equivalent driver workload and driving performance, and slightly decreased secondary task performance compared with touchscreens. Implications are discussed in terms of multiple resources theory and Fitts's law.
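The advantage of 2 × 2 over 4 × 4 grids is consistent with Fitts's law, under which larger targets carry a lower index of difficulty and are therefore faster to acquire. A minimal sketch of that prediction, using the Shannon formulation of the index of difficulty and hypothetical menu geometry (the display size and reach distance below are illustrative assumptions, not values from the experiments):

```python
import math

def index_of_difficulty(distance, width):
    """Shannon formulation of Fitts's index of difficulty, in bits."""
    return math.log2(distance / width + 1)

# Hypothetical geometry: a 10 cm square menu area, with an average
# reach distance of 15 cm from the resting hand position to a target.
# A 2x2 grid gives 5 cm targets; a 4x4 grid gives 2.5 cm targets.
id_2x2 = index_of_difficulty(15, 5.0)   # log2(4)  = 2.0 bits
id_4x4 = index_of_difficulty(15, 2.5)   # log2(7) ~= 2.81 bits

# The smaller 4x4 targets yield a higher ID, predicting longer
# movement times for the same pointing task.
print(id_2x2, id_4x4)
```

Under this model the 4 × 4 layout demands roughly 40% more information throughput per selection, which is one account of the observed performance gap.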


Keywords: Driving · In-air gesture controls · Auditory displays · Driving safety




Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Department of Cognitive and Learning Sciences, Michigan Technological University, Houghton, USA
  2. Department of Industrial and Systems Engineering, Virginia Tech, Blacksburg, USA