How Does Eye-Gaze Relate to Gesture Movement in an Automotive Pointing Task?

  • Conference paper
  • In: Advances in Human Aspects of Transportation (AHFE 2017)
  • Part of the book series: Advances in Intelligent Systems and Computing (AISC, volume 597)

Abstract

With recent advances in gesture and eye-gaze tracking technologies, such trackers are increasingly finding applications in the automotive domain. They not only offer a means of measuring the driver's workload (manual, visual and cognitive); their outputs can also serve as input modalities that improve the usability of in-vehicle interactive displays, such as touchscreens. This paper presents a preliminary study of the relationship between the motor and visual attention associated with the secondary task of interacting with an in-car display whilst driving. In particular, the response time and the misalignment between motor and visual attention during pointing-selection tasks are considered; both are shown to be strongly affected by the driving conditions. This study serves the purpose of devising effective approaches to combining data from gesture and eye-gaze trackers to simplify interacting with in-vehicle displays.
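
The abstract refers to two quantities without spelling out how they might be computed: the response time between visual and motor attention reaching the target, and the spatial misalignment between where the driver looks and where the hand points on the display. The sketch below illustrates one plausible way to derive both from synchronised tracker logs; the data layout, the sampling times and the 2 cm "on target" threshold are illustrative assumptions, not the authors' method.

```python
import numpy as np

# Hypothetical, synchronised logs (one row per frame) from an eye-gaze tracker
# and a hand/gesture tracker, both projected onto the touchscreen plane (cm).
# Assumed column layout: [t_seconds, x_cm, y_cm].
gaze = np.array([[0.00, 30.0, 12.0],
                 [0.10, 14.0,  6.0],
                 [0.20,  9.8,  5.1],
                 [0.30, 10.1,  5.0]])
hand = np.array([[0.00, 55.0,  0.0],
                 [0.20, 40.0,  2.0],
                 [0.40, 20.0,  4.0],
                 [0.60, 10.3,  5.2]])

target = np.array([10.0, 5.0])   # centre of the selected on-screen item (cm)
on_target_radius = 2.0           # assumed "within target" threshold (cm)

def arrival_time(track, target, radius):
    """Time stamp of the first sample that lands within `radius` of the target."""
    dists = np.linalg.norm(track[:, 1:3] - target, axis=1)
    idx = np.argmax(dists <= radius)          # first True; 0 if never True
    return track[idx, 0] if dists[idx] <= radius else np.nan

t_gaze = arrival_time(gaze, target, on_target_radius)
t_hand = arrival_time(hand, target, on_target_radius)

# Response time: lag between visual attention reaching the target and the
# pointing finger arriving there (positive means the eyes lead the hand).
response_time = t_hand - t_gaze

# Misalignment: distance between the gaze point and the fingertip at the
# moment of selection (taken here as the last logged hand sample).
gaze_at_sel = gaze[np.searchsorted(gaze[:, 0], hand[-1, 0], side="right") - 1, 1:3]
misalignment = np.linalg.norm(gaze_at_sel - hand[-1, 1:3])

print(f"eye-to-hand response time: {response_time:.2f} s")
print(f"gaze-hand misalignment at selection: {misalignment:.2f} cm")
```

With the toy samples above, the eyes reach the target roughly 0.4 s before the hand, and the gaze and fingertip disagree by a fraction of a centimetre at selection; how such quantities vary with driving conditions is the subject of the paper.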

Author information

Correspondence to Bashar I. Ahmad.

Copyright information

© 2018 Springer International Publishing AG

About this paper

Cite this paper

Ahmad, B.I., Langdon, P.M., Skrypchuk, L., Godsill, S.J. (2018). How Does Eye-Gaze Relate to Gesture Movement in an Automotive Pointing Task?. In: Stanton, N. (eds) Advances in Human Aspects of Transportation. AHFE 2017. Advances in Intelligent Systems and Computing, vol 597. Springer, Cham. https://doi.org/10.1007/978-3-319-60441-1_42

  • DOI: https://doi.org/10.1007/978-3-319-60441-1_42

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-60440-4

  • Online ISBN: 978-3-319-60441-1

  • eBook Packages: Engineering, Engineering (R0)
