Abstract
With recent advances in gesture and eye-gaze tracking technologies, such trackers are increasingly finding applications in the automotive domain. They not only offer a means of measuring the driver’s workload (manual, visual, and cognitive); their outputs can also serve as input modalities that improve the usability of in-vehicle interactive displays, such as touchscreens. This paper presents a preliminary study analyzing the relationship between the motor and visual attention associated with the secondary task of interacting with an in-car display whilst driving. In particular, the response time and the misalignment between motor and visual attention during pointing-selection tasks are considered; both are shown to be strongly affected by the driving conditions. This study serves the purpose of devising effective approaches to combining data from gesture and eye-gaze trackers to simplify interacting with in-vehicle displays.
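The two quantities named above can be illustrated with a minimal sketch. This is not the authors’ analysis pipeline; it merely assumes time-aligned gaze and fingertip samples projected onto the display plane (hypothetical data and function names), and computes misalignment as the mean gaze–fingertip distance and response time as the elapsed time from task onset to the selection touch.

```python
import math

def misalignment(gaze_xy, finger_xy):
    """Mean Euclidean distance (display units, e.g. mm) between
    time-aligned gaze and fingertip samples on the display plane."""
    assert len(gaze_xy) == len(finger_xy)
    dists = [math.dist(g, f) for g, f in zip(gaze_xy, finger_xy)]
    return sum(dists) / len(dists)

def response_time(timestamps, touch_index):
    """Elapsed time (s) from task onset (first sample) to the selection touch."""
    return timestamps[touch_index] - timestamps[0]

# Hypothetical trial: gaze leads the fingertip towards the on-screen target.
gaze = [(10.0, 5.0), (40.0, 20.0), (60.0, 30.0)]
finger = [(0.0, 0.0), (25.0, 12.0), (58.0, 29.0)]
t = [0.00, 0.35, 0.70]  # sample timestamps in seconds

print(round(misalignment(gaze, finger), 2))  # mean gaze-finger offset
print(response_time(t, 2))                   # onset-to-touch duration
```

In a driving study, such per-trial metrics would typically be aggregated across participants and compared between driving conditions (e.g. stationary vs. on-road).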
Copyright information
© 2018 Springer International Publishing AG
Cite this paper
Ahmad, B.I., Langdon, P.M., Skrypchuk, L., Godsill, S.J. (2018). How Does Eye-Gaze Relate to Gesture Movement in an Automotive Pointing Task?. In: Stanton, N. (eds) Advances in Human Aspects of Transportation. AHFE 2017. Advances in Intelligent Systems and Computing, vol 597. Springer, Cham. https://doi.org/10.1007/978-3-319-60441-1_42
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-60440-4
Online ISBN: 978-3-319-60441-1