
Driver’s eye-based gaze tracking system by one-point calibration

Multimedia Tools and Applications

Abstract

In previous research on driver gaze detection, accuracy is affected by drivers' varying sitting positions and heights when no initial per-driver calibration is performed. Dual-camera setups can omit driver calibration, but they increase processing time and system complexity. In addition, the problem of the corneal specular reflection (SR) disappearing from the eye image when the driver turns his or her head severely has not been addressed in previous studies. To address these issues, we propose a gaze tracking method based on one-point driver calibration that uses both the corneal SR and the medial canthus (MC) based on the maximum entropy criterion. Experiments on data collected in a vehicle from 26 subjects (wearing nothing, glasses, sunglasses, or a hat, or adopting various hand poses) showed that the proposed method is more accurate than other gaze tracking methods. We also demonstrated the effectiveness of our method in a real driving environment.
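To make the "maximum entropy criterion" concrete, the sketch below applies Kapur's maximum-entropy thresholding to a grayscale eye image in order to isolate bright candidate pixels for the corneal SR. This is a minimal illustration under our own assumptions, not the paper's implementation: the function names kapur_threshold and sr_candidate_mask are hypothetical, and the exact role the criterion plays in the authors' pipeline (and how the MC is combined with the SR) is not reproduced here.

# Minimal sketch (Python/NumPy), assuming an 8-bit grayscale eye image.
# Kapur's maximum-entropy threshold: choose the gray level that maximizes
# the sum of the entropies of the below-threshold and above-threshold classes.
import numpy as np

def kapur_threshold(gray):
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    p = hist.astype(np.float64) / hist.sum()
    best_t, best_h = 0, -np.inf
    for t in range(1, 255):
        p0, p1 = p[:t + 1], p[t + 1:]
        w0, w1 = p0.sum(), p1.sum()
        if w0 == 0 or w1 == 0:
            continue
        q0 = p0[p0 > 0] / w0          # class-conditional probabilities,
        q1 = p1[p1 > 0] / w1          # skipping empty bins to avoid log(0)
        h = -(q0 * np.log(q0)).sum() - (q1 * np.log(q1)).sum()
        if h > best_h:
            best_h, best_t = h, t
    return best_t

def sr_candidate_mask(eye_gray):
    # Pixels brighter than the Kapur threshold are candidate corneal SR pixels;
    # the centroid of the brightest blob could then serve as the SR point.
    return eye_gray > kapur_threshold(eye_gray)

In the paper's terms, the detected SR point together with the medial canthus would then feed the one-point calibration and gaze mapping, but those steps are outside this sketch.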



Acknowledgments

This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education (NRF-2018R1D1A1B07041921), by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education (NRF-2017R1D1A1B03028417), and by the National Research Foundation of Korea (NRF) grant funded by the Korea government (Ministry of Science and ICT) (NRF-2017R1C1B5074062).

Author information


Corresponding author

Correspondence to Kang Ryoung Park.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Yoon, H.S., Hong, H.G., Lee, D.E. et al. Driver’s eye-based gaze tracking system by one-point calibration. Multimed Tools Appl 78, 7155–7179 (2019). https://doi.org/10.1007/s11042-018-6490-7

