Abstract
Gaze estimation is a typical approach to monitoring driver attention on the road scene. This indicator is of great importance for safe driving and for the design of takeover strategies in Level 3 and Level 4 automation systems. Nowadays, most eye-gaze tracking techniques are intrusive and costly, which limits their applicability in real vehicles. Moreover, current databases used for gaze validation address the driver-attention task in simulated critical situations but do not cover actual accidents. This paper presents a low-cost, non-intrusive, camera-based gaze mapping system that integrates the open-source state-of-the-art OpenFace 2.0 Toolkit [3] to visualise driver attention on prerecorded real traffic scenes through a heat map. The proposal has been validated on the recent and challenging public dataset DADA2000 [9], which contains 2000 video sequences of annotated driving scenarios based on real accidents. We compare our system with an expensive desktop-mounted eye tracker, obtaining on-par results and showing that it is a suitable tool for driver attention monitoring, applicable to the design of takeover systems and driving-scenario awareness systems for automated vehicles.
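The gaze-to-heat-map idea described above can be sketched in a few lines: gaze angles (as produced per frame by a tool such as OpenFace 2.0) are mapped onto the coordinates of the prerecorded scene video, and a Gaussian "fixation blob" is accumulated at each gaze point. This is a minimal illustrative sketch, not the authors' implementation; the linear angle-to-pixel calibration and the field-of-view parameters are assumptions standing in for the calibration procedure a real system would require.

```python
import numpy as np

def gaze_to_pixel(yaw, pitch, width, height, fov_x=1.0, fov_y=0.6):
    """Map gaze angles (radians) to scene-frame pixel coordinates.

    Hypothetical linear calibration: a real system calibrates the
    mapping against the screen/scene; here angles are simply scaled
    to the frame, with positive yaw assumed to look leftwards.
    """
    x = (0.5 - yaw / fov_x) * width
    y = (0.5 + pitch / fov_y) * height
    return (int(round(np.clip(x, 0, width - 1))),
            int(round(np.clip(y, 0, height - 1))))

def accumulate_heatmap(heatmap, px, py, sigma=20.0):
    """Add a Gaussian fixation blob centred on the gaze point."""
    h, w = heatmap.shape
    ys, xs = np.mgrid[0:h, 0:w]
    heatmap += np.exp(-((xs - px) ** 2 + (ys - py) ** 2) / (2 * sigma ** 2))
    return heatmap

# Accumulate three synthetic gaze samples into one attention map.
heat = np.zeros((120, 160))
for yaw, pitch in [(0.0, 0.0), (0.1, -0.05), (0.1, -0.05)]:
    px, py = gaze_to_pixel(yaw, pitch, 160, 120)
    heat = accumulate_heatmap(heat, px, py)
```

In a full pipeline the accumulated map would be normalised and alpha-blended over each video frame to produce the visualisation described in the paper.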
References
Tobii Pro Glasses 2 (2020). https://www.tobiipro.com/product-listing/tobii-pro-glasses-2/
Alletto, S., Palazzi, A., Solera, F., Calderara, S., Cucchiara, R.: DR(eye)VE: a dataset for attention-based tasks with applications to autonomous and assisted driving. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition Workshops, pp. 54–60 (2016)
Baltrusaitis, T., Zadeh, A., Lim, Y.C., Morency, L.P.: OpenFace 2.0: facial behavior analysis toolkit. In: 2018 13th IEEE International Conference on Automatic Face & Gesture Recognition (FG 2018), pp. 59–66. IEEE (2018)
Borji, A., Itti, L.: CAT2000: a large scale fixation dataset for boosting saliency research. arXiv preprint arXiv:1505.03581 (2015)
Bylinskii, Z., Judd, T., Borji, A., Itti, L., Durand, F., Oliva, A., Torralba, A.: MIT Saliency Benchmark (2019)
Cognolato, M., Atzori, M., Müller, H.: Head-mounted eye gaze tracking devices: an overview of modern devices and recent advances. J. Rehabil. Assistive Technol. Eng. 5, 2055668318773991 (2018)
Dalmaijer, E., Mathôt, S., Stigchel, S.: PyGaze: an open-source, cross-platform toolbox for minimal-effort programming of eye-tracking experiments. Behav. Res. Methods 46 (2013). https://doi.org/10.3758/s13428-013-0422-2
Dosovitskiy, A., Ros, G., Codevilla, F., Lopez, A., Koltun, V.: CARLA: an open urban driving simulator. In: Proceedings of the 1st Annual Conference on Robot Learning, pp. 1–16 (2017)
Fang, J., Yan, D., Qiao, J., Xue, J., Wang, H., Li, S.: DADA-2000: can driving accident be predicted by driver attention? Analyzed by a benchmark. In: 2019 IEEE Intelligent Transportation Systems Conference (ITSC), pp. 4303–4309. IEEE (2019)
Fang, W., Chang, T.: Calibration in touch-screen systems. Texas Instruments Incorporated 10 (2007)
Jimenez, F.: Intelligent Vehicles: Enabling Technologies and Future Developments. Butterworth-Heinemann, Oxford (2017)
Jiménez, P., Bergasa, L.M., Nuevo, J., Hernández, N., Daza, I.G.: Gaze fixation system for the evaluation of driver distractions induced by ivis. IEEE Trans. Intell. Transport. Syst. 13(3), 1167–1178 (2012)
Mizuno, N., Yoshizawa, A., Hayashi, A., Ishikawa, T.: Detecting driver’s visual attention area by using vehicle-mounted device. In: 2017 IEEE 16th International Conference on Cognitive Informatics & Cognitive Computing (ICCI*CC), pp. 346–352. IEEE (2017)
Naqvi, R.A., Arsalan, M., Batchuluun, G., Yoon, H.S., Park, K.R.: Deep learning-based gaze detection system for automobile drivers using a NIR camera sensor. Sensors 18(2), 456 (2018)
Palazzi, A., Abati, D., Solera, F., Cucchiara, R., et al.: Predicting the driver’s focus of attention: the DR(eye)VE project. IEEE Trans. Pattern Anal. Machine Intell. 41(7), 1720–1733 (2018)
Shen, J., Zafeiriou, S., Chrysos, G.G., Kossaifi, J., Tzimiropoulos, G., Pantic, M.: The first facial landmark tracking in-the-wild challenge: benchmark and results. In: 2015 IEEE International Conference on Computer Vision Workshop (ICCVW), pp. 1003–1011 (2015). https://doi.org/10.1109/ICCVW.2015.132
Vicente, F., Huang, Z., Xiong, X., De la Torre, F., Zhang, W., Levi, D.: Driver gaze tracking and eyes off the road detection system. IEEE Trans. Intell. Transport. Syst. 16(4), 2014–2027 (2015)
Xia, Y., Zhang, D., Kim, J., Nakayama, K., Zipser, K., Whitney, D.: Predicting driver attention in critical situations. In: Asian Conference on Computer Vision, pp. 658–674. Springer, Cham (2018)
Yang, L., Dong, K., Dmitruk, A.J., Brighton, J., Zhao, Y.: A dual-cameras-based driver gaze mapping system with an application on non-driving activities monitoring. IEEE Trans. Intell. Transport. Syst. 21, 4318–4327 (2019)
Zhang, X., Sugano, Y., Fritz, M., Bulling, A.: MPIIGaze: real-world dataset and deep appearance-based gaze estimation. IEEE Trans. Pattern Anal. Machine Intell. 41(1), 162–175 (2017)
Acknowledgements
This work has been funded in part by the Spanish MICINN/FEDER through the Techs4AgeCar project (RTI2018-099263-B-C21) and by the RoboCity2030-DIH-CM project (P2018/NMT-4331), funded by Programas de actividades I+D (CAM) and cofunded by EU Structural Funds.
Copyright information
© 2021 The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Araluce, J. et al. (2021). Integrating OpenFace 2.0 Toolkit for Driver Attention Estimation in Challenging Accidental Scenarios. In: Bergasa, L.M., Ocaña, M., Barea, R., López-Guillén, E., Revenga, P. (eds) Advances in Physical Agents II. WAF 2020. Advances in Intelligent Systems and Computing, vol 1285. Springer, Cham. https://doi.org/10.1007/978-3-030-62579-5_19
Print ISBN: 978-3-030-62578-8
Online ISBN: 978-3-030-62579-5
eBook Packages: Intelligent Technologies and Robotics (R0)