Facial Expressions as Indicator for Discomfort in Automated Driving

  • Matthias Beggiato
  • Nadine Rauh
  • Josef Krems
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 1131)


Driving comfort is considered a key factor for broad public acceptance of automated driving. Based on continuous driver/passenger monitoring, potential discomfort could be avoided by adapting automation features such as the driving style. The EU project MEDIATOR aims at developing a mediating system in automated vehicles that constantly evaluates the performance of driver and automation. As facial expressions could be an indicator of discomfort, a driving simulator study was carried out to investigate this relationship. A total of 41 participants experienced three potentially uncomfortable automated approach situations toward a truck driving ahead. Face videos from four cameras were analyzed with the Visage facial feature detection and face analysis software, extracting 23 Action Units (AUs). Situation-specific effects showed that the eyes were kept open and eye blinks were reduced (AU43). Inner brows (AU1) and upper lids (AU5) were raised, indicating surprise. Lips were pressed (AU24) and stretched (AU20) as a sign of tension. Overall, facial expression analysis could contribute to detecting discomfort in automated driving.


Face tracking · Facial expressions · Action units · Automated driving · Discomfort · Driving simulator · Mediator project



Data collection was funded by the Federal Ministry of Education and Research under grant No. 16SV7690K (Project KomfoPilot). Data analysis of AUs was funded by the European Union’s Horizon 2020 research and innovation programme under grant agreement No. 814735 (Project MEDIATOR).



Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  1. Cognitive and Engineering Psychology, Chemnitz University of Technology, Chemnitz, Germany
