Emotion recognition for semi-autonomous vehicles framework


Humans, in their curiosity, have long wondered whether machines can be made to feel and, conversely, whether a machine can detect emotions. The ability to feel emotions is often cited as one human capacity that machines cannot replace. In recent years, this assumption has been increasingly questioned by researchers who study brain function using state-of-the-art instrumentation, sensors, and signal processing, and who now have powerful machine learning methods with which to address the problem. The field of emotion detection is gaining significance as technology advances, particularly with current developments in machine learning, the Internet of Things, Industry 4.0, and autonomous vehicles. Machines will need to be equipped with the capacity to monitor the state of the human user and to change their behaviour in response. Machine learning offers a route to this, drawing on data collected from questionnaires, facial expression scans, and physiological signals such as electroencephalograms (EEG), electrocardiograms, and galvanic skin response. In this study, an approach is proposed to identify the emotional state of a subject from data collected in elicited-emotion experiments. An algorithm using EEG data was developed, with the power spectral density of the cerebral frequency bands (alpha, beta, theta, and gamma) serving as features for classifier training. A K-Nearest Neighbors classifier using Euclidean distance predicts the emotional state of the subject. This article proposes a novel approach to emotion recognition that relies not only on images of the face, as in the previous literature, but also on physiological data. The algorithm was able to recognize nine different emotions (Neutral, Anger, Disgust, Fear, Joy, Sadness, Surprise, Amusement, and Anxiety), nine positions on the valence axis, and nine positions on the arousal axis.
Using data from only 14 EEG electrodes, an accuracy of approximately 97% was achieved. The approach was developed to evaluate the state of mind of a driver in a semi-autonomous vehicle, for example, but the system has a much wider range of potential applications, from product design to the evaluation of user experience.
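The pipeline described in the abstract (per-channel band powers from the power spectral density, fed to a Euclidean K-NN) can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the sampling rate, band edges, Welch window length, value of K, and the synthetic two-class demo data are all assumptions chosen for the example.

```python
import numpy as np
from scipy.signal import welch
from sklearn.neighbors import KNeighborsClassifier

FS = 128  # sampling rate in Hz (assumed; not specified in the abstract)
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}

def band_power_features(eeg):
    """eeg: (n_channels, n_samples) array -> flat feature vector of
    mean PSD per channel in each cerebral frequency band."""
    freqs, psd = welch(eeg, fs=FS, nperseg=2 * FS, axis=-1)
    feats = []
    for lo, hi in BANDS.values():
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(psd[:, mask].mean(axis=-1))
    return np.concatenate(feats)

# Synthetic demo: 14-channel trials dominated by an alpha-band (10 Hz)
# or gamma-band (35 Hz) oscillation, standing in for two emotion classes.
rng = np.random.default_rng(0)

def fake_trial(freq):
    t = np.arange(4 * FS) / FS
    return np.sin(2 * np.pi * freq * t) + 0.1 * rng.standard_normal((14, t.size))

X = np.array([band_power_features(fake_trial(f)) for f in [10] * 10 + [35] * 10])
y = np.array([0] * 10 + [1] * 10)

# Euclidean K-NN, as in the paper; K=3 is an assumed value.
clf = KNeighborsClassifier(n_neighbors=3, metric="euclidean").fit(X, y)
print(clf.score(X, y))
```

With 14 channels and 4 bands this yields a 56-dimensional feature vector per trial, consistent with the low-channel-count setting the abstract describes; in practice the classifier would be evaluated on held-out trials rather than the training set.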







This research was supported by Tecnologico de Monterrey and the Consejo Nacional de Ciencia y Tecnologia (CONACYT), Mexico, under scholarship 593255. We give special thanks to the Instituto Cajal for present and future collaboration.

Author information



Corresponding author

Correspondence to Ricardo A. Ramirez-Mendoza.


About this article


Cite this article

Izquierdo-Reyes, J., Ramirez-Mendoza, R.A., Bustamante-Bello, M.R. et al. Emotion recognition for semi-autonomous vehicles framework. Int J Interact Des Manuf 12, 1447–1454 (2018). https://doi.org/10.1007/s12008-018-0473-9



  • Electroencephalography
  • Emotion recognition
  • K nearest neighbor
  • Autonomous vehicles
  • Semi-autonomous