
Automatic Classification of Emotions Based on Cardiac Signals: A Systematic Literature Review

  • Review
  • Published in: Annals of Biomedical Engineering

Abstract

Emotions play a pivotal role in human cognition and influence diverse domains of individuals’ lives. The widespread adoption of artificial intelligence and machine learning has spurred interest in systems capable of automatically recognizing and classifying emotions and affective states. However, accurately identifying human emotions remains a formidable challenge, as they are shaped by many factors and accompanied by physiological changes. Numerous solutions for emotion recognition rely on the characterization of biological signals, including cardiac signals acquired from low-cost, wearable sensors. The objective of this work was to investigate current trends in the field through a Systematic Literature Review (SLR) focused specifically on the detection, recognition, and classification of emotions from cardiac signals, examining the techniques used for signal acquisition, the features extracted, the elicitation process, and the classification methods employed in these studies. The SLR was conducted across four research databases, and articles were assessed against the proposed research questions. Twenty-seven articles met the selection criteria and were evaluated for the feasibility of using cardiac signals, acquired from low-cost and wearable devices, for emotion recognition. The review identified several emotion elicitation methods, the algorithms applied for automatic classification, and the key challenges of emotion recognition based solely on cardiac signals. This study extends the current body of knowledge and supports future research by providing insights into suitable techniques for designing automatic emotion recognition applications, emphasizing the importance of low-cost, wearable, and unobtrusive devices for acquiring cardiac signals for accurate and accessible emotion recognition.
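For orientation, the sketch below illustrates the kind of pipeline the reviewed studies describe: time-domain HRV features computed from an RR-interval series and passed to an SVM classifier (one of the classifier families abbreviated below). It is not taken from any of the reviewed articles; the feature set, the synthetic segments, and the binary labels are illustrative assumptions, and real studies rely on elicited, annotated recordings such as DREAMER or AMIGOS.

    # Illustrative sketch only (assumed pipeline, not from the reviewed articles):
    # time-domain HRV features from RR intervals, classified with an RBF-kernel SVM.
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    def hrv_time_features(rr_ms: np.ndarray) -> np.ndarray:
        """Common time-domain HRV features from an RR-interval series (in ms)."""
        diff = np.diff(rr_ms)
        mean_rr = rr_ms.mean()                    # mean RR interval (ms)
        sdnn = rr_ms.std(ddof=1)                  # overall variability
        rmssd = np.sqrt(np.mean(diff ** 2))       # short-term (vagally mediated) variability
        pnn50 = np.mean(np.abs(diff) > 50) * 100  # % of successive differences > 50 ms
        mean_hr = 60000.0 / mean_rr               # mean heart rate (bpm)
        return np.array([mean_rr, sdnn, rmssd, pnn50, mean_hr])

    # Synthetic stand-in for labelled segments (0 = low arousal, 1 = high arousal).
    rng = np.random.default_rng(0)
    segments, labels = [], []
    for label, base_rr in [(0, 850.0), (1, 700.0)]:
        for _ in range(40):
            rr = base_rr + rng.normal(0.0, 40.0, size=120)  # one ~2-min segment of RR intervals
            segments.append(hrv_time_features(rr))
            labels.append(label)
    X, y = np.vstack(segments), np.array(labels)

    # Feature scaling plus an RBF-kernel SVM, evaluated with 5-fold cross-validation.
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    print("Mean CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())

In practice, this skeleton is typically extended with frequency-domain and non-linear HRV features, subject-independent validation splits, and the alternative classifiers listed in the abbreviations.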


Figures 1–5 (available in the full article). Figure source: biosignalsplux, https://biosignalsplux.com/products/wearables/respiban-pro.html


Abbreviations

EEG: Electroencephalogram
ECG: Electrocardiogram
EMG: Electromyogram
AI: Artificial intelligence
ANN: Artificial neural network
RQ: Research questions
HR: Heart rate
HRV: Heart rate variability
GSR: Galvanic skin response
RR: Respiration rate
SKT: Skin temperature
DL: Deep learning
CNN: Convolutional neural networks
SLR: Systematic literature review
PPG: Photoplethysmogram
SVM: Support vector machine
kNN: K-nearest neighbor
LDA: Linear discriminant analysis
PICOC: Population, Intervention, Comparison, Outcome, Context
ACC: Three-axis acceleration
EDA: Electrodermal activity
DT: Decision trees
RF: Random forest
RBF: Radial basis function
LS: Least squares
AB: AdaBoost
DBN: Deep belief networks
STRNN: Spatial-temporal recurrent neural network
LSTM: Long short-term memory
A-LSTM: Attention-LSTM
SLDA: Soft LDA
MN: Multinomial regression
NB: Naive Bayes
XGBTREE: Extreme gradient boost
BLR: Boosted logistic regression
GLMNET: Lasso and elastic-net regularized generalized linear models
TBAG: Bagging trees
SFFS-KBCS: Sequential forward floating selection-kernel-based class separability
GDA: Generalized discriminant analysis


Author information

Corresponding author

Correspondence to Tatiana Sousa Cunha.

Ethics declarations

Conflict of interest

The authors declare that there is no conflict of interest.

Additional information

Associate Editor Jane Grande-Allen oversaw the review of this article.

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Claret, A.F., Casali, K.R., Cunha, T.S. et al. Automatic Classification of Emotions Based on Cardiac Signals: A Systematic Literature Review. Ann Biomed Eng 51, 2393–2414 (2023). https://doi.org/10.1007/s10439-023-03341-8

