Eye-blinking analysis as a marker of emotional states

Multimedia Tools and Applications

Abstract

In recent years, designing intelligent humanoid robots and computers capable of recognizing and responding to human emotions has become the objective of many studies. To achieve this goal, various stimuli and paradigms have been evaluated in laboratory set-ups. In bio-signal processing approaches, experiments have primarily focused on signals that reflect a person's internal state when encountering affective stimuli. However, external cues, such as eye features, can also provide pertinent information about emotions. The main objective of this study was to evaluate the ability of eye-blinking measures, as a single modality, to drive an affect recognizer. We analyzed statistical, spectral, and nonlinear features of the eye-blinking signals of the SEED-IV database in sad, happy, fear, and neutral affective states. Additionally, several decision-making strategies, including naïve Bayes, support vector machine, feedforward neural network (NN), and k-nearest neighbor, were evaluated. The effect of classifier parameterization on emotion recognition rates was also investigated. The NN classifier under tenfold cross-validation outperformed the other classification schemes: with a hidden layer size of 8, it achieved a classification accuracy of 98.67% for fear/neutral discrimination. The results also underscored the critical role of classifier parameter adjustment, although we did not determine the optimal parameter values for maximum performance. The impressive performance of the presented algorithm makes the proposed framework superior to state-of-the-art approaches and paves the way for designing future emotion recognition systems based on eye-blinking data.
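As a rough illustration of the decision-making stage described in the abstract, the sketch below (not the authors' code) trains a feedforward neural network with a single hidden layer of 8 neurons and evaluates it with tenfold cross-validation, as in the reported fear/neutral experiment. The feature matrix, labels, and scikit-learn pipeline are placeholders and assumptions; the paper's actual feature extraction from the SEED-IV eye-blinking signals is not reproduced here.

```python
# Minimal sketch, assuming scikit-learn and precomputed blink-derived features.
import numpy as np
from sklearn.model_selection import cross_val_score, StratifiedKFold
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPClassifier

# Placeholder data: n_samples trials x n_features eye-blinking features
# (hypothetical; replace with features extracted from SEED-IV blink signals).
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 20))   # hypothetical feature matrix
y = rng.integers(0, 2, size=200)     # 0 = neutral, 1 = fear (hypothetical labels)

# Feedforward NN with a hidden layer size of 8, as reported in the abstract.
clf = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0),
)

# Tenfold cross-validation of binary fear/neutral discrimination accuracy.
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
scores = cross_val_score(clf, X, y, cv=cv, scoring="accuracy")
print(f"10-fold accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```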

Availability of data and material

In this article, the SEED-IV database [39] was evaluated; it is freely accessible at http://bcmi.sjtu.edu.cn/~seed/seed-iv.html. The codes have also been uploaded to GitHub.

References

  1. Al-gawwam S, Benaissa M (2018) Depression detection from eye blink features. In IEEE international symposium on signal processing and information technology (ISSPIT), Louisville, KY, USA, pp 388–392

  2. Alghowinem S, AlShehri M, Goecke R, Wagner M (2014) Exploring Eye activity as an indication of emotional states using an eye-tracking sensor. In Chen L, Kapoor S, Bhatia R (eds) Intelligent systems for science and information studies in computational intelligence, vol 542. Springer, Cham

  3. Alghowinem S, Goecke R, Cohn JF, Wagner M, Parker G, Breakspear M (2015) Cross-cultural detection of depression from nonverbal behavior. In Proceedings of the IEEE international conference on automatic face and gesture detection, Ljubljana, Slovenia

  4. Becker H, Fleureau J, Guillotel P, Wendling F, Merlet I, Albera L (2020) Emotion recognition based on high-resolution EEG recordings and reconstructed brain sources. IEEE Trans Affect Comput 11(2):244–257

  5. Bek J, Poliakoff E, Lander K (2020) Measuring emotion recognition by people with Parkinson’s disease using eye-tracking with dynamic facial expressions. J Neurosci Methods 331:108524

  6. Bentivoglio AR, Bressman SB, Cassetta E, Carretta D, Tonali P, Albanese A (1997) Analysis of blink rate patterns in normal subjects. Mov Disord 12(6):1028–1034

  7. Black MH, Chen NTM, Iyer KK et al (2017) Mechanisms of facial emotion recognition in autism spectrum disorders: insights from eye tracking and electroencephalography. Neurosci Biobehav Rev 80:488–515

  8. Cohn JF, Xiao J, Moriyama T et al (2003) Automatic recognition of eye blinking in spontaneously occurring behavior. Behav Res Methods Instrum Comput 35:420–428

  9. Ekman P (1973) Darwin and facial expression: a century of research in review. Academic Press, New York

  10. Goshvarpour A, Goshvarpour A (2020) A novel approach for EEG electrode selection in automated emotion recognition based on lagged Poincare’s indices and sLORETA. Cogn Comput 12:602–618

  11. Goshvarpour A, Goshvarpour A (2019) EEG spectral powers and source localization in depressing, sad, and fun music videos focusing on gender differences. Cogn Neurodyn 13(2):161–173

  12. Goshvarpour A, Abbasi A, Goshvarpour A (2016) Combination of sLORETA and nonlinear coupling for emotional EEG source localization. Nonlinear Dyn Psychol Life Sci 20(3):353–368

  13. Goshvarpour A, Abbasi A, Goshvarpour A (2017) Fusion of heart rate variability and pulse rate variability for emotion recognition using lagged poincare plots. Australas Phys Eng Sci Med 40(3):617–629

  14. Goshvarpour A, Abbasi A, Goshvarpour A (2017) Indices from lagged poincare plots of heart rate variability: an efficient nonlinear tool for emotion discrimination. Australas Phys Eng Sci Med 40(2):277–287

  15. Goshvarpour A, Abbasi A, Goshvarpour A (2017) Do men and women have different ECG responses to sad pictures? Biomed Signal Process Control 38:67–73

  16. Goshvarpour A, Abbasi A, Goshvarpour A (2017) An accurate emotion recognition system using ECG and GSR signals and matching pursuit method. Biomed J 40:355–368

  17. Goshvarpour A, Abbasi A, Goshvarpour A, Daneshvar S (2017) Discrimination between different emotional states based on the chaotic behavior of galvanic skin responses. SIViP 11(7):1347–1355

  18. Goshvarpour A, Goshvarpour A (2020) The potential of photoplethysmogram and galvanic skin response in emotion recognition using nonlinear features. Phys Eng Sci Med 43:119–134

  19. Goshvarpour A, Goshvarpour A (2018) Poincaré’s section analysis for PPG-based automatic emotion recognition. Chaos Solitons Fractals 114:400–407

  20. Goshvarpour A, Goshvarpour A (2020) Evaluation of novel entropy-based complex wavelet sub-bands measures of PPG in an emotion recognition system. J Med Biol Eng 40:451–461

  21. Guo J, Zhou R, Zhao L, Lu B (2019) Multimodal emotion recognition from eye image, eye movement and EEG using deep neural networks. In Annu Int Conf IEEE Eng Med Biol Soc, Berlin, Germany, pp 3071–3074

  22. Gruebler A, Suzuki K (2014) Design of a wearable device for reading positive expressions from facial EMG signals. IEEE Trans Affect Comput 5(3):227–237

  23. Higuchi T (1988) Approach to an irregular time series on the basis of the fractal theory. Physica D 31(2):277–283

  24. Hsu YL, Wang JS, Chiang WC, Hung CH (2020) Automatic ECG-based emotion recognition in music listening. IEEE Trans Affect Comput 11(1):85–99

  25. Kowler E, Anderson E, Dosher B, Blaser E (1995) The role of attention in the programming of saccades. Vision Res 35(13):1897–1916

  26. Lipton RB, Levin S, Holzman PS (1980) Horizontal and vertical pursuit eye movements, the oculocephalic reflex, and the functional psychoses. Psychiatry Res 3(2):193–203

  27. Lu Y, Zheng WL, Li B, Lu BL (2015) Combining eye movements and EEG to enhance emotion recognition. In Proceedings in international joint conference on artificial intelligence, Buenos Aires, Argentina, pp 1170–1176

  28. Lamba PS, Virmani D (2018) Information retrieval from emotions and eye blinks with help of sensor nodes. Int J Electr Comput Eng 8(4):2433–2441

  29. Mackintosh J, Kumar R, Kitamura T (1983) Blink rate in psychiatric illness. Br J Psychiatry 143(1):55–57

  30. Maffei A, Angrilli A (2019) Spontaneous blink rate as an index of attention and emotion during film clips viewing. Physiol Behav 204:256–263

  31. McMonnies CW (2010) Blinking mechanisms. In Dartt DA (ed) Encyclopedia of the Eye. Academic Press, New York, pp 202–208

  32. Nardelli M, Valenza G, Greco A, Lanata A, Scilingo EP (2015) Recognizing emotions induced by affective sounds through heart rate variability. IEEE Trans Affect Comput 6(4):385–394

  33. Partala T, Jokiniemi M, Surakka V (2000) Pupillary responses to emotionally provocative stimuli. In Proceedings of the 2000 symposium on Eye tracking research & applications (ETRA '00). Association for Computing Machinery, New York, NY, USA, pp 123–129

  34. Singh MI, Singh M (2017) Development of a real time emotion classifier based on evoked EEG. Biocybern Biomed Eng 37(3):498–509

  35. Smith ML, Cottrell GW, Gosselin F, Schyns PG (2005) Transmitting and decoding facial expressions. Psychol Sci 16(3):184–189

  36. Schmidtmann G, Logan AJ, Carbon CC, Loong JT, Gold I (2020) In the blink of an eye: Reading mental states from briefly presented eye regions. i-Perception 11(5):1–18

  37. Schurgin MW, Nelson J, Iida S, Ohira H, Chiao JY, Franconeri SL (2014) Eye movements during emotion recognition in faces. J Vis 14:14

  38. Zheng WL, Dong BN, Lu BL (2014) Multimodal emotion recognition using EEG and eye tracking data. Annu Int Conf IEEE Eng Med Biol Soc 2014:5040–5043

  39. Zheng WL, Liu W, Lu Y, Lu BL, Cichocki A (2019) EmotionMeter: a multimodal framework for recognizing human emotions. IEEE Trans Cybern 49(3):1110–1122

Funding

This research did not receive any specific grant from funding agencies in the public, commercial, or not-for-profit sectors.

Author information

Authors and Affiliations

Authors

Contributions

Both authors were equally involved in all sections of the article.

Corresponding author

Correspondence to Ateke Goshvarpour.

Ethics declarations

Ethics approval

This article examined the SEED-IV database [39] which is freely available in the public domain. This article does not contain any studies with human participants performed by any of the authors.

Informed consent

Informed consent was obtained from all individual participants included in the study [39].

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Goshvarpour, A., Goshvarpour, A. Eye-blinking analysis as a marker of emotional states. Multimed Tools Appl 80, 33727–33746 (2021). https://doi.org/10.1007/s11042-021-11304-1
