Human Emotion Recognition Using an EEG Cloud Computing Platform

  • Huimin Lu
  • Mei Wang
  • Arun Kumar Sangaiah


The wearable smart helmet is a useful tool for monitoring the status of miners in the mining industry. However, there has been little research on human emotion recognition in extreme environments. To the best of our knowledge, this paper is the first to describe the rule governing changes in human anxiety and to propose a cloud computing platform for detecting human emotions using brain-computer interface (BCI) devices. First, an emotion-evoking paradigm is designed to identify the brain area in which the emotion feature is most evident. Second, the correct electrode position for collecting negative-emotion electroencephalograph (EEG) signals is determined according to the international 10–20 system of electrode placement. Third, a fusion algorithm is proposed that evaluates a person's anxiety level from the θ, α, and β rhythms of the EEG. Finally, a smart helmet system is designed to collect the wearer's state, including the mental parameters of anxiety level, fatigue level, and concentration level, together with the environmental parameters of the coal mine. Experiments demonstrate that Fp2 is the best electrode position for obtaining the anxiety-level parameter, that the most visible EEG changes appear within the first 2 s after stimulation, that the amplitude of the θ rhythm increases most significantly in the negative emotional state, and that the proposed fusion algorithm accurately measures negative emotional change.
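The paper's exact fusion formula is not reproduced here. As a minimal sketch of the approach the abstract describes, an anxiety index can be computed from the relative θ, α, and β band powers of a single channel (e.g. Fp2). The band limits, the fusion weights, and the helper names below are illustrative assumptions, not the authors' values.

```python
import numpy as np
from scipy.signal import welch

# Hypothetical band limits in Hz; the paper uses the theta, alpha, beta rhythms.
BANDS = {"theta": (4.0, 8.0), "alpha": (8.0, 13.0), "beta": (13.0, 30.0)}

def band_power(signal, fs, lo, hi):
    """Summed Welch power-spectral-density estimate within [lo, hi) Hz."""
    freqs, psd = welch(signal, fs=fs, nperseg=min(len(signal), 2 * fs))
    mask = (freqs >= lo) & (freqs < hi)
    return psd[mask].sum()

def anxiety_index(eeg, fs=256, weights=(0.5, -0.3, 0.2)):
    """Fuse relative theta/alpha/beta powers into one scalar.

    The weights are illustrative assumptions: theta amplitude rises most
    under negative emotion (per the paper's finding), so theta receives
    the largest positive weight.
    """
    powers = np.array([band_power(eeg, fs, lo, hi) for lo, hi in BANDS.values()])
    rel = powers / powers.sum()          # relative band powers, sum to 1
    return float(np.dot(weights, rel))

# Example: 2 s of simulated single-channel (Fp2) data at 256 Hz -- the
# window in which the paper reports the most visible post-stimulus changes.
rng = np.random.default_rng(0)
idx = anxiety_index(rng.standard_normal(512), fs=256)
```

Because the relative powers are non-negative and sum to one, the index is bounded by the smallest and largest weights, which keeps it easy to threshold on the cloud side.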


Keywords: Emotion recognition · EEG · Cloud computing · Internet of Things



This work was supported by the Leading Initiative for Excellent Young Researchers (LEADER) of the Ministry of Education, Culture, Sports, Science and Technology, Japan (16809746); a Grant-in-Aid for Scientific Research from JSPS (17K14694); the Research Fund of the State Key Laboratory of Marine Geology at Tongji University (MGK1608); the Research Fund of the State Key Laboratory of Ocean Engineering at Shanghai Jiaotong University (1510); the Research Fund of The Telecommunications Advancement Foundation; the Fundamental Research Developing Association for Shipbuilding and Offshore; the Strengthening Research Support Project of the Kyushu Institute of Technology; the China National Natural Science Foundation under Grant 61702553; and in part by the MOE (Ministry of Education in China) Project of Humanities and Social Sciences under Grant 17YJCZH252.



Copyright information

© Springer Science+Business Media, LLC, part of Springer Nature 2018

Authors and Affiliations

  1. Kyushu Institute of Technology, Kitakyushu, Japan
  2. Xi’an University of Science and Technology, Xi’an, China
  3. VIT University, Vellore, India
