Using Electroencephalograms to Interpret and Monitor the Emotions
Detecting human emotions in real time has recently become an important issue in Artificial Intelligence (AI). The volume of research on emotional facial expressions, the effect of emotion on heart rate, eye movement, and the evolution of emotions over time shows the interest in this topic. This paper presents a method for observing a person's emotional evolution (sequence of emotions) based on activity in different parts of the brain. An Emotiv EPOC headset collects the participant's electroencephalogram (EEG) data, from which arousal and valence are calculated. After the system is trained on the headset's output data, noise and brain signals unrelated to emotion are removed by two levels of filtering. Finally, mapping the result (arousal and valence) onto the two-dimensional circumplex model yields the participant's real-time emotional evolution. This real-time emotional evolution reveals all the peaks of positive and negative feeling; moreover, analyzing the EEG data makes it possible to recognize the general emotions, i.e., the participant's strongest habitual feelings. Comparing unexpected reactions, the time spent in the general emotion, and the feelings at the peaks gives us a tool for observing a person's health and, on the other hand, an instrument for measuring people's mood in response to a video game, news item, or advertisement.
Keywords: Artificial intelligence · Emotion · Sentiment analysis · Virtual reality · Arousal · Valence · Opinion mining
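The final step of the method maps each (valence, arousal) pair onto the two-dimensional circumplex model of affect. A minimal sketch of that mapping is shown below; it is not the authors' implementation, and the quadrant labels, the [-1, 1] normalization, and the sample values are illustrative assumptions.

```python
def circumplex_quadrant(valence: float, arousal: float) -> str:
    """Return a coarse emotion label for a point in valence-arousal space.

    Assumes both inputs are normalized to [-1, 1]; the quadrant labels
    are illustrative, not taken from the paper.
    """
    if valence >= 0 and arousal >= 0:
        return "excited/happy"    # positive valence, high arousal
    if valence < 0 and arousal >= 0:
        return "angry/stressed"   # negative valence, high arousal
    if valence < 0:
        return "sad/bored"        # negative valence, low arousal
    return "calm/relaxed"         # positive valence, low arousal


# A sequence of samples then traces an emotional evolution over time:
samples = [(0.6, 0.8), (-0.4, 0.7), (0.5, -0.3)]
evolution = [circumplex_quadrant(v, a) for v, a in samples]
# evolution == ["excited/happy", "angry/stressed", "calm/relaxed"]
```

In practice the valence and arousal values would come from the filtered EEG features rather than hard-coded samples, but the quadrant lookup itself stays this simple.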