Application of Annotation Smoothing for Subject-Independent Emotion Recognition Based on Electroencephalogram

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 10004)

Abstract

In constructing computational models to recognize emotional states, reporting emotion continuously in time is essential, based on the assumption that a human's emotional response to a given stimulus can vary over time. However, existing methods for annotating emotion in a temporally continuous fashion face several challenges, so manipulating the annotated emotion prior to labeling training samples is necessary. In this work, we present an early attempt to process emotion annotations in arousal-valence space by applying three signal filtering techniques to smooth the annotation data: the moving average filter, the Savitzky-Golay filter, and the median filter. We conducted emotion recognition experiments on music listening tasks using brainwave signals recorded with an electroencephalogram (EEG). The smoothed annotation data were used to label features extracted from the EEG signals, which then served to train emotion recognizers based on classification and regression techniques. Our empirical results indicate the potential of the moving average filter to increase the performance of emotion recognition evaluated in a subject-independent fashion.
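The three smoothing techniques named in the abstract are standard signal filters. As a minimal sketch (not the authors' implementation; the synthetic annotation trace and the window length of 11 samples are assumptions made for illustration), each can be applied to a noisy continuous arousal or valence annotation in Python using NumPy and SciPy:

```python
import numpy as np
from scipy.signal import savgol_filter, medfilt

def moving_average(x, window):
    """Smooth x by convolving with a uniform (boxcar) kernel of odd length."""
    kernel = np.ones(window) / window
    return np.convolve(x, kernel, mode="same")

# Hypothetical continuous annotation: a slow arousal trend plus rater noise.
rng = np.random.default_rng(seed=0)
t = np.linspace(0.0, 10.0, 200)
annotation = np.sin(0.5 * t) + 0.3 * rng.standard_normal(t.size)

window = 11  # assumed window length; the paper's setting may differ
smoothed_ma = moving_average(annotation, window)                            # moving average
smoothed_sg = savgol_filter(annotation, window_length=window, polyorder=2)  # Savitzky-Golay
smoothed_med = medfilt(annotation, kernel_size=window)                      # median
```

All three filters preserve the length of the annotation series, so the smoothed values can label EEG feature windows one-to-one. The moving average attenuates high-frequency rater noise most aggressively, while the Savitzky-Golay filter better preserves local peaks and the median filter is robust to isolated annotation spikes.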


Notes

  1. http://www.mathworks.com/products/signal/.

References

  • Candra, H., Yuwono, M., Chai, R., Handojoseno, A., Elamvazuthi, I., Nguyen, H., Su, S.: Investigation of window size in classification of EEG-emotion signal with wavelet entropy and support vector machine. In: Proceedings of the 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, pp. 7250–7253 (2015)

  • Cowie, R., Douglas-Cowie, E., Savvidou, S., McMahon, E., Sawey, M., Schröder, M.: Feeltrace: an instrument for recording perceived emotion in real time. In: Proceedings of the ISCA Workshop on Speech and Emotion, pp. 19–24 (2000)

  • Delorme, A., Mullen, T., Kothe, C., Acar, Z.A., Bigdely-Shamlo, N., Vankov, A., Makeig, S.: EEGLAB, SIFT, NFT, BCILAB, and ERICA: new tools for advanced EEG processing. Comput. Intell. Neurosci. (2011)

  • Gunes, H., Schuller, B.: Categorical and dimensional affect analysis in continuous input: current trends and future directions. Image Vis. Comput. 31(2), 120–136 (2013)

  • Higuchi, T.: Approach to an irregular time series on the basis of the fractal theory. Physica D 31(2), 277–283 (1988)

  • Kim, M.K., Kim, M., Oh, E., Kim, S.P.: A review on the computational methods for emotional state estimation from the human EEG. Comput. Math. Methods Med. (2013)

  • Koelsch, S.: Brain correlates of music-evoked emotions. Nat. Rev. Neurosci. 15(3), 170–180 (2014)

  • Koelstra, S., Muhl, C., Soleymani, M., Lee, J.S., Yazdani, A., Ebrahimi, T., Pun, T., Nijholt, A., Patras, I.: DEAP: a database for emotion analysis using physiological signals. IEEE Trans. Affect. Comput. 3(1), 18–31 (2012)

  • Lin, Y.P., Yang, Y.H., Jung, T.P.: Fusion of electroencephalogram dynamics and musical contents for estimating emotional responses in music listening. Front. Neurosci. 8(94) (2014)

  • Mariooryad, S., Busso, C.: Analysis and compensation of the reaction lag of evaluators in continuous emotional annotations. In: Proceedings of the 5th Humaine Association Conference on Affective Computing and Intelligent Interaction, pp. 85–90 (2013)

  • Metallinou, A., Narayanan, S.: Annotation and processing of continuous emotional attributes: challenges and opportunities. In: Proceedings of the 10th IEEE International Conference and Workshops on Automatic Face and Gesture Recognition, pp. 1–8 (2013)

  • Russell, J.A.: A circumplex model of affect. J. Pers. Soc. Psychol. 39(6), 1161–1178 (1980)

  • Savitzky, A., Golay, M.J.E.: Smoothing and differentiation of data by simplified least squares procedures. Anal. Chem. 36(8), 1627–1639 (1964)

  • Soleymani, M., Asghari-Esfeden, S., Fu, Y., Pantic, M.: Analysis of EEG signals and facial expressions for continuous emotion detection. IEEE Trans. Affect. Comput. 7(1), 17–28 (2016)

  • Soleymani, M., Lichtenauer, J., Pun, T., Pantic, M.: A multimodal database for affect recognition and implicit tagging. IEEE Trans. Affect. Comput. 3(1), 42–55 (2012)

  • Sourina, O., Liu, Y., Nguyen, M.K.: Real-time EEG-based emotion recognition for music therapy. J. Multimodal User Interfaces 5(1–2), 27–35 (2012)

  • Sourina, O., Wang, Q., Liu, Y., Nguyen, M.K.: A real-time fractal-based brain state recognition from EEG and its applications. In: Babiloni, F., Fred, A.L.N., Filipe, J., Gamboa, H. (eds.) Proceedings of BIOSIGNALS, pp. 82–90 (2011)

  • Thammasan, N., Moriyama, K., Fukui, K., Numao, M.: Continuous music-emotion recognition based on electroencephalogram. IEICE Trans. Inf. Syst. E99-D(4), 1234–1241 (2016)


Acknowledgment

This research is partially supported by the Center of Innovation Program from Japan Science and Technology Agency (JST), JSPS KAKENHI Grant Number 25540101, and the Management Expenses Grants for National Universities Corporations from the Ministry of Education, Culture, Sports, Science and Technology of Japan (MEXT).

Author information

Corresponding author

Correspondence to Nattapong Thammasan.

Copyright information

© 2017 Springer International Publishing AG

About this paper

Cite this paper

Thammasan, N., Fukui, Ki., Numao, M. (2017). Application of Annotation Smoothing for Subject-Independent Emotion Recognition Based on Electroencephalogram. In: Numao, M., Theeramunkong, T., Supnithi, T., Ketcham, M., Hnoohom, N., Pramkeaw, P. (eds) Trends in Artificial Intelligence: PRICAI 2016 Workshops. PRICAI 2016. Lecture Notes in Computer Science, vol 10004. Springer, Cham. https://doi.org/10.1007/978-3-319-60675-0_10

  • DOI: https://doi.org/10.1007/978-3-319-60675-0_10

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-60674-3

  • Online ISBN: 978-3-319-60675-0

  • eBook Packages: Computer Science, Computer Science (R0)
