Cognitive Computation, Volume 6, Issue 2, pp 241–252

Classification of Music-Induced Emotions Based on Information Fusion of Forehead Biosignals and Electrocardiogram

  • Mohsen Naji
  • Mohammad Firoozabadi
  • Parviz Azadfallah


Abstract

Emotion recognition systems have been developed to assess human emotional states during different experiences. In this paper, an approach is proposed for recognizing music-induced emotions through the fusion of three-channel forehead biosignals (the left temporalis, frontalis, and right temporalis channels) and an electrocardiogram. The classification of four emotional states in an arousal–valence space (positive valence/low arousal, positive valence/high arousal, negative valence/high arousal, and negative valence/low arousal) was performed by employing two parallel support vector machines as arousal and valence classifiers. The inputs of the classifiers were obtained by applying a fuzzy-rough model feature evaluation criterion and a sequential forward floating selection algorithm. An average classification accuracy of 88.78% was achieved, corresponding to an average valence classification accuracy of 94.91% and an average arousal classification accuracy of 93.63%. The proposed emotion recognition system may be useful for interactive multimedia applications or music therapy.
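The two-parallel-classifier scheme described in the abstract can be sketched as follows. This is a minimal illustrative reconstruction, not the authors' implementation: the feature vectors, kernel choice, and labels here are placeholders, and the fuzzy-rough/SFFS feature-selection stage is omitted. One SVM predicts valence (positive/negative), a second predicts arousal (high/low), and the pair of binary outputs selects one of the four quadrants of the arousal–valence space.

```python
# Hypothetical sketch of a two-parallel-SVM arousal-valence classifier.
# Features and labels are synthetic stand-ins for the paper's fused
# forehead-biosignal and ECG features.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 6))           # toy feature vectors
y_valence = rng.integers(0, 2, 40)     # 1 = positive valence, 0 = negative
y_arousal = rng.integers(0, 2, 40)     # 1 = high arousal, 0 = low

# Two independent binary SVMs, one per affective dimension.
valence_clf = make_pipeline(StandardScaler(), SVC(kernel="rbf")).fit(X, y_valence)
arousal_clf = make_pipeline(StandardScaler(), SVC(kernel="rbf")).fit(X, y_arousal)

QUADRANTS = {
    (1, 0): "positive valence / low arousal",
    (1, 1): "positive valence / high arousal",
    (0, 1): "negative valence / high arousal",
    (0, 0): "negative valence / low arousal",
}

def classify(x):
    """Combine the two binary predictions into one of four emotion quadrants."""
    v = int(valence_clf.predict(x.reshape(1, -1))[0])
    a = int(arousal_clf.predict(x.reshape(1, -1))[0])
    return QUADRANTS[(v, a)]

print(classify(X[0]))
```

Decomposing the four-class problem into two binary problems along the arousal and valence axes is a common design choice in dimensional emotion recognition, since each axis can be given its own feature subset and classifier.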


Keywords: Emotion classification · Forehead biosignals · ECG · Arousal · Valence



Acknowledgments

We gratefully acknowledge the assistance of Ms. Atena Bajoulvand in collecting the data from the female subjects. The authors would also like to thank the anonymous reviewers for their insightful comments.



Copyright information

© Springer Science+Business Media New York 2013

Authors and Affiliations

  • Mohsen Naji (1)
  • Mohammad Firoozabadi (2)
  • Parviz Azadfallah (3)
  1. Department of Biomedical Engineering, Science and Research Branch, Islamic Azad University, Tehran, Iran
  2. Department of Medical Physics, Tarbiat Modares University, Tehran, Iran
  3. Department of Psychology, Tarbiat Modares University, Tehran, Iran
