
Automatic song indexing by predicting listener’s emotion using EEG correlates and multi-neural networks

  • 1183: Multimedia Processing to Tackle the Dark Side of Social Life
  • Published in: Multimedia Tools and Applications

Abstract

Song indexing using emotions is an interesting research area, as it enables authentic analysis based on the listener's emotions. In this work, we propose a multi-neural-network architecture with a learning-accuracy-based weighting algorithm that classifies listeners' emotions into three emotion categories from self-rated valence, arousal, and dominance values. This classification data is then used to identify the major emotion induced by a song, which is compared with the tags attached to that song on the last.fm website. The training and testing data for the multi-neural-network model are taken from the DEAP dataset, and we obtained 85% accuracy in indexing the songs.
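The abstract does not spell out the learning-accuracy-based weighting algorithm, but the general idea of combining several networks' predictions with weights proportional to each network's training accuracy can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name, the per-scale networks, and all numeric values are hypothetical.

```python
import numpy as np

def weighted_ensemble_predict(probs_per_net, train_accuracies):
    """Combine per-network class probabilities using weights
    proportional to each network's learning (training) accuracy."""
    w = np.asarray(train_accuracies, dtype=float)
    w = w / w.sum()                        # normalise weights to sum to 1
    probs = np.asarray(probs_per_net)      # shape: (n_nets, n_classes)
    combined = (w[:, None] * probs).sum(axis=0)
    return int(np.argmax(combined)), combined

# Hypothetical example: three networks (say, one per self-rated scale:
# valence, arousal, dominance) voting over three emotion categories.
net_probs = [
    [0.6, 0.3, 0.1],   # network trained on valence features
    [0.2, 0.5, 0.3],   # network trained on arousal features
    [0.4, 0.4, 0.2],   # network trained on dominance features
]
train_acc = [0.90, 0.70, 0.80]

label, combined = weighted_ensemble_predict(net_probs, train_acc)
# The more accurate networks pull the combined vote toward their choice.
```

Under this scheme, a network that learned its task well dominates the vote, while a weaker network still contributes; the predicted label for a whole song could then be taken as the majority emotion over its EEG segments before comparison with last.fm tags.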



Author information

Corresponding author

Correspondence to Maheshkumar H. Kolekar.

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Gonegandla, P., Kolekar, M.H. Automatic song indexing by predicting listener’s emotion using EEG correlates and multi-neural networks. Multimed Tools Appl 81, 27137–27147 (2022). https://doi.org/10.1007/s11042-021-11879-9

