
Movie Identification from Electroencephalography Response Using Convolutional Neural Network


Part of the Lecture Notes in Computer Science book series (LNAI, volume 12960)


Visual, auditory, and emotional perception in humans has been an active research topic over the past few decades. Electroencephalography (EEG) signals are one way to measure human brain activity, and it has been shown that distinct brain networks are engaged by different kinds of emotional stimuli. In this paper, we present a deep learning architecture based on a Convolutional Neural Network (CNN) for identifying movies from EEG responses. The dataset includes nine movie clips spanning different emotional states, with EEG time-series data collected from 20 participants. Given a one-second EEG response from a participant, the model predicts the corresponding movie ID. We also discuss the pre-processing steps used for data cleaning and data augmentation. All participants appear in both the training and test data, and we obtained 80.22% test accuracy on this movie-classification task. We also evaluated cross-participant testing with the same model; performance was poor on unseen participants. Our results give insight into the identifiable patterns the brain creates during audiovisual perception.
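The one-second windowing described above can be sketched as a simple epoching step: a continuous multi-channel EEG recording is cut into fixed-length windows, each labeled with the movie ID of the clip being watched, and overlapping windows serve as a basic form of data augmentation. The sampling rate, channel count, and overlap below are hypothetical illustration values, not the paper's actual parameters.

```python
import numpy as np

def segment_eeg(recording, fs, win_sec=1.0, overlap=0.5):
    """Split a (channels, samples) EEG recording into fixed-length windows.

    Overlapping windows act as simple data augmentation: each start
    position is advanced by win * (1 - overlap) samples.
    Returns an array of shape (n_windows, channels, win_samples).
    """
    win = int(win_sec * fs)
    step = max(1, int(win * (1.0 - overlap)))
    n_ch, n_samp = recording.shape
    starts = range(0, n_samp - win + 1, step)
    return np.stack([recording[:, s:s + win] for s in starts])

# Hypothetical example: a 32-channel, 5-second recording at 128 Hz.
rng = np.random.default_rng(0)
eeg = rng.standard_normal((32, 5 * 128))

windows = segment_eeg(eeg, fs=128)          # 50% overlap -> 9 windows
print(windows.shape)                        # (9, 32, 128)
```

Each window would then be paired with its movie ID to form one training example for the CNN classifier.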


  • EEG
  • Classification
  • CNN
  • Neural entrainment
  • Brain signals




Corresponding author

Correspondence to Krishna Prasad Miyapuram.



Copyright information

© 2021 Springer Nature Switzerland AG

About this paper


Cite this paper

Sonawane, D., Pandey, P., Mukhopadhyay, D., Miyapuram, K.P. (2021). Movie Identification from Electroencephalography Response Using Convolutional Neural Network. In: Mahmud, M., Kaiser, M.S., Vassanelli, S., Dai, Q., Zhong, N. (eds) Brain Informatics. BI 2021. Lecture Notes in Computer Science, vol. 12960. Springer, Cham.


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-86992-2

  • Online ISBN: 978-3-030-86993-9

  • eBook Packages: Computer Science, Computer Science (R0)