Abstract
Visual, auditory, and emotional perception in humans has been an active research topic over the past few decades. Electroencephalography (EEG) signals are one way to represent human brain activity. It has been shown that distinct brain networks correspond to the processing of different emotional stimuli. In this paper, we present a deep learning architecture based on a Convolutional Neural Network (CNN) for identifying movies from EEG responses. The dataset comprises nine movie clips spanning different emotional states, with EEG time series recorded from 20 participants. Given a one-second EEG response from a particular participant, we predict the corresponding movie ID. We also discuss the pre-processing steps used for data cleaning and data augmentation. With all participants represented in both the training and test data, we obtained 80.22% test accuracy on this movie classification task. We also evaluated the same model across participants, and its performance was poor for unseen participants. Our results give insight into the formation of identifiable patterns in the brain during audiovisual perception.
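The pipeline described above slices each participant's continuous EEG recording into one-second windows, each labelled with the movie being watched; overlapping windows can serve as the data augmentation step the abstract mentions. The sketch below illustrates this segmentation step only, under assumed details not given in the abstract: the function name `segment_eeg`, a sampling rate of 128 Hz, and a 50% overlap are all hypothetical choices for illustration, not the paper's actual parameters.

```python
import numpy as np

def segment_eeg(eeg, movie_id, fs=128, overlap=0.5):
    """Slice a continuous multichannel EEG recording (channels x samples)
    into one-second windows, each labelled with the stimulus movie ID.

    Overlapping windows (overlap > 0) augment the number of training
    examples drawn from the same recording.
    """
    win = fs                                  # one second of samples
    step = max(1, int(win * (1.0 - overlap)))  # hop between window starts
    n_channels, n_samples = eeg.shape
    windows, labels = [], []
    for start in range(0, n_samples - win + 1, step):
        windows.append(eeg[:, start:start + win])
        labels.append(movie_id)
    # Shape: (n_windows, n_channels, fs) -- one CNN input per window
    return np.stack(windows), np.array(labels)

# Example: a 32-channel, 10-second recording labelled as movie 3
X, y = segment_eeg(np.random.randn(32, 10 * 128), movie_id=3)
print(X.shape)  # (19, 32, 128) with 50% overlap
```

Each resulting window is one training example for the classifier, so a single recording yields many labelled inputs; the trade-off is that overlapping windows from the same clip are correlated, which is one reason within-participant accuracy can far exceed cross-participant accuracy.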
Keywords
- EEG
- Classification
- CNN
- Neural entrainment
- Brain signals
© 2021 Springer Nature Switzerland AG
Cite this paper
Sonawane, D., Pandey, P., Mukhopadhyay, D., Miyapuram, K.P. (2021). Movie Identification from Electroencephalography Response Using Convolutional Neural Network. In: Mahmud, M., Kaiser, M.S., Vassanelli, S., Dai, Q., Zhong, N. (eds.) Brain Informatics. BI 2021. Lecture Notes in Computer Science, vol. 12960. Springer, Cham. https://doi.org/10.1007/978-3-030-86993-9_25
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-86992-2
Online ISBN: 978-3-030-86993-9