Abstract
Classifying patterns of brain activity is an important tool in neuroengineering research for understanding the brain, developing neurodiagnostics, and designing closed-loop neural interfaces. Scalp electroencephalography (EEG), by virtue of its noninvasiveness and relatively low cost, has long been used for neural signal classification, and researchers have applied a wide range of machine learning methods to it. Recently, deep learning has gained popularity because it can substantially improve classification performance in many domains while also revealing the features that drive classification. Deploying these promising techniques for EEG classification tasks is therefore a natural step. This book chapter aims to serve as a comprehensive reference source for both EEG and deep learning researchers interested in EEG-based deep learning studies. Potential pitfalls, challenges, and opportunities in applying deep learning to EEG data are discussed.
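To make the setting concrete, the sketch below shows the kind of compact convolutional classifier commonly applied to multi-channel EEG trials, in the spirit of EEGNet-style temporal-then-spatial convolutions. The architecture, input shape (22 channels, 500 time samples), and hyperparameters are illustrative assumptions, not a specific model from the literature.

```python
# Minimal sketch (illustrative assumptions only): a compact CNN for EEG trial
# classification with a temporal convolution followed by a spatial convolution
# across electrodes. Not a reproduction of any specific published network.
import torch
import torch.nn as nn


class TinyEEGCNN(nn.Module):
    def __init__(self, n_channels: int = 22, n_samples: int = 500, n_classes: int = 4):
        super().__init__()
        self.features = nn.Sequential(
            # Temporal convolution: learns frequency-selective filters along time.
            nn.Conv2d(1, 8, kernel_size=(1, 64), padding=(0, 32), bias=False),
            nn.BatchNorm2d(8),
            # Spatial convolution: mixes information across all EEG electrodes.
            nn.Conv2d(8, 16, kernel_size=(n_channels, 1), bias=False),
            nn.BatchNorm2d(16),
            nn.ELU(),
            nn.AvgPool2d(kernel_size=(1, 8)),
            nn.Dropout(0.5),
        )
        # Infer the flattened feature size with a dummy forward pass.
        with torch.no_grad():
            n_features = self.features(torch.zeros(1, 1, n_channels, n_samples)).numel()
        self.classifier = nn.Linear(n_features, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, time) -> add a singleton depth dimension for Conv2d.
        feats = self.features(x.unsqueeze(1))
        return self.classifier(feats.flatten(start_dim=1))


# Example: a batch of 4 trials, 22 channels, 500 samples (~2 s at 250 Hz).
logits = TinyEEGCNN()(torch.randn(4, 22, 500))
print(logits.shape)  # torch.Size([4, 4])
```

The temporal-then-spatial factorization keeps the parameter count small, which matters given the modest trial counts typical of EEG datasets.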
Keywords
- Deep learning
- Machine learning
- Neural networks
- Electroencephalography
- Neuroengineering
- Volitional processes
- External stimulation
- Affective computing
- Brain computer interfaces
- Deep learning interpretation
Abbreviations
- AD: Alzheimer's Disease
- AE: Autoencoder
- AMIGOS: Dataset for Affect, Personality, and Mood Research on Individuals and Groups
- BCI: Brain-Computer Interface
- BIDS: Brain Imaging Data Structure
- BLDA: Bayesian Linear Discriminant Analysis
- CCNN: Channel-wise CNN
- CNN: Convolutional Neural Network
- CNN-R: Residual CNN
- CSP: Common Spatial Pattern
- DBN: Deep Belief Network
- DBS: Deep Brain Stimulation
- DEAP: Database for Emotion Analysis using Physiological Signals
- DL: Deep Learning
- DSP: Digital Signal Processing
- EEG: Electroencephalography
- ELU: Exponential Linear Unit
- EMG: Electromyography
- EOG: Electrooculography
- ERN: Error-Related Negativity Response
- ERP: Event-Related Potential
- FFT: Fast Fourier Transform
- GRU: Gated Recurrent Unit
- HDCA: Hierarchical Discriminant Component Analysis
- ITR: Information Transfer Rate
- KMI: Kinesthetic Motor Imagery
- LDA: Linear Discriminant Analysis
- LRP: Layer-wise Relevance Propagation
- LSTM: Long Short-Term Memory
- LVQ: Learning Vector Quantization
- MASS: Montreal Archive of Sleep Studies
- MCI: Mild Cognitive Impairment
- MDM: Minimum Distance to Mean
- ML: Machine Learning
- MLP: Multi-Layer Perceptron
- MRCP: Movement-Related Cortical Potential
- NIH: National Institutes of Health
- NN: Neural Network
- PCA: Principal Component Analysis
- RBM: Restricted Boltzmann Machine
- RCNN: Recurrent CNN
- ReLU: Rectified Linear Unit
- REM: Rapid Eye Movement
- RNN: Recurrent Neural Network
- RSVP: Rapid Serial Visual Presentation
- SEED: SJTU Emotion EEG Dataset
- SELU: Scaled Exponential Linear Unit
- SMR: Sensorimotor Rhythms
- SNR: Signal-to-Noise Ratio
- SOTA: State of the Art
- SSVEP: Steady-State Visual Evoked Potential
- SVM: Support Vector Machine
- TCN: Temporal Convolutional Network
- TSNN: Two-Stream Neural Network
- VMI: Visual Motor Imagery
- WoS: Web of Science
Copyright information
© 2022 Springer Nature Singapore Pte Ltd.
About this entry
Cite this entry
Nakagome, S., Craik, A., Sujatha Ravindran, A., He, Y., Cruz-Garza, J.G., Contreras-Vidal, J.L. (2022). Deep Learning Methods for EEG Neural Classification. In: Thakor, N.V. (eds) Handbook of Neuroengineering. Springer, Singapore. https://doi.org/10.1007/978-981-15-2848-4_78-1
DOI: https://doi.org/10.1007/978-981-15-2848-4_78-1
Publisher Name: Springer, Singapore
Print ISBN: 978-981-15-2848-4
Online ISBN: 978-981-15-2848-4