Abstract
Brain-computer interfaces (BCIs) enable communication between the brain and a computer, and electroencephalography (EEG) has been widely used to implement BCIs because of its high temporal resolution and noninvasiveness. Recently, a tactile-based EEG task was introduced to overcome the current limitations of visual-based tasks, such as visual fatigue from sustained attention. However, the classification performance of tactile-based BCIs remains unsatisfactory for generating reliable control signals, so a novel classification approach is required. Here, we propose TSANet, a deep neural network that uses multibranch convolutional neural networks and a feature-attention mechanism to classify tactile selective attention (TSA) in a tactile-based BCI system. We tested TSANet under three evaluation conditions, namely, within-subject, leave-one-out, and cross-subject. TSANet achieved the highest classification performance compared with conventional deep neural network models under all evaluation conditions. Additionally, we show that TSANet extracts reasonable features for TSA by investigating the weights of its spatial filters. Our results demonstrate that TSANet has the potential to serve as an efficient end-to-end learning approach in tactile-based BCIs.
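The abstract describes a multibranch CNN with feature attention but does not specify layer sizes in this excerpt. As an illustration only, the general idea can be sketched as follows; all kernel lengths, filter counts, and the squeeze-and-excitation-style attention below are assumptions for exposition and may differ from the published TSANet.

```python
import torch
import torch.nn as nn

class MultiBranchAttentionSketch(nn.Module):
    """Illustrative multibranch CNN with feature attention for EEG trials.

    Input shape: (batch, 1, n_channels, n_samples). All layer sizes are
    hypothetical; the published TSANet architecture may differ.
    """

    def __init__(self, n_channels=32, n_samples=256, n_classes=2, n_filters=8):
        super().__init__()
        # Branches with different temporal kernel lengths capture
        # different temporal scales of the EEG signal.
        self.branches = nn.ModuleList()
        for k in (16, 32, 64):
            self.branches.append(nn.Sequential(
                nn.Conv2d(1, n_filters, (1, k), padding=(0, k // 2), bias=False),
                nn.BatchNorm2d(n_filters),
                # Depthwise spatial filter spanning all EEG electrodes
                nn.Conv2d(n_filters, n_filters, (n_channels, 1),
                          groups=n_filters, bias=False),
                nn.BatchNorm2d(n_filters),
                nn.ELU(),
                nn.AvgPool2d((1, 4)),
            ))
        feat = 3 * n_filters
        # Feature attention: squeeze-and-excitation-style gating that
        # reweights the concatenated branch feature maps.
        self.attn = nn.Sequential(
            nn.Linear(feat, feat // 4), nn.ReLU(),
            nn.Linear(feat // 4, feat), nn.Sigmoid(),
        )
        t_out = (n_samples + 1) // 4  # time length after padding and pooling
        self.classifier = nn.Linear(feat * t_out, n_classes)

    def forward(self, x):
        z = torch.cat([b(x) for b in self.branches], dim=1)  # (B, 3F, 1, T)
        w = self.attn(z.mean(dim=(2, 3)))                    # (B, 3F) gates
        z = z * w[:, :, None, None]                          # reweight feature maps
        return self.classifier(z.flatten(1))

x = torch.randn(4, 1, 32, 256)               # 4 trials, 32 electrodes, 256 samples
logits = MultiBranchAttentionSketch()(x)
print(logits.shape)                          # torch.Size([4, 2])
```

Training such a model per subject corresponds to the within-subject condition; holding out one subject's data entirely corresponds to the leave-one-out and cross-subject conditions described above.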
Availability of data and materials
All data and analysis code used in this study are available from the corresponding author upon reasonable request.
Acknowledgements
We would like to thank all participants for their participation in the study.
Funding
This work was supported by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) (Nos. NRF-2022R1A4A1023248 and RS-2023-00209794) and by Institute of Information and Communications Technology Planning & Evaluation (IITP) grants funded by the Korea government (Nos. 2017-0-00451 and 2019-0-01842).
Author information
Authors and Affiliations
Contributions
S.C.J. and S.A. designed the experimental paradigms, and S.A. collected the data. J.S.P. and H.J. performed the data analysis, and J.S.P., H.J., and S.A. wrote the original draft of the manuscript. All authors reviewed and revised the manuscript.
Corresponding author
Ethics declarations
Conflict of interest
All authors declare that they have no competing interests.
Ethics approval and consent to participate
All participants provided written informed consent. The study was approved by the Institutional Review Board of the Gwangju Institute of Science and Technology.
Consent for publication
All participants provided written informed consent for publication.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Supplementary Information
Below is the link to the electronic supplementary material.
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Jang, H., Park, J.S., Jun, S.C. et al. TSANet: multibranch attention deep neural network for classifying tactile selective attention in brain-computer interfaces. Biomed. Eng. Lett. 14, 45–55 (2024). https://doi.org/10.1007/s13534-023-00309-4