Abstract
Improving the information transfer rate is key to increasing the speed at which an event-related potential-based brain–computer interface outputs instructions. Our previous study designed a dual-stimuli interface that presents two different types of stimuli simultaneously to improve this speed. However, adding more stimuli to such an interface makes subjects susceptible to the "flanker effect," which decreases the accuracy of intention recognition. To achieve high recognition accuracy with many stimuli, this study proposes a dual-stimuli interface based on whole flash and local move (DS-WL), together with two stimulus-arrangement rules for inducing the brain signals. Twenty subjects participated in the experiment, and their signals were recognized by a back-propagation neural network classifier. The local move induces larger and later target signals, which helps discriminate the two kinds of stimuli; the arrangement rules reduce the N200 and P300 amplitudes of non-targets, which improves accuracy. This study demonstrates that the DS-WL, through its local move and arrangement rules, is an effective way to shorten the instruction output cycle and speed up instruction output.
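The information transfer rate the abstract aims to improve is conventionally computed with the Wolpaw formula, which combines the number of selectable targets, the recognition accuracy, and the instruction output cycle length. As a minimal sketch (the function name and example parameter values are illustrative, not taken from the paper):

```python
import math

def itr_bits_per_min(n_classes, accuracy, seconds_per_selection):
    """Wolpaw information transfer rate in bits/min.

    n_classes: number of selectable targets in the interface
    accuracy: recognition accuracy P, with 1/n_classes < P <= 1
    seconds_per_selection: length of one instruction output cycle
    """
    n, p = n_classes, accuracy
    bits = math.log2(n)
    if p < 1.0:
        # The P log2 P and (1 - P) terms vanish when P = 1
        bits += p * math.log2(p) + (1 - p) * math.log2((1 - p) / (n - 1))
    return bits * 60.0 / seconds_per_selection

# Example: a hypothetical 36-target interface at 90% accuracy,
# one selection every 10 s
rate = itr_bits_per_min(36, 0.9, 10.0)
```

The formula makes the abstract's trade-off explicit: adding stimuli raises `n_classes` (more bits per selection), but if the flanker effect lowers `accuracy`, the penalty terms can cancel the gain, which is why the DS-WL design targets accuracy as well as cycle length.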
Acknowledgements
The study was funded by the Natural Science Foundation of Hebei Province under Grant F2021202003, the Technology Nova of Hebei University of Technology under Grant JBKYXX2007, the State Key Laboratory of Reliability and Intelligence of Electrical Equipment under Grant EERI_OY2020004, and the National Natural Science Foundation of China under Grants 61806070 and 51977060.
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Springer Nature or its licensor holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Li, M., Wu, L., Lin, F. et al. Dual stimuli interface with logical division using local move stimuli. Cogn Neurodyn 17, 965–973 (2023). https://doi.org/10.1007/s11571-022-09878-z