An exploration of spatial auditory BCI paradigms with different sounds: music notes versus beeps
Visual brain–computer interfaces (BCIs) are not suitable for people who cannot reliably maintain their eye gaze. Since this group usually retains audition, an auditory BCI may be a good choice for them. In this paper, we explore two auditory stimulus patterns: (1) the high/low/medium (HLM) pattern, which uses beeps at three different frequencies with a symmetrical spatial distribution of speakers, and (2) the diatonic scale (DS) pattern, which uses six spoken stimuli, the notes solmizated as “do”, “re”, “mi”, “fa”, “sol” and “la”, each delivered from one of six spatially distributed speakers in a non-symmetrical layout. We thus compare a BCI paradigm using beeps with one using tones from the diatonic scale, with the stimuli spatially distributed in both cases, to determine which pattern yields higher accuracy. Although no significant differences are found between the ERPs, the HLM pattern performs better: the online accuracy achieved with the HLM pattern is significantly higher than that achieved with the DS pattern (p = 0.0028).
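The two stimulus layouts described above can be illustrated with a minimal sketch. The speaker labels, beep names, and the round-based presentation scheme below are assumptions for illustration, not the authors' exact configuration; the sketch only shows how six (or three paired) spatial auditory stimuli might be organized and shuffled for oddball-style presentation.

```python
import random

# DS pattern (assumed layout): six diatonic-scale notes, one per
# spatially distributed speaker.
DS_PATTERN = {
    "speaker_1": "do",
    "speaker_2": "re",
    "speaker_3": "mi",
    "speaker_4": "fa",
    "speaker_5": "sol",
    "speaker_6": "la",
}

# HLM pattern (assumed layout): three beep frequencies, each played from
# a symmetric left/right speaker pair.
HLM_PATTERN = {
    ("left_1", "right_1"): "high_beep",
    ("left_2", "right_2"): "low_beep",
    ("left_3", "right_3"): "medium_beep",
}

def oddball_round(stimuli, rng=random):
    """Return one presentation round: every stimulus exactly once,
    in random order, as is typical for P300 oddball paradigms."""
    order = list(stimuli)
    rng.shuffle(order)
    return order

round_1 = oddball_round(DS_PATTERN.values())
```

Each round presents every stimulus once in random order; the user attends to one target stimulus, and the elicited ERPs are used to infer which stimulus was attended.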
Keywords: Auditory BCI · Auditory pattern · Spatial cues · Diatonic scale
This work was supported in part by the National Natural Science Foundation of China under Grants 61573142, 61203127, 91420302 and 61305028, in part by the Shanghai Leading Academic Discipline Project (B504), and in part by the Fundamental Research Funds for the Central Universities (WG1414005, WH1314023, WH1516018).