Cognitive Neurodynamics, Volume 10, Issue 3, pp 201–209

An exploration of spatial auditory BCI paradigms with different sounds: music notes versus beeps

  • Minqiang Huang
  • Ian Daly
  • Jing Jin
  • Yu Zhang
  • Xingyu Wang
  • Andrzej Cichocki
Research Article


Visual brain–computer interfaces (BCIs) are not suitable for people who cannot reliably maintain their eye gaze. Because this group usually retains hearing, an auditory-based BCI may be a good choice for them. In this paper, we explore two auditory patterns: (1) a pattern utilizing symmetrical spatial cues with beeps at multiple frequencies [called the high/low/medium (HLM) pattern], and (2) a pattern utilizing non-symmetrical spatial cues with six tones derived from the diatonic scale [called the diatonic scale (DS) pattern]. The HLM pattern uses beeps at three different frequencies and has a symmetrical spatial distribution. The DS pattern uses six spoken stimuli, the six notes solmized as “do”, “re”, “mi”, “fa”, “sol” and “la”, derived from the diatonic scale and distributed across six spatially separated speakers. We thus compare a BCI paradigm using beeps with one using tones on the diatonic scale, when the stimuli are spatially distributed, to determine which auditory pattern yields higher accuracy. Although no significant differences are found between the ERPs, the HLM pattern performs better than the DS pattern: the online accuracy achieved with the HLM pattern is significantly higher than that achieved with the DS pattern (p = 0.0028).
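To make the stimulus design concrete, the following sketch generates pure-tone beeps of the kind the HLM pattern could use. The sampling rate, beep duration, and the three low/medium/high frequencies are all assumptions for illustration; the abstract does not state the actual values used in the study.

```python
import numpy as np

FS = 44100  # sampling rate in Hz (assumed; not given in the paper)

def make_beep(freq_hz, duration_s=0.1, fs=FS):
    """Generate a pure-tone beep with short linear fades to avoid clicks."""
    t = np.arange(int(fs * duration_s)) / fs
    tone = np.sin(2 * np.pi * freq_hz * t)
    ramp = int(0.005 * fs)  # 5 ms fade-in and fade-out
    env = np.ones_like(tone)
    env[:ramp] = np.linspace(0.0, 1.0, ramp)
    env[-ramp:] = np.linspace(1.0, 0.0, ramp)
    return tone * env

# Hypothetical low/medium/high frequencies for the HLM pattern
# (placeholder octave-spaced values, not the study's actual beeps).
beeps = {name: make_beep(f) for name, f in
         [("low", 440.0), ("medium", 880.0), ("high", 1760.0)]}
```

In a real paradigm each beep would additionally be routed to one of the symmetrically placed speakers, so that frequency and spatial location jointly identify the target stimulus.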


Keywords: Auditory BCI · Auditory pattern · Spatial cues · Diatonic scale



This work was supported in part by the National Natural Science Foundation of China under Grants 61573142, 61203127, 91420302 and 61305028, in part by the Shanghai Leading Academic Discipline Project under Grant B504, and in part by the Fundamental Research Funds for the Central Universities (WG1414005, WH1314023, WH1516018).



Copyright information

© Springer Science+Business Media Dordrecht 2016

Authors and Affiliations

  1. Key Laboratory of Advanced Control and Optimization for Chemical Processes, Ministry of Education, East China University of Science and Technology, Shanghai, People’s Republic of China
  2. Brain Embodiment Lab, School of Systems Engineering, University of Reading, Reading, UK
  3. Lab for Advanced Brain Signal Processing and BTCC, RIKEN BSI, Wako-shi, Japan
  4. Skolkovo Institute of Science and Technology, Moscow, Russia
