Abstract
During the past decade, brain–computer interfaces (BCIs) have developed rapidly, both technologically and in their range of applications. Most of these interfaces, however, rely on the visual modality. Only a few research groups have studied non-visual BCIs, based primarily on auditory and, occasionally, somatosensory signals. These non-visual approaches are especially useful for severely disabled patients with poor vision. From a broader perspective, multisensory BCIs may offer more versatile and user-friendly paradigms for control and feedback. This chapter describes current systems used in auditory and somatosensory BCI research, organized into four categories of noninvasive BCI paradigms: (1) P300 evoked potentials, (2) steady-state evoked potentials, (3) slow cortical potentials, and (4) mental tasks. Comparing visual and non-visual BCIs, we propose and discuss possible multisensory combinations, together with their pros and cons. We conclude by discussing potential future research directions for multisensory BCIs and related research questions.
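To make the first of these categories concrete, the short sketch below simulates an auditory oddball experiment and recovers a P300-like response by averaging time-locked epochs. This is a minimal illustration only, not the chapter's method: the simulated signals, sampling rate, single-channel setup, and 250–400 ms analysis window are all assumptions chosen for clarity.

```python
# Minimal sketch of P300-style oddball detection by epoch averaging.
# All signals are simulated; parameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
fs = 250                                   # sampling rate (Hz), assumed
n_trials, epoch_len = 200, int(0.8 * fs)   # 200 stimuli, 800 ms epochs

# Simulate single-trial EEG epochs: background noise everywhere, plus a
# positive deflection ~300 ms after stimulus onset on rare ("target") trials.
is_target = rng.random(n_trials) < 0.2     # ~20% rare stimuli (oddballs)
t = np.arange(epoch_len) / fs
p300 = 4.0 * np.exp(-((t - 0.3) ** 2) / (2 * 0.05 ** 2))  # bump at 300 ms
epochs = rng.normal(0.0, 10.0, size=(n_trials, epoch_len))
epochs[is_target] += p300

# Averaging time-locked epochs suppresses background EEG and reveals the ERP.
target_erp = epochs[is_target].mean(axis=0)
nontarget_erp = epochs[~is_target].mean(axis=0)

# Compare mean amplitude in a 250-400 ms window: only targets show the P300.
win = (t >= 0.25) & (t <= 0.40)
print("target mean amplitude    :", target_erp[win].mean())
print("non-target mean amplitude:", nontarget_erp[win].mean())
```

In an actual auditory or tactile P300 BCI the same averaging (or a trained classifier over the epoch features) is applied to real multi-channel EEG, and the stimulus whose epochs show the strongest P300 is taken as the attended one.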
Keywords
- Amyotrophic Lateral Sclerosis
- Motor Imagery
- Amyotrophic Lateral Sclerosis Patient
- Mental Task
- Oddball Paradigm
Acknowledgements
This work was supported by the Support Action "FutureBNCI", Project number ICT-2010-248320.
Copyright information
© 2012 Springer-Verlag Berlin Heidelberg
Cite this chapter
Wagner, I.C., Daly, I., Väljamäe, A. (2012). Non-visual and Multisensory BCI Systems: Present and Future. In: Allison, B., Dunne, S., Leeb, R., Del R. Millán, J., Nijholt, A. (eds) Towards Practical Brain-Computer Interfaces. Biological and Medical Physics, Biomedical Engineering. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-29746-5_19
DOI: https://doi.org/10.1007/978-3-642-29746-5_19
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-29745-8
Online ISBN: 978-3-642-29746-5