A Brain-Computer Interface Based on Abstract Visual and Auditory Imagery: Evidence for an Effect of Artistic Training
Various kinds of mental imagery have been employed to control brain-computer interfaces (BCIs). BCIs based on mental imagery are typically designed around particular kinds of imagery, e.g., motor imagery, that have known neurophysiological correlates. This is a sensible approach because extracting relevant features for classifying brain signals is much simpler when the expected neurophysiological correlates are known beforehand. However, the ability to control different neurophysiological signals varies substantially across individuals, and too little empirical data are available to determine whether different individuals achieve better BCI performance with different types of mental imagery. Moreover, there is growing interest in new kinds of mental imagery that might be better suited to particular applications, including in the arts.
This study presents a BCI in which the participants determined their own specific mental commands based on motor imagery, abstract visual imagery, and abstract auditory imagery. We found that different participants performed best in different sensory modalities, despite there being no differences in the signal processing or machine learning methods used for any of the three tasks. Furthermore, there was a significant effect of background domain expertise on BCI performance, such that musicians had higher accuracy with auditory imagery, and visual artists had higher accuracy with visual imagery.
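The key methodological point above is that a single, modality-agnostic pipeline (identical signal processing and classifier) was applied to all three imagery tasks. The abstract does not specify the actual pipeline used, so the following is only an illustrative sketch under common BCI assumptions: log band-power features per EEG channel, fed to a simple nearest-centroid classifier, with synthetic data standing in for real recordings. All function names, band choices, and parameters here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def bandpower_features(epoch, fs=128, bands=((8, 12), (13, 30))):
    """Log band-power per channel via FFT.

    epoch: array of shape (channels, samples).
    Returns one feature per (channel, band) pair; the same
    feature extraction is applied regardless of imagery modality.
    """
    freqs = np.fft.rfftfreq(epoch.shape[1], 1.0 / fs)
    psd = np.abs(np.fft.rfft(epoch, axis=1)) ** 2
    feats = []
    for lo, hi in bands:
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(np.log(psd[:, mask].mean(axis=1)))
    return np.concatenate(feats)

class NearestCentroid:
    """Minimal classifier: assign each sample to the nearest class mean."""
    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.centroids_ = np.array(
            [X[y == c].mean(axis=0) for c in self.classes_])
        return self

    def predict(self, X):
        d = np.linalg.norm(
            X[:, None, :] - self.centroids_[None, :, :], axis=2)
        return self.classes_[d.argmin(axis=1)]

# Synthetic "epochs" for two imagery classes (toy stand-in for EEG):
# class 1 has slightly higher overall signal power than class 0.
def make_epochs(n, scale):
    return [rng.normal(scale=scale, size=(8, 256)) for _ in range(n)]

epochs = make_epochs(20, 1.0) + make_epochs(20, 1.5)
X = np.array([bandpower_features(e) for e in epochs])
y = np.array([0] * 20 + [1] * 20)

clf = NearestCentroid().fit(X, y)
acc = (clf.predict(X) == y).mean()
```

Because the feature extraction and classifier make no modality-specific assumptions, the identical code path would apply to motor, visual, or auditory imagery classes, mirroring the study's design choice of a uniform pipeline across tasks.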
These results shed light on the individual factors which impact BCI performance. Taking into account domain expertise and allowing for a more personalized method of control in BCI design may have significant long-term implications for user training and BCI applications, particularly those with an artistic or musical focus.
Keywords: Brain-computer interface · Mental imagery · Individual differences · Performance predictors · Domain expertise · User-centred design · Auditory imagery · Visual imagery
This research was funded by a Discovery grant from the Natural Sciences and Engineering Research Council of Canada (NSERC) to SB and an NSERC PGS scholarship to KD.