
A Brain-Computer Interface Based on Abstract Visual and Auditory Imagery: Evidence for an Effect of Artistic Training

  • Kiret Dhindsa
  • Dean Carcone
  • Suzanna Becker
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10285)

Abstract

Various kinds of mental imagery have been employed to control a brain-computer interface (BCI). BCIs based on mental imagery are typically designed around specific kinds of imagery, e.g., motor imagery, that have known neurophysiological correlates. This is a sensible approach because it is much simpler to extract relevant features for classifying brain signals when the expected neurophysiological correlates are known beforehand. However, there is significant variance across individuals in the ability to control different neurophysiological signals, and insufficient empirical data are available to determine whether different individuals achieve better BCI performance with different types of mental imagery. Moreover, there is growing interest in new kinds of mental imagery that might be better suited to different applications, including in the arts.

This study presents a BCI in which participants determined their own specific mental commands based on motor imagery, abstract visual imagery, and abstract auditory imagery. We found that different participants performed best in different sensory modalities, even though identical signal processing and machine learning methods were used for all three tasks. Furthermore, there was a significant effect of background domain expertise on BCI performance: musicians achieved higher accuracy with auditory imagery, and visual artists achieved higher accuracy with visual imagery.

These results shed light on the individual factors that affect BCI performance. Taking domain expertise into account and allowing for a more personalized method of control in BCI design may have significant long-term implications for user training and BCI applications, particularly those with an artistic or musical focus.
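The abstract emphasizes that the same signal-processing and machine-learning pipeline was applied to all three imagery modalities. As a purely illustrative sketch, and not a description of the authors' actual implementation, the snippet below shows one common way such a modality-agnostic pipeline is assembled: common spatial pattern (CSP) filtering followed by a support vector machine, trained here on synthetic trials with MNE-Python and scikit-learn. The data shapes, channel count, and parameters are assumptions for demonstration only.

```python
# Hypothetical sketch of a modality-agnostic EEG classification pipeline:
# the same spatial filtering (CSP) and classifier (SVM) are used regardless
# of whether the imagery is motor, visual, or auditory. Synthetic data stand
# in for real recordings.
import numpy as np
from mne.decoding import CSP                      # common spatial patterns
from sklearn.svm import SVC                       # support vector classifier
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_channels, n_samples = 80, 14, 256     # assumed 14-channel consumer headset
X = rng.standard_normal((n_trials, n_channels, n_samples))  # trials x channels x time
y = rng.integers(0, 2, size=n_trials)             # two user-defined mental commands

# The identical pipeline is applied to every imagery modality; only the
# labels (which mental command the user performed) differ between conditions.
clf = make_pipeline(CSP(n_components=4, log=True), SVC(kernel="linear", C=1.0))
scores = cross_val_score(clf, X, y, cv=5)
print(f"Mean cross-validated accuracy: {scores.mean():.2f}")
```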

Keywords

Brain-computer interface · Mental imagery · Individual differences · Performance predictors · Domain expertise · User-centred design · Auditory imagery · Visual imagery

Acknowledgments

This research was funded by a Discovery grant from the Natural Sciences and Engineering Research Council of Canada (NSERC) to SB and an NSERC PGS scholarship to KD.


Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  1. School of Computational Science and Engineering, McMaster University, Hamilton, Canada
  2. Department of Psychology, Neuroscience, and Behaviour, McMaster University, Hamilton, Canada
