Developing an Optical Brain-Computer Interface for Humanoid Robot Control

  • Alyssa M. Batula
  • Jesse Mark
  • Youngmoo E. Kim
  • Hasan Ayaz
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9743)


This work evaluates the feasibility of a motor imagery-based optical brain-computer interface (BCI) for humanoid robot control. The functional near-infrared spectroscopy (fNIRS) based BCI-robot system developed in this study operates through a high-level control mechanism: the user specifies a target action through the BCI, and the robot performs the set of micro-operations necessary to fulfill the identified goal. To evaluate the system, four motor imagery tasks (left hand, right hand, left foot, and right foot) were mapped to operational commands (turn left, turn right, walk forward, walk backward) that were sent to the robot in real time to direct it in navigating a small room. An ecologically valid offline analysis with minimal preprocessing shows that seven subjects achieved an average accuracy of 32.5 %. This increased to 43.6 % simply by including calibration data recorded on the same day as the robot control session using the same cap setup, indicating that day-of calibration following the initial training may be important for BCI control.
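The high-level control mechanism described above can be sketched in a few lines: a decoded motor-imagery class is translated into a single navigation command, and the robot itself is assumed to carry out the underlying micro-operations. This is an illustrative sketch, not the authors' implementation; the class labels, command names, and the `decode_to_command` function are all hypothetical.

```python
# Hypothetical mapping from decoded motor imagery class to a high-level
# robot command, following the four-class scheme described in the abstract.
COMMAND_MAP = {
    "left_hand": "turn_left",
    "right_hand": "turn_right",
    "left_foot": "walk_forward",
    "right_foot": "walk_backward",
}


def decode_to_command(predicted_class: str) -> str:
    """Translate a decoded motor-imagery class into a robot command.

    Rejects labels outside the four trained classes so that unexpected
    classifier output never reaches the robot.
    """
    try:
        return COMMAND_MAP[predicted_class]
    except KeyError:
        raise ValueError(f"unknown motor imagery class: {predicted_class!r}")


if __name__ == "__main__":
    print(decode_to_command("left_foot"))  # walk_forward
```

In this sketch the BCI only selects among discrete goals, which is the point of high-level control: classification errors degrade which action is chosen, not how the robot balances or steps.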


BCI · fNIRS · Motor imagery · Motor cortex · Humanoid robot control · Teleoperation



This work was supported in part by the National Science Foundation Graduate Research Fellowship under Grant No. DGE-1002809. Work reported here was run on hardware supported by Drexel’s University Research Computing Facility.



Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  • Alyssa M. Batula (1)
  • Jesse Mark (2)
  • Youngmoo E. Kim (1)
  • Hasan Ayaz (2, 3, 4)

  1. Department of Electrical and Computer Engineering, Drexel University, Philadelphia, USA
  2. School of Biomedical Engineering, Science and Health Systems, Drexel University, Philadelphia, USA
  3. Department of Family and Community Health, University of Pennsylvania, Philadelphia, USA
  4. Division of General Pediatrics, Children's Hospital of Philadelphia, Philadelphia, USA