A closed-loop brain-computer interface with augmented reality feedback for industrial human-robot collaboration

  • ORIGINAL ARTICLE
  • Published in: The International Journal of Advanced Manufacturing Technology

Abstract

Industrial human-robot collaboration (HRC) aims to combine human intelligence with robotic capability to achieve higher productivity. In industrial HRC, communication between humans and robots is essential: it helps each party understand the other's intent and makes the collaboration more fluent. A brain-computer interface (BCI) records the user's brain activity and translates it into interaction messages (e.g., control commands) to the outside world, establishing a direct and efficient communication channel between human and robot. However, lacking an information feedback mechanism, a BCI struggles to control a robot with a high degree of freedom using only a limited number of classifiable mental states. To address this problem, this paper proposes a closed-loop BCI with contextual visual feedback delivered through an augmented reality (AR) headset. In this BCI, electroencephalogram (EEG) patterns from multiple voluntary eye blinks serve as the input, and an online algorithm for detecting them is proposed whose average accuracy reaches 94.31%. Moreover, an AR-enabled information feedback interface is designed to achieve interactive robotic path planning. A case study of an industrial HRC assembly task shows that the proposed closed-loop BCI can shorten the user input time in human-robot interaction.
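The paper's own online detection pipeline is not reproduced here. As a rough illustration of the general idea only, the sketch below detects large-amplitude blink artifacts in a single prefrontal EEG channel by simple thresholding and then groups nearby blinks into counts (so that, e.g., a double blink and a triple blink can map to different commands). All function names, the threshold values, and the synthetic test signal are assumptions for illustration, not taken from the article.

```python
import numpy as np

def detect_blinks(eeg, fs, amp_thresh=150.0, min_gap_s=0.2):
    """Detect candidate voluntary eye blinks in a single prefrontal EEG
    channel (microvolts) by amplitude thresholding.

    Returns sample indices of detected blink peaks, enforcing a short
    refractory period so one blink is not counted twice.
    """
    above = eeg > amp_thresh              # samples exceeding the threshold
    min_gap = int(min_gap_s * fs)         # refractory period in samples
    peaks, last = [], -min_gap
    i = 0
    while i < len(eeg):
        if above[i] and i - last >= min_gap:
            # take the local maximum within this supra-threshold run
            j = i
            while j < len(eeg) and above[j]:
                j += 1
            peak = i + int(np.argmax(eeg[i:j]))
            peaks.append(peak)
            last = peak
            i = j
        else:
            i += 1
    return peaks

def count_blinks_in_window(peaks, fs, window_s=1.0):
    """Group blink peaks separated by less than `window_s` seconds, so a
    burst of N voluntary blinks yields the count N as one input event."""
    groups, current = [], []
    for p in peaks:
        if current and (p - current[-1]) / fs > window_s:
            groups.append(len(current))
            current = []
        current.append(p)
    if current:
        groups.append(len(current))
    return groups

# Synthetic demo: flat signal with a double blink, then a triple blink
fs = 250
sig = np.zeros(fs * 6)
for t in [1.0, 1.4, 3.5, 3.9, 4.3]:       # blink peak times in seconds
    k = int(t * fs)
    sig[k - 10:k + 10] += 300.0           # ~80 ms, 300 uV blink artifact

peaks = detect_blinks(sig, fs)
print(count_blinks_in_window(peaks, fs))  # [2, 3]
```

In a real closed-loop system the counts would be validated against false positives from involuntary blinks (e.g., via the AR interface's confirmation feedback) before being issued as robot commands.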



Availability of data and materials

The data and materials of this paper are available from the corresponding author on reasonable request.

Funding

This research is supported by the National Natural Science Foundation of China (Grant No. 51775399) and the Fundamental Research Funds for the Central Universities (WUT: 2020III047).

Author information

Authors and Affiliations

Authors

Contributions

Zhenrui Ji designed and conducted the case study, analyzed the results, and wrote the paper. Quan Liu proposed the basic idea and contributed the materials. Wenjun Xu proposed the method design and the experimental idea and revised this paper. Bitao Yao proposed the idea of EEG signal processing. Jiayi Liu analyzed the data and conducted the case study. Zude Zhou revised the structure of the paper and contributed to the method development.

Corresponding author

Correspondence to Wenjun Xu.

Ethics declarations

Ethical approval

The subjects involved in this research are all volunteers, and the authors warrant that the paper fulfills the ethical standards of the journal. No conflict of interest exists in the submission of this paper, and the paper is approved by all authors for publication. This paper has not been published or presented elsewhere in part or in entirety and has not been submitted to another journal.

Consent to participate

The subjects involved in this research are all volunteers; the authors warrant that the paper fulfills the ethical standards of the journal.

Consent to publish

All authors consent to the publication of this paper in The International Journal of Advanced Manufacturing Technology.

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Ji, Z., Liu, Q., Xu, W. et al. A closed-loop brain-computer interface with augmented reality feedback for industrial human-robot collaboration. Int J Adv Manuf Technol 124, 3083–3098 (2023). https://doi.org/10.1007/s00170-021-07937-z

