
Human–Robot Interaction Interface

  • Chenguang Yang
  • Hongbin Ma
  • Mengyin Fu
Chapter

Abstract

Human–robot interaction is an advanced technology that plays an increasingly important role in robot applications. This chapter first gives a brief introduction to various human–robot interfaces and to several interaction technologies based on visual sensors and electroencephalography (EEG) signals. Next, a hand gesture-based robot control system is developed using the Leap Motion sensor, incorporating noise suppression, coordinate transformation, and inverse kinematics. Another hand gesture control scheme, a form of natural user interface, is then developed based on a parallel system, with ANFIS and SVM algorithms employed for gesture classification. We also investigate controlling the commercial Spykee mobile robot using EEG signals acquired by the Emotiv EPOC neuroheadset. Finally, the Emotiv headset is connected to OpenViBE to control a virtual manipulator moving in 3D Cartesian space via a P300 speller.
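
The Leap Motion pipeline summarized above (noise suppression, coordinate transformation, inverse kinematics) can be illustrated with a minimal Python sketch. This is not the chapter's implementation: the exponential filter, the fixed frame offset, the planar two-link arm, and all function names and parameters below are illustrative assumptions.

```python
import math

def low_pass(prev, new, alpha=0.3):
    """Exponential smoothing as a simple stand-in for noise suppression."""
    return tuple(alpha * n + (1 - alpha) * p for p, n in zip(prev, new))

def sensor_to_robot(point, offset=(0.0, 0.2)):
    """Translate a 2D point from the sensor frame into the robot base frame."""
    return (point[0] + offset[0], point[1] + offset[1])

def ik_two_link(x, y, l1=0.3, l2=0.25):
    """Closed-form inverse kinematics for a planar two-link arm
    (one elbow configuration)."""
    c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    c2 = max(-1.0, min(1.0, c2))  # clamp to keep the target inside the workspace
    theta2 = math.acos(c2)
    theta1 = math.atan2(y, x) - math.atan2(l2 * math.sin(theta2),
                                           l1 + l2 * math.cos(theta2))
    return theta1, theta2

# One filtered hand sample driving the hypothetical two-link arm.
hand = low_pass(prev=(0.10, 0.30), new=(0.12, 0.34))
target = sensor_to_robot(hand)
print(ik_two_link(*target))
```

In the actual system each stage would be replaced by the chapter's own filtering, sensor-to-robot calibration, and full manipulator kinematics.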

Keywords

Inverse Kinematics · Hand Gesture · Robot Interaction · Fourier Descriptor · Hand Gesture Recognition

References

  1. Green, S.A., Billinghurst, M., Chen, X., Chase, G.: Human-robot collaboration: a literature review and augmented reality approach in design (2008)
  2. Bauer, A., Wollherr, D., Buss, M.: Human-robot collaboration: a survey. Int. J. Humanoid Robot. 5(01), 47–66 (2008)
  3. Lu, J.-M., Hsu, Y.-L.: Telepresence robots for medical and homecare applications. In: Contemporary Issues in Systems Science and Engineering, p. 725. Wiley, New Jersey (2015)
  4. Arai, T., Kato, R., Fujita, M.: Assessment of operator stress induced by robot collaboration in assembly. CIRP Ann. Manuf. Technol. 59(1), 5–8 (2010)
  5. Bechar, A., Edan, Y.: Human-robot collaboration for improved target recognition of agricultural robots. Ind. Robot: Int. J. 30(5), 432–436 (2003)
  6. Matthias, B., Kock, S., Jerregard, H., Kallman, M., Lundberg, I., Mellander, R.: Safety of collaborative industrial robots: certification possibilities for a collaborative assembly robot concept. In: 2011 IEEE International Symposium on Assembly and Manufacturing (ISAM), pp. 1–6. IEEE (2011)
  7. Morabito, V.: Big data driven business models. In: Big Data and Analytics, pp. 65–80. Springer, Berlin (2015)
  8. Top 10 emerging technologies of 2015. https://agenda.weforum.org (2015)
  9. Scassellati, B., Tsui, K.M.: Co-robots: humans and robots operating as partners (2015)
  10. Reardon, C., Tan, H., Kannan, B., Derose, L.: Towards safe robot-human collaboration systems using human pose detection. In: 2015 IEEE International Conference on Technologies for Practical Robot Applications (TePRA) (2015)
  11. Kronander, K., Billard, A.: Learning compliant manipulation through kinesthetic and tactile human-robot interaction. IEEE Trans. Haptics 7(3), 367–380 (2014)
  12. Peternel, L., Petric, T., Oztop, E., Babic, J.: Teaching robots to cooperate with humans in dynamic manipulation tasks based on multi-modal human-in-the-loop approach. Auton. Robots 36(1–2), 123–136 (2014)
  13. Rozo, L., Calinon, S., Caldwell, D.G., Jiménez, P., Torras, C.: Learning collaborative impedance-based robot behaviors. In: Association for the Advancement of Artificial Intelligence (2013)
  14. Ganesh, G., Takagi, A., Osu, R., Yoshioka, T., Kawato, M., Burdet, E.: Two is better than one: physical interactions improve motor performance in humans. Sci. Rep. 4, 3824 (2014)
  15. Burdet, E., Osu, R., Franklin, D.W., Milner, T.E., Kawato, M.: The central nervous system stabilizes unstable dynamics by learning optimal impedance. Nature 414(6862), 446–449 (2001). doi:10.1038/35106566
  16. Burdet, E., Ganesh, G., Yang, C., Albu-Schäffer, A.: Interaction force, impedance and trajectory adaptation: by humans, for robots. Springer Tracts Adv. Robot. 79, 331–345 (2010)
  17. Yang, C., Ganesh, G., Haddadin, S., Parusel, S., Albu-Schäffer, A., Burdet, E.: Human-like adaptation of force and impedance in stable and unstable interactions. IEEE Trans. Robot. 27(5) (2011)
  18. Ajoudani, A., Tsagarakis, N.G., Bicchi, A.: Tele-impedance: preliminary results on measuring and replicating human arm impedance in teleoperated robots. In: 2011 IEEE International Conference on Robotics and Biomimetics (ROBIO), pp. 216–222 (2011)
  19. Gradolewski, D., Tojza, P.M., Jaworski, J., Ambroziak, D., Redlarski, G., Krawczuk, M.: Arm EMG Wavelet-Based Denoising System. Springer International Publishing, Berlin (2015)
  20. Ajoudani, A.: Tele-impedance: teleoperation with impedance regulation using a body–machine interface. Int. J. Robot. Res. 31(13), 1642–1656 (2012)
  21. Dragan, A.D., Bauman, S., Forlizzi, J., Srinivasa, S.S.: Effects of robot motion on human-robot collaboration. In: Proceedings of the Tenth Annual ACM/IEEE International Conference on Human-Robot Interaction, pp. 51–58. ACM (2015)
  22. Saktaweekulkit, K., Maneewarn, T.: Motion classification using IMU for human-robot interaction. In: 2010 International Conference on Control, Automation and Systems (ICCAS), pp. 2295–2299. IEEE (2010)
  23. Shi, G.Y., Zou, Y.X., Li, W.J., Jin, Y.F., Guan, P.: Towards multi-classification of human motions using micro IMU and SVM training process. In: Advanced Materials Research, vol. 60, pp. 189–193. Trans Tech Publications (2009)
  24. Yoshimoto, H., Arita, D., Taniguchi, R.-I.: Real-time human motion sensing based on vision-based inverse kinematics for interactive applications. In: Proceedings of the 17th International Conference on Pattern Recognition (ICPR 2004), vol. 3, pp. 318–321. IEEE (2004)
  25. Starner, T., Weaver, J., Pentland, A.: Real-time American Sign Language recognition using desk and wearable computer based video. IEEE Trans. Pattern Anal. Mach. Intell. 20(12), 1371–1375 (1998)
  26. Shanableh, T., Assaleh, K., Al-Rousan, M.: Spatio-temporal feature-extraction techniques for isolated gesture recognition in Arabic sign language. IEEE Trans. Syst. Man Cybern. Part B: Cybern. 37(3), 641–650 (2007)
  27. Liang, R.-H., Ouhyoung, M.: A sign language recognition system using hidden Markov model and context sensitive search. Proc. ACM Symp. Virtual Real. Softw. Technol. 96, 59–66 (1996)
  28. Starner, T., Pentland, A.: Real-time American Sign Language recognition from video using hidden Markov models. In: Motion-Based Recognition, pp. 227–243. Springer (1997)
  29. Kim, J., Mastnik, S., André, E.: EMG-based hand gesture recognition for realtime biosignal interfacing. In: Proceedings of the 13th International Conference on Intelligent User Interfaces, pp. 30–39. ACM (2008)
  30. Moradi, H., Lee, S.: Joint limit analysis and elbow movement minimization for redundant manipulators using closed form method. In: Advances in Intelligent Computing, pp. 423–432. Springer (2005)
  31. Fang, C., Ding, X.: A set of basic movement primitives for anthropomorphic arms. In: 2013 IEEE International Conference on Mechatronics and Automation (ICMA), pp. 639–644. IEEE (2013)
  32. Pan, J.J., Xu, K.: Leap Motion based 3D gesture. China Sciencepaper 10(2), 207–212 (2015)
  33. Jiang, Y.C.: Menacing motion-sensing technology, different Leap Motion. PC Fan 11, 32–33 (2013)
  34. Chen, S., Ma, H., Yang, C., Fu, M.: Hand gesture based robot control system using Leap Motion. In: Intelligent Robotics and Applications, pp. 581–591. Springer (2015)
  35. V-REP introduction. http://www.v-rep.eu/
  36.
  37. BarrettHand introduction. http://wiki.ros.org/Robots/BarrettHand
  38. Qian, K., Jie, N., Hong, Y.: Developing a gesture based remote human-robot interaction system using Kinect. Int. J. Smart Home 7(4), 203–208 (2013)
  39. Casiez, G., Roussel, N., Vogel, D.: 1 € filter: a simple speed-based low-pass filter for noisy input in interactive systems. In: Proceedings of the 2012 ACM Annual Conference on Human Factors in Computing Systems, pp. 2527–2530. Austin, TX, USA (2012)
  40. Craig, J.J.: Introduction to Robotics: Mechanics and Control, 3rd edn. China Machine Press, Beijing (2006)
  41. Li, C., Ma, H., Yang, C., Fu, M.: Teleoperation of a virtual iCub robot under framework of parallel system via hand gesture recognition. In: 2014 IEEE International Conference on Fuzzy Systems (FUZZ-IEEE), pp. 1469–1474. IEEE (2014)
  42. Wang, F.-Y.: Parallel system methods for management and control of complex systems. Control Decis. 19, 485–489 (2004)
  43. Wiki for the iCub simulator: specifications from its installation to its use (2012). http://www.eris.liralab.it/wiki/
  44. Yet Another Robot Platform. http://yarp0.sourceforge.net/
  45. Wachs, J.P., Kölsch, M., Stern, H., Edan, Y.: Vision-based hand-gesture applications. Commun. ACM 54(2), 60–71 (2011)
  46. Hsu, R.-L., Abdel-Mottaleb, M., Jain, A.K.: Face detection in color images. IEEE Trans. Pattern Anal. Mach. Intell. 24(5), 696–706 (2002)
  47. Hu, M.-K.: Visual pattern recognition by moment invariants. IRE Trans. Inf. Theory 8(2), 179–187 (1962)
  48. Granlund, G.H.: Fourier preprocessing for hand print character recognition. IEEE Trans. Comput. C-21(2), 195–201 (1972)
  49. Nedjah, N., et al.: Adaptation of fuzzy inference system using neural learning. In: Fuzzy System Engineering: Theory and Practice, Studies in Fuzziness and Soft Computing, pp. 53–83 (2001)
  50. Chang, C.-C., Lin, C.-J.: LIBSVM: a library for support vector machines. ACM Trans. Intell. Syst. Technol. (TIST) 2(3), 27 (2011)
  51. Corke, P.I.: Robotics Toolbox (2008)
  52. Zhu, G.C., Wang, T.M., Chou, W.S., Cai, M.: Research on augmented reality based teleoperation system. Acta Simul. Syst. Sin. 5, 021 (2004)
  53. Collura, T.F.: History and evolution of electroencephalographic instruments and techniques. J. Clin. Neurophysiol. 10(4), 476–504 (1993)
  54. Grude, S., Freeland, M., Yang, C., Ma, H.: Controlling mobile Spykee robot using Emotiv neuro headset. In: 2013 32nd Chinese Control Conference (CCC), pp. 5927–5932. IEEE (2013)
  55. Hoffmann, A.: EEG signal processing and Emotiv's neuro headset. Hessen (2010)
  56. Sanei, S., Chambers, J.A.: EEG Signal Processing. Wiley, New Jersey (2013)
  57. Jasper, H.H.: The ten-twenty electrode system of the international federation. Electroencephalogr. Clin. Neurophysiol. 10, 371–375 (1958)
  58. Lotte, F.: Les interfaces cerveau-ordinateur: conception et utilisation en réalité virtuelle. Revue Technique et Science Informatiques 31(3), 289–310 (2012)
  59. Carmena, J.M., Lebedev, M.A., Crist, R.E., O’Doherty, J.E., Santucci, D.M., Dimitrov, D.F., Patil, P.G., Henriquez, C.S., Nicolelis, M.A.: Learning to control a brain-machine interface for reaching and grasping by primates. PLoS Biol. 1(2), 193–208 (2003)
  60. Millán, J.d.R., Renkens, F., Mouriño, J., Gerstner, W.: Non-invasive brain-actuated control of a mobile robot by human EEG. IEEE Trans. Biomed. Eng. 51(6), 1026–1033 (2004)
  61. Wang, Y., Gao, X., Hong, B., Jia, C., Gao, S.: Brain-computer interfaces based on visual evoked potentials. IEEE Eng. Med. Biol. Mag. 27(5), 64–71 (2008)
  62. Diez, P.F., Mut, V.A., Perona, E.M.A., Leber, E.L.: Asynchronous BCI control using high-frequency SSVEP. J. Neuroeng. Rehabil. 8(2), 642–650 (2011)
  63. Bashashati, A., Fatourechi, M., Ward, R.K., Birch, G.E.: A survey of signal processing algorithms in brain–computer interfaces based on electrical brain signals. J. Neural Eng. 4(2), R32–R57 (2007)
  64. Ding, J., Sperling, G., Srinivasan, R.: Attentional modulation of SSVEP power depends on the network tagged by the flicker frequency. Cereb. Cortex 16(7), 1016–1029 (2006)
  65. Ekanayake, H.: P300 and Emotiv EPOC: does Emotiv EPOC capture real EEG? (2011)
  66. Vidal, J.-J.: Toward direct brain-computer communication. Annu. Rev. Biophys. Bioeng. 2(1), 157–180 (1973)
  67. Sullivan, T.J., Deiss, S.R., Jung, T.P., Cauwenberghs, G.: A brain-machine interface using dry-contact, low-noise EEG sensors. In: 2008 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 1986–1989 (2008)
  68. Malki, A., Yang, C., Wang, N., Li, Z.: Mind guided motion control of robot manipulator using EEG signals. In: 2015 5th International Conference on Information Science and Technology (ICIST) (2015)
  69. Coppelia Robotics: V-REP description. http://www.coppeliarobotics.com (2014)
  70. Piccione, F., Priftis, K., Tonin, P., Vidale, D., Furlan, R., Cavinato, M., Merico, A., Piron, L.: Task and stimulation paradigm effects in a P300 brain computer interface exploitable in a virtual environment: a pilot study. PsychNology J. 6(1), 99–108 (2008)
  71. Chen, W.D., Zhang, J.H., Zhang, J.C., Li, Y., Qi, Y., Su, Y., Wu, B., Zhang, S.M., Dai, J.H., Zheng, X.X.: A P300 based online brain-computer interface system for virtual hand control. J. Zhejiang Univ. Sci. C 11(08), 587–597 (2010)
  72. Renard, Y., Lotte, F., Gibert, G., Congedo, M., Maby, E., Delannoy, V., Bertrand, O., Lécuyer, A.: OpenViBE: an open-source software platform to design, test, and use brain-computer interfaces in real and virtual environments. Presence: Teleoperators Virtual Environ. 19(1), 35–53 (2010)
  73. Inria: P300: old P300 speller. http://openvibe.inria.fr/openvibe-p300-speller (2014)

Copyright information

© Science Press and Springer Science+Business Media Singapore 2016

Authors and Affiliations

  1. Key Lab of Autonomous Systems and Networked Control, Ministry of Education, South China University of Technology, Guangzhou, China
  2. Centre for Robotics and Neural Systems, Plymouth University, Devon, UK
  3. School of Automation, Beijing Institute of Technology, Beijing, China
  4. State Key Lab of Intelligent Control and Decision of Complex System, Beijing Institute of Technology, Beijing, China
