
Gestures-teleoperation of a heterogeneous multi-robot system

  • Application
  • Published in: The International Journal of Advanced Manufacturing Technology

Abstract

This work presents a solution for the teleoperation of a heterogeneous team of mobile robots. Two team configurations are considered: UAV–UGV (Unmanned Aerial Vehicle–Unmanned Ground Vehicle) and UAV–UAV. High-level gesture patterns are performed at a remote station, and we propose an easy-to-train Artificial Neural Network (ANN) classifier to identify them from the skeletal data extracted by an RGB-D (Red, Green, Blue-Depth) camera. Our classifier uses custom data to build the gesture patterns, allowing smooth and intuitive gestures to be used for the teleoperation of the mobile robots. To validate our proposal, experiments were run with two off-the-shelf Parrot AR.Drone 2 quadrotors and the differential-drive platform Pioneer 3-DX. The results of these experiments show that the proposed teleoperation system can accomplish inspection/surveillance tasks, and that it can be easily adapted to similar applications, such as emergency response or load transportation.
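The core pipeline the abstract describes (skeletal joints from an RGB-D camera fed to a small, easy-to-train ANN that outputs a gesture class) can be sketched as a plain feed-forward network. This is a minimal illustration, not the authors' implementation: the gesture labels, joint count, and layer sizes below are assumptions, and the weights are random stand-ins for trained parameters.

```python
import numpy as np

# Hypothetical gesture vocabulary; the paper's actual gesture set is not listed here.
GESTURES = ["takeoff", "land", "forward", "stop"]

def classify_skeleton(joints, W1, b1, W2, b2):
    """Classify one skeleton frame with a small MLP.

    joints: (n_joints, 3) array of 3D joint positions from an RGB-D camera.
    Returns (gesture_index, class_probabilities).
    """
    x = joints.flatten()              # flatten joints to one feature vector
    h = np.tanh(W1 @ x + b1)          # single hidden layer
    logits = W2 @ h + b2
    p = np.exp(logits - logits.max())
    p /= p.sum()                      # softmax over gesture classes
    return int(p.argmax()), p

# Toy weights just to exercise the forward pass; a real system trains them
# on the custom gesture dataset mentioned in the abstract.
rng = np.random.default_rng(0)
n_joints, hidden, n_classes = 15, 32, len(GESTURES)
W1 = rng.standard_normal((hidden, n_joints * 3)) * 0.1
b1 = np.zeros(hidden)
W2 = rng.standard_normal((n_classes, hidden)) * 0.1
b2 = np.zeros(n_classes)

frame = rng.standard_normal((n_joints, 3))   # stand-in for real skeletal data
idx, probs = classify_skeleton(frame, W1, b1, W2, b2)
```

Such a shallow network is what makes the classifier "easy to train": with only tens of 3D joints per frame, a single hidden layer suffices and can be retrained quickly on new custom gestures.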


Notes

  1. See https://www.ros.org/
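The footnote above points to ROS, through which commands typically reach platforms such as the AR.Drone 2 and Pioneer 3-DX used in the experiments. As a hedged sketch of the last stage of such a pipeline, the mapping below turns a recognized gesture label into a velocity command; the gesture names and speed value are illustrative assumptions, and a real system would publish these fields as ROS geometry_msgs/Twist messages rather than plain dictionaries.

```python
# Hypothetical gesture-to-command mapping; in a ROS setup the returned fields
# would populate a geometry_msgs/Twist message published on a cmd_vel topic.

def gesture_to_cmd(gesture, speed=0.3):
    """Translate a recognized gesture label into a velocity command."""
    cmds = {
        "forward":  {"linear_x": speed,  "angular_z": 0.0},
        "backward": {"linear_x": -speed, "angular_z": 0.0},
        "left":     {"linear_x": 0.0,    "angular_z": speed},
        "right":    {"linear_x": 0.0,    "angular_z": -speed},
        "stop":     {"linear_x": 0.0,    "angular_z": 0.0},
    }
    return cmds.get(gesture, cmds["stop"])   # unrecognized gestures halt the robot

cmd = gesture_to_cmd("forward")
```

Defaulting unknown labels to a stop command is a common safety choice for gesture teleoperation, since a misclassified gesture should never accelerate the robot.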


Acknowledgements

This work was supported by FAPEMIG - Fundação de Amparo à Pesquisa de Minas Gerais, an agency of the State of Minas Gerais, Brazil, for scientific development, and FAPES - Fundação de Amparo à Pesquisa e Inovação do Espírito Santo, an agency of the State of Espírito Santo, Brazil, for scientific, technological and innovative development. The authors would also like to thank Vitor Thinassi for his help with the gesture images.

Funding

This work was funded by CNPq - Conselho Nacional de Desenvolvimento Científico e Tecnológico, a Brazilian agency that supports scientific and technological development, CAPES - Coordenação de Aperfeiçoamento de Pessoal de Nível Superior, a Brazilian federal government agency under the Ministry of Education, responsible for quality assurance in undergraduate and postgraduate institutions in Brazil, FAPEMIG - Fundação de Amparo à Pesquisa de Minas Gerais, an agency of the State of Minas Gerais, Brazil, for scientific development, and FAPES - Fundação de Amparo à Pesquisa e Inovação do Espírito Santo, an agency of the State of Espírito Santo, Brazil, for scientific, technological and innovative development.

Author information


Corresponding author

Correspondence to Kevin Braathen de Carvalho.

Ethics declarations

Ethics approval

The authors followed the COPE guidelines, which include but are not limited to: not submitting the paper to multiple journals; keeping the work original and not splitting it into multiple smaller papers for the sake of increasing the number of publications; and not fabricating or manipulating the data or results of this work by any means.

Consent for publication

All individuals whose information is disclosed in this paper have consented to its publication.

Conflict of interest

The authors declare no competing interests.

Additional information

Availability of data and materials

The authors have not provided the materials used in this paper, aside from the uncut video footage of the experiments.

Consent to participate

All people involved in this work, including those who provided samples for the dataset, consented to participate.

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

de Carvalho, K.B., Villa, D.K.D., Sarcinelli-Filho, M. et al. Gestures-teleoperation of a heterogeneous multi-robot system. Int J Adv Manuf Technol 118, 1999–2015 (2022). https://doi.org/10.1007/s00170-021-07659-2


Keywords

Navigation