Gesture-based human-robot interaction for human assistance in manufacturing

  • ORIGINAL ARTICLE

The International Journal of Advanced Manufacturing Technology

Abstract

The paradigm of robot usage has changed in recent years: robots no longer work in isolation but collaborate with human beings, combining the best abilities of both. The development and acceptance of collaborative robots depend heavily on reliable and intuitive human-robot interaction (HRI) on the factory floor. This paper proposes a gesture-based HRI framework in which a robot assists a human co-worker by delivering tools and parts and by holding objects during an assembly operation. Wearable inertial measurement units (IMUs) capture the human upper-body gestures. The captured data are segmented into static and dynamic blocks using an unsupervised sliding-window approach. These blocks feed an artificial neural network (ANN) that classifies static, dynamic, and composed gestures. For the HRI interface, we propose a parameterization robotic task manager (PRTM) in which, guided by the system's speech and visual feedback, the co-worker selects and validates robot options using gestures. Experiments in an assembly operation demonstrated the efficiency of the proposed solution.
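To make the pipeline described above concrete, the sketch below illustrates the kind of unsupervised sliding-window segmentation and ANN classification the abstract outlines: a motion-detection threshold splits the IMU stream into static and dynamic blocks, and a small neural network classifies each dynamic block. All window sizes, thresholds, labels, and the synthetic data are illustrative assumptions, not the paper's actual implementation or parameters.

import numpy as np
from sklearn.neural_network import MLPClassifier

# Illustrative placeholders; the paper's real parameters are not given here.
WINDOW = 25              # sliding-window length, in samples
MOTION_THRESHOLD = 0.05  # variance threshold separating static from dynamic

def segment_stream(imu_stream):
    """Split a (T, D) IMU stream into static/dynamic blocks by motion detection."""
    T = len(imu_stream)
    dynamic = np.array([
        imu_stream[t:t + WINDOW].var(axis=0).sum() > MOTION_THRESHOLD
        for t in range(T - WINDOW + 1)
    ])
    # Merge consecutive windows that share a label into (start, end, label) blocks.
    blocks, start = [], 0
    for t in range(1, len(dynamic)):
        if dynamic[t] != dynamic[start]:
            blocks.append((start, t + WINDOW - 1, "dynamic" if dynamic[start] else "static"))
            start = t
    blocks.append((start, T - 1, "dynamic" if dynamic[start] else "static"))
    return blocks

def block_features(imu_stream, start, end, n_points=10):
    """Resample a variable-length block to a fixed-length vector for the ANN."""
    idx = np.linspace(start, end, n_points).astype(int)
    return imu_stream[idx].ravel()

# --- Synthetic demo: two gesture classes drawn from noisy templates ----------
rng = np.random.default_rng(0)
templates = [np.sin(np.linspace(0, np.pi, 10))[:, None] * np.ones(6),
             np.cos(np.linspace(0, np.pi, 10))[:, None] * np.ones(6)]
X = np.vstack([(templates[c] + 0.05 * rng.standard_normal((10, 6))).ravel()
               for c in (0, 1) for _ in range(50)])
y = np.repeat([0, 1], 50)

clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0)
clf.fit(X, y)  # offline training on labeled gesture blocks

# Online use: segment the live stream, then classify each dynamic block.
stream = np.vstack([0.01 * rng.standard_normal((60, 6)),                # static
                    templates[1] + 0.05 * rng.standard_normal((10, 6)),  # gesture
                    0.01 * rng.standard_normal((60, 6))])                # static
for s, e, label in segment_stream(stream):
    if label == "dynamic":
        print("gesture class:", clf.predict([block_features(stream, s, e)])[0])

Resampling every block to a fixed length is one simple way to let gestures of varying duration share a single fixed-size ANN input; the abstract does not specify how the authors handle this.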

Funding

This work was supported in part by the Portuguese Foundation for Science and Technology (FCT) project COBOTIS (PTDC/EME-EME/32595/2017) and by the Portugal 2020 project DM4Manufacturing (POCI-01-0145-FEDER-016418), funded by EU/FEDER through the COMPETE2020 program.

Author information

Corresponding author

Correspondence to Pedro Neto.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Neto, P., Simão, M., Mendes, N. et al. Gesture-based human-robot interaction for human assistance in manufacturing. Int J Adv Manuf Technol 101, 119–135 (2019). https://doi.org/10.1007/s00170-018-2788-x
