
Development and experimental validation of algorithms for human–robot interaction in simulated and real scenarios

  • A. Freddi
  • S. Iarlori
  • S. Longhi
  • A. Monteriù
Original Research

Abstract

The development of robots that can safely and effectively interact with people and assist them in structured environments is an open research problem whose importance has grown rapidly in recent years. Since these robots work in environments shared with human beings, they require new ways to achieve human–robot interaction and cooperation. This work presents an approach for performing human–robot interaction by means of robotic manipulators. The interaction is composed of three main steps, namely the selection, recognition and grasping of an object. The object selection is based on the execution of a gesture, performed by the user in front of an RGB-D camera, with each gesture associated with a particular object. The object recognition is achieved by means of the RGB cameras mounted on the two manipulator arms, which provide the workspace information to a dedicated classifier. For the grasping step, the object position and orientation are extracted so that the gripper can be correctly rotated to match the object lying on the desk in front of the robot. The final goal is to release the grasped object into the hand of the user standing in front of the desk. This system could support people with limited motor skills who are unable to retrieve an object on their own, playing an important role in structured assistive and smart environments and thus promoting human–robot interaction in activities of daily living.
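
The three-step pipeline described in the abstract (gesture-based selection, camera-based recognition, grasping and handover) can be summarised as a simple control flow. The following is a minimal Python sketch of that flow only, not the authors' implementation: all sensor, classifier and arm interfaces are hypothetical stubs, and the names recognize_gesture, detect_object, grasp_and_handover and GESTURE_TO_OBJECT are illustrative, not taken from the paper.

```python
import math
from dataclasses import dataclass

# --- Hypothetical types and stubs (illustrative only, not from the paper) ---

@dataclass
class ObjectPose:
    x: float      # object position on the desk plane [m]
    y: float
    theta: float  # object orientation about the vertical axis [rad]

# One gesture is associated with each particular object, as in the paper.
GESTURE_TO_OBJECT = {
    "wave_right": "bottle",
    "wave_left": "cup",
    "raise_hand": "box",
}

def recognize_gesture(rgbd_frame):
    """Stub for the RGB-D gesture recognizer; returns a gesture label."""
    return "wave_right"

def detect_object(rgb_frame, target_class):
    """Stub for the classifier fed by the arm-mounted RGB cameras;
    returns the pose of the requested object, or None if not found."""
    return ObjectPose(x=0.40, y=-0.10, theta=math.radians(30))

def grasp_and_handover(pose):
    """Stub: rotate the gripper to match the object orientation, grasp,
    then release the object into the user's hand in front of the desk."""
    print(f"Grasping at ({pose.x:.2f}, {pose.y:.2f}) m, "
          f"gripper rotated {math.degrees(pose.theta):.0f} deg")

def interaction_pipeline(rgbd_frame, rgb_frame):
    # Step 1: selection -- the user's gesture picks the object class.
    target = GESTURE_TO_OBJECT.get(recognize_gesture(rgbd_frame))
    if target is None:
        return
    # Step 2: recognition -- locate the object in the workspace.
    pose = detect_object(rgb_frame, target)
    if pose is None:
        return
    # Step 3: grasping and handover.
    grasp_and_handover(pose)

interaction_pipeline(rgbd_frame=None, rgb_frame=None)
```

In a real deployment each stub would wrap the corresponding component described in the abstract, e.g. the RGB-D gesture recognizer for step 1 and the arm-mounted RGB camera classifier for step 2.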

Keywords

Human–robot interaction · Human–robot cooperation · Robotic manipulators


Copyright information

© Springer-Verlag GmbH Germany, part of Springer Nature 2020

Authors and Affiliations

  1. Department of Information Engineering, Università Politecnica delle Marche, Ancona, Italy
