Interaction with Collaborative Robot Using 2D and TOF Camera

  • Aleš Vysocký
  • Robert Pastor
  • Petr Novák
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11472)

Abstract

With the increasing number of applications using collaborative robots, it is important to be able to set up conditions for controlling the robot and its interaction with the human operator. This article describes a quick-response system that detects the operator's hands in the shared operator-robot workspace using a 2D and a TOF camera. The technology provides data about hand positions and gestures, which is used both for instant reactions of the robot to the operator's presence and for controlling the robot. Gesture-based interaction is more intuitive for the operator and requires no expert knowledge. Because the operator can use both hands, operation is more efficient and faster. The system was tested on a prototype workplace and the results are evaluated in this article.
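Since the abstract only summarises the pipeline, a minimal sketch of one plausible realisation of its two ingredients may help the reader: depth-thresholded hand segmentation on the TOF image, and a convexity-defect finger count as a gesture cue. This is an illustration, not the authors' implementation; the OpenCV-based processing, the safety distance, the blob-size limits, the open-palm rule and the robot.slow_down()/robot.resume() interface are all assumptions introduced for the example (OpenCV 4 and NumPy assumed; depth values in millimetres).

    import cv2
    import numpy as np

    SAFETY_DISTANCE_MM = 600  # assumed reaction zone in front of the TOF camera

    def segment_hands(depth_mm):
        """Return contours of hand-sized blobs closer than the safety distance."""
        near = (depth_mm > 0) & (depth_mm < SAFETY_DISTANCE_MM)  # 0 = no TOF return
        mask = near.astype(np.uint8) * 255
        # Morphological opening suppresses the speckle noise typical of TOF data.
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        # Area limits are guesses meant to keep only roughly hand-sized regions.
        return [c for c in contours if 1500 < cv2.contourArea(c) < 25000]

    def extended_fingers(contour):
        """Rough gesture cue: estimate finger count from convexity defects."""
        hull = cv2.convexHull(contour, returnPoints=False)
        if hull is None or len(hull) < 4:
            return 0
        try:
            defects = cv2.convexityDefects(contour, hull)
        except cv2.error:
            return 0  # degenerate contour, no usable hull
        if defects is None:
            return 0
        # A deep convexity defect is a valley between two spread fingers.
        valleys = sum(1 for d in defects[:, 0] if d[3] / 256.0 > 20.0)
        return valleys + 1 if valleys else 0

    def react(depth_mm, robot):
        """Per-frame reaction: slow the robot near a hand, read a simple gesture."""
        hands = segment_hands(depth_mm)
        if not hands:
            robot.resume()  # hypothetical robot interface
            return
        robot.slow_down()  # a hand has entered the shared workspace
        if extended_fingers(max(hands, key=cv2.contourArea)) >= 5:
            print("open palm detected -> issue 'continue' command")

One practical reason to pair a 2D camera (texture and gesture detail) with a TOF sensor (distance) is that thresholding the depth map, as above, keeps the safety reaction largely insensitive to workplace lighting, whereas colour-based skin segmentation on the 2D image alone is not.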

Keywords

Cobot · HMI · TOF camera · Gesture control · HRC · Industrial robot

Acknowledgements

This article was elaborated within the research project Research Centre of Advanced Mechatronic Systems, reg. no. CZ.02.1.01/0.0/0.0/16_019/0000867, within the framework of the Operational Programme Research, Development and Education, and within the project “Coboty – vývoj periferií” (Cobots – Development of Peripherals), HS3541801, in cooperation with Moravskoslezský automobilový klastr, z.s. It was also supported by the specific research project SP2018/86, financed by the state budget of the Czech Republic.

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Faculty of Mechanical Engineering, Department of Robotics, VŠB – Technical University of Ostrava, Ostrava, Czech Republic
