Towards seamless human robot collaboration: integrating multimodal interaction

Abstract

This paper discusses the challenges of collaboration between human operators and industrial robots in assembly operations, focusing on safety and simplified interaction. A case study is presented that combines perception technologies on the robot side with wearable devices used by the operator. In terms of robot perception, a manual guidance module, an air-pressure contact sensor (referred to as a safety skin), and a vision system for object recognition and tracking have been developed and integrated. Concerning the wearable devices, an advanced user interface combining audio and haptic commands with augmented reality technology is used to support the operator and provide awareness by visualizing production- and safety-related information. In parallel, safety functionalities are implemented through collision detection technologies, such as the safety skin, and through safety-monitored regions delimiting the robot's working area. The complete system is coordinated through a common integration platform and is validated in a case study from the white goods industry.
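To illustrate how such an integration platform could coordinate the perception, safety, and interaction modules, the following is a minimal sketch in Python. It assumes a simple event-queue architecture; the Event and Coordinator classes, the module names ("skin", "safety_zone", "voice", "haptic"), and the pause/resume logic are hypothetical illustrations, not the implementation described in the paper.

```python
# Hypothetical sketch of an event-driven integration platform: safety modules
# (contact skin, monitored zones) and interaction modules (voice, haptic)
# publish events to a shared queue, and a coordinator reacts by pausing or
# resuming the robot. Names and logic are illustrative assumptions only.

import queue
from dataclasses import dataclass


@dataclass
class Event:
    source: str   # e.g. "skin", "safety_zone", "voice", "haptic"
    payload: str  # e.g. "contact", "operator_inside", "resume"


class Coordinator:
    def __init__(self) -> None:
        self.events: "queue.Queue[Event]" = queue.Queue()
        self.robot_paused = False

    def publish(self, source: str, payload: str) -> None:
        self.events.put(Event(source, payload))

    def handle_next(self) -> None:
        event = self.events.get()
        if event.source in ("skin", "safety_zone"):
            # Any contact or zone violation pauses robot motion.
            self.robot_paused = True
            print(f"[SAFETY] {event.source}: {event.payload} -> robot paused")
        elif event.source in ("voice", "haptic") and event.payload == "resume":
            self.robot_paused = False
            print(f"[COMMAND] {event.source}: resume -> robot running")


if __name__ == "__main__":
    coordinator = Coordinator()
    coordinator.publish("safety_zone", "operator_inside")  # operator enters zone
    coordinator.publish("voice", "resume")                 # operator voice command
    coordinator.handle_next()
    coordinator.handle_next()
```

In the actual system, such events would originate from the sensor and wearable interfaces described above, and the coordinator would command the robot controller rather than printing messages.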

Acknowledgements

This research was supported by the EC research project “ROBO-PARTNER – Seamless Human-Robot Cooperation for Intelligent, Flexible and Safe Operations in the Assembly Factories of the Future” (Grant Agreement: 608855) (www.robo-partner.eu). The authors would especially like to thank Electrolux Italia S.P.A. (ELUX) for providing valuable input on the current status, challenges, and requirements of the refrigerator assembly line used as the case study in the present work.

Author information

Corresponding author

Correspondence to Sotiris Makris.

Cite this article

Papanastasiou, S., Kousi, N., Karagiannis, P. et al. Towards seamless human robot collaboration: integrating multimodal interaction. Int J Adv Manuf Technol 105, 3881–3897 (2019). https://doi.org/10.1007/s00170-019-03790-3

Keywords

  • Human robot collaboration
  • Interaction
  • Augmented reality
  • Wearable devices
  • Safety
  • Integration