
International Journal of Social Robotics, Volume 11, Issue 3, pp 373–388

A Study on Different User Interfaces for Teaching Virtual Borders to Mobile Robots

  • Dennis Sprute
  • Klaus Tönnies
  • Matthias König

Abstract

Human-aware robot navigation is essential for increasing the acceptance of mobile service robots in human-centered environments, e.g. home environments. Robots need to navigate in a human-acceptable way according to the users’ conventions, presence and needs. To address the users’ needs, we employ virtual borders, i.e. non-physical borders that robots respect during operation, to effectively restrict the workspace of a mobile robot and change its navigational behavior. To this end, we consider different user interfaces, i.e. visual markers, a laser pointer, a graphical user interface and an RGB-D Google Tango tablet with an augmented reality application, that allow non-expert users to define virtual borders flexibly and interactively. These user interfaces were evaluated with respect to their correctness, flexibility, accuracy, teaching effort and user experience. Experimental results show that the RGB-D Google Tango tablet yields the best overall results compared to the other user interfaces. Apart from a low teaching effort and high flexibility and accuracy, it received the highest ratings for intuitiveness, comfort, learnability and its feedback system in a comprehensive user study with 25 participants.
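The abstract's notion of a virtual border can be made concrete with a small sketch. Assuming the robot plans on a 2D occupancy grid map (a common representation for mobile robots; the paper's own pipeline is not reproduced here), a user-defined border polyline can be rasterized into that grid as occupied cells, so that any standard grid-based planner treats the border like a physical wall. All names and parameters below are illustrative assumptions, not taken from the paper:

    import numpy as np

    OCCUPIED, FREE = 100, 0  # cell values; 100 is a common "lethal" cost

    def world_to_grid(point, origin, resolution):
        # Convert a world coordinate (metres) to grid indices;
        # assumes the point lies inside the map area.
        return (int((point[0] - origin[0]) / resolution),
                int((point[1] - origin[1]) / resolution))

    def rasterize_border(grid, polyline, origin, resolution):
        # Mark every cell crossed by the user-defined border polyline
        # as occupied, so a standard planner treats it as a wall.
        for p, q in zip(polyline, polyline[1:]):
            x0, y0 = world_to_grid(p, origin, resolution)
            x1, y1 = world_to_grid(q, origin, resolution)
            dx, dy = abs(x1 - x0), -abs(y1 - y0)
            sx = 1 if x0 < x1 else -1
            sy = 1 if y0 < y1 else -1
            err = dx + dy
            while True:  # Bresenham line walk between border points
                grid[y0, x0] = OCCUPIED
                if (x0, y0) == (x1, y1):
                    break
                e2 = 2 * err
                if e2 >= dy:
                    err += dy
                    x0 += sx
                if e2 <= dx:
                    err += dx
                    y0 += sy
        return grid

    # Example: 10 m x 10 m map at 5 cm resolution; the border keeps
    # the robot out of the right half of the room.
    grid = np.full((200, 200), FREE, dtype=np.int8)
    border = [(5.0, 0.0), (5.0, 9.9)]  # metres, in the map frame
    rasterize_border(grid, border, origin=(0.0, 0.0), resolution=0.05)

Because the marked cells look like obstacles to the planner, no generated path crosses the border, which mirrors the paper's idea of a non-physical border that the robot respects while working.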

Keywords

Human-aware navigation · Socially-aware navigation · Human-centered environments · Social human–robot interaction · Virtual borders · User interfaces

Notes

Acknowledgements

We would like to thank all participants of the user study for their time and valuable feedback.

Funding

This study was funded by the German Federal Ministry of Education and Research (Grant Number 03FH006PX5).

Compliance with Ethical Standards

Conflict of interest

The authors declare that they have no conflict of interest.


Copyright information

© Springer Nature B.V. 2018

Authors and Affiliations

  1. Bielefeld University of Applied Sciences, Minden, Germany
  2. Faculty of Computer Science, Otto-von-Guericke University Magdeburg, Magdeburg, Germany
