Towards a Wearable Interface for Immersive Telepresence in Robotics

  • Uriel Martinez-Hernandez
  • Michael Szollosy
  • Luke W. Boorman
  • Hamideh Kerdegari
  • Tony J. Prescott
Conference paper
Part of the Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering book series (LNICST, volume 196)

Abstract

In this paper we present an architecture for the study of telepresence, immersion and human-robot interaction. The architecture is built around a wearable interface that provides the human user with visual, audio and tactile feedback from a remote location. We interface the system with the iCub humanoid robot, which mimics many human sensory modalities, including vision (with gaze control) and touch, offering a richly immersive experience for the human user. Our wearable interface allows human participants to observe and explore a remote location, while also communicating verbally with others located in that environment. The approach has been tested over a range of distances, between university and business premises, using wired, wireless and Internet-based connections, with data compression employed to maintain the quality of the experience for the user. Initial testing has shown the wearable interface to be a robust system for immersive teleoperation, with a myriad of potential applications, particularly in social networking, gaming and entertainment.
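The abstract notes that sensory data streamed from the remote robot is compressed to preserve the quality of the experience over wired, wireless and Internet links. As an illustrative sketch only (the paper does not detail its compression scheme, and its transport layer runs over YARP), the round trip for a simple lossless approach can be shown with Python's standard-library `zlib`:

```python
import zlib

def compress_frame(frame: bytes, level: int = 6) -> bytes:
    """Losslessly compress a raw frame buffer before transmission."""
    return zlib.compress(frame, level)

def decompress_frame(payload: bytes) -> bytes:
    """Recover the original frame buffer on the receiving side."""
    return zlib.decompress(payload)

if __name__ == "__main__":
    # Synthetic 320x240 RGB frame (a flat grey image, purely for illustration).
    frame = bytes([128]) * (320 * 240 * 3)
    payload = compress_frame(frame)

    # Lossless round trip: the receiver reconstructs the frame exactly.
    assert decompress_frame(payload) == frame
    print(f"raw: {len(frame)} bytes, compressed: {len(payload)} bytes")
```

In practice a telepresence system would favour a lossy video codec for camera frames, since lossless schemes give modest ratios on natural images; the sketch above only demonstrates the bandwidth-versus-fidelity trade-off at the stream boundary.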

Keywords

Telepresence · Immersion · Wearable computing · Human-robot interaction · Virtual reality


Copyright information

© ICST Institute for Computer Sciences, Social Informatics and Telecommunications Engineering 2017

Authors and Affiliations

  • Uriel Martinez-Hernandez (1, 2)
  • Michael Szollosy (2)
  • Luke W. Boorman (2)
  • Hamideh Kerdegari (2)
  • Tony J. Prescott (2)
  1. Institute of Design, Robotics and Optimisation, University of Leeds, Leeds, UK
  2. Sheffield Robotics Laboratory, University of Sheffield, Sheffield, UK
