Feel It on Your Fingers: Dataglove with Vibrotactile Feedback for Virtual Reality and Telerobotics

  • Burathat Junput
  • Xuyi Wei
  • Lorenzo Jamone
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11649)

Abstract

With the rise of Virtual Reality (VR) applications, it is interesting to explore how immersion can be improved using affordable technologies, especially by providing haptic feedback on the user's hands. Indeed, while several commercial products exist that can be used as input devices (i.e. from the user to the virtual reality), such as data gloves or optical trackers, solutions for effective feedback (i.e. from the virtual reality to the user) are still lacking, especially at low prices. Here we describe the design and realization of an affordable data glove that provides vibrotactile feedback to human users through small vibrating motors, and we report preliminary user studies that assess its effectiveness; interestingly, combined with a commercially available optical tracker (the Leap Motion) used as the input device, the data glove can support a wide range of Virtual Reality and Telerobotics applications. The user studies include (i) rendering feedback to multiple fingers at the same time and recording how many stimuli the users could correctly differentiate, and (ii) simulating the stiffness of a virtual object and testing, through a Just Noticeable Difference (JND) experiment, whether participants could differentiate between two objects chosen among 20 pairs of objects with varying stiffness. We found that participants (i) can easily detect simultaneous feedback on up to two fingers, but struggle to precisely localize feedback on more than three fingers, and (ii) can differentiate virtual objects of different stiffness by virtually “squeezing” them, up to a certain JND.
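To make the stiffness-rendering idea concrete, below is a minimal sketch of one plausible way to map a virtual squeeze to vibration intensity on the fingertip motors, assuming the motors are driven by PWM from an Arduino-compatible microcontroller and that the squeeze depth is supplied by the hand-tracking input (e.g. streamed from the Leap Motion over serial). All pin numbers, constants, and function names are illustrative assumptions, not the authors' actual firmware.

    // Hypothetical sketch: render virtual-object stiffness as vibration
    // intensity. A deeper squeeze into a stiffer object yields a stronger
    // vibration, saturating at full scale.
    const int MOTOR_PINS[5] = {3, 5, 6, 9, 10};  // one PWM-capable pin per fingertip motor

    // Map penetration depth (mm) into an object of given stiffness (N/mm)
    // to a PWM duty value in 0..255, using a simple Hooke's-law contact model.
    int squeezeToPwm(float depthMm, float stiffnessNPerMm) {
      float force = stiffnessNPerMm * depthMm;   // contact force estimate
      const float maxForce = 5.0f;               // force rendered as full-strength vibration (assumed)
      float duty = force / maxForce;
      if (duty < 0.0f) duty = 0.0f;
      if (duty > 1.0f) duty = 1.0f;
      return (int)(duty * 255.0f);
    }

    void setup() {
      for (int i = 0; i < 5; i++) pinMode(MOTOR_PINS[i], OUTPUT);
    }

    void loop() {
      // Example: a moderately stiff object squeezed 2 mm by the index finger,
      // while the other fingertip motors stay off.
      analogWrite(MOTOR_PINS[1], squeezeToPwm(2.0f, 1.5f));
      delay(20);                                 // ~50 Hz update rate
    }

Because each fingertip motor has its own PWM channel, the same loop can drive several fingers simultaneously, which is the configuration exercised in the multi-finger localization study; the JND study then varies only the stiffness parameter between the two objects of each pair.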

Keywords

Vibrotactile feedback · Data glove · Vibration frequency · JND · Positional difference · Frequency mappings · Stiffness · Telerobotics

Acknowledgments

This work was partially supported by the EPSRC UK: project NCNR, EP/R02572X/1, and project MAN3, EP/S00453X/1.

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. ARQ (Advanced Robotics at Queen Mary), School of Electronic Engineering and Computer Science, Queen Mary University of London, London, UK