Feel It on Your Fingers: Dataglove with Vibrotactile Feedback for Virtual Reality and Telerobotics
With the rise of Virtual Reality (VR) applications, it is important to investigate how immersion can be improved, especially by providing haptic feedback to the user's hands using affordable technologies. Indeed, while several commercial products exist that can be used as input devices (i.e. from the user to the virtual reality), such as data gloves or optical trackers, solutions for effective feedback (i.e. from the virtual reality to the user) are still lacking, especially at low cost. We describe here the design and realization of an affordable data glove that provides vibrotactile feedback to human users through small vibrating motors, and we report preliminary user studies to assess its effectiveness; notably, when combined with a commercially available optical tracker (i.e. the Leap Motion) used as the input device, the data glove can support a wide range of Virtual Reality and Telerobotics applications. The user studies include (i) rendering feedback to multiple fingers at the same time and recording how many stimuli the users could correctly differentiate, and (ii) simulating the stiffness of a virtual object and testing, through a Just Noticeable Difference (JND) experiment, whether participants could distinguish two objects chosen among 20 pairs of objects of varying stiffness. We found that participants (i) can easily detect simultaneous feedback on up to two fingers, but struggle to precisely localize feedback delivered to more than three fingers, and (ii) can differentiate virtual objects of different stiffness by virtually "squeezing" them, up to a certain JND.
Keywords: Vibrotactile feedback · Data glove · Vibration frequency · JND · Positional difference · Frequency mappings · Stiffness · Telerobotics
This work was partially supported by the EPSRC UK: project NCNR, EP/R02572X/1, and project MAN3, EP/S00453X/1.
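To illustrate the kind of stiffness rendering described in the abstract, the sketch below shows one plausible mapping from a virtual object's stiffness and the depth of a virtual "squeeze" to a vibration motor's PWM duty cycle. This is a minimal illustration under assumed conventions, not the paper's actual implementation: the function name, the Hooke's-law force model, and the 8-bit PWM range are all assumptions.

```python
# Hypothetical stiffness-to-vibration mapping (illustrative only; names and
# the force model are assumptions, not taken from the paper).

def vibration_duty(stiffness: float, squeeze_depth: float,
                   max_force: float = 5.0) -> int:
    """Map a Hooke's-law restoring force F = k * x to an 8-bit PWM duty cycle.

    stiffness: virtual spring constant k (N/m), assumed
    squeeze_depth: finger penetration x into the virtual object (m)
    max_force: force at which the motor saturates (N), assumed
    """
    force = stiffness * squeeze_depth      # F = k * x
    ratio = min(force / max_force, 1.0)    # clamp to the motor's range
    return int(round(ratio * 255))         # 8-bit PWM duty cycle

# A stiffer object reaches a stronger vibration at the same squeeze depth,
# which is what lets participants tell two stiffnesses apart in a JND test:
soft = vibration_duty(stiffness=50.0, squeeze_depth=0.02)    # -> 51
stiff = vibration_duty(stiffness=200.0, squeeze_depth=0.02)  # -> 204
```

Under this mapping, two objects are distinguishable when the duty-cycle difference they produce exceeds the participant's vibrotactile JND, which is the quantity the second user study estimates.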