Abstract
The current generation of virtual reality (VR) technologies has improved substantially over legacy systems, particularly in the resolution and latency of the visual display. The presentation of haptic cues remains challenging, however, because haptic systems do not generalise well over the range of stimuli (both tactile and proprioceptive) normally present when interacting with objects in the world. This study investigated whether veridical tactile and proprioceptive cues lead to more efficient interaction with a virtual environment. Interaction with the world results in spatial and temporal correlation of tactile, proprioceptive and visual cues. When cues in VR are similarly correlated, observers experience a sense of embodiment and agency over their avatars. We investigated whether sensorimotor performance was mediated by embodiment of the avatar hands. Participants performed a Fitts’ tapping task under four conditions: VR with no haptics, VR with active haptics, VR with passive haptics, and a real touchscreen. The active-haptics condition provided abstract tactile cues, whereas the passive-haptics condition provided veridical tactile and proprioceptive cues. An additional (hybrid-haptics) condition simulated an ideal passive-haptic system. Movement efficiency (throughput) and embodiment were higher for the passive-haptics condition than for the active- and no-haptics conditions. However, components of embodiment (perceived agency and ownership) did not predict unique variance in throughput. The improved sensorimotor performance and higher ratings of presence and realism support the use of passive haptics in VR environments where objects are in known and stable locations, regardless of whether performance was mediated by the sense of embodiment.
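The throughput measure used here follows the effective-width convention of ISO 9241-411 and Soukoreff and MacKenzie (2004): target width is replaced by the observed scatter of tap endpoints, and throughput is the effective index of difficulty divided by movement time. A minimal sketch, with hypothetical trial values chosen purely for illustration:

```python
import math
from statistics import mean, stdev

def throughput(amplitudes, endpoint_errors, movement_times):
    """Fitts' throughput (bits/s) for one participant in one condition,
    using the effective-width convention (ISO 9241-411;
    Soukoreff & MacKenzie, 2004).

    amplitudes      -- actual movement distances per trial (e.g. mm)
    endpoint_errors -- signed endpoint deviations along the task axis (mm)
    movement_times  -- movement times per trial (s)
    """
    ae = mean(amplitudes)                    # effective amplitude
    we = 4.133 * stdev(endpoint_errors)      # effective width (captures ~96% of taps)
    ide = math.log2(ae / we + 1)             # effective index of difficulty (bits)
    return ide / mean(movement_times)        # bits per second

# Hypothetical data for five tapping trials:
amps = [248, 252, 250, 251, 249]       # mm
errs = [3.0, -2.0, 1.5, -1.0, 0.5]     # mm
mts  = [0.61, 0.58, 0.64, 0.60, 0.57]  # s

tp = throughput(amps, errs, mts)       # roughly 8.3 bits/s for these values
```

In practice, throughput is computed per participant and condition and then averaged (the "mean of means"), which is what allows conditions such as passive versus active haptics to be compared on a single efficiency scale.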
Data availability
Data are available at https://doi.org/10.6084/m9.figshare.14745906.v1. The experiment was approved by the University of Queensland Health and Behavioural Sciences Low and Negligible Risk Ethics Sub-Committee. Participants gave informed consent to participate and for their anonymous results to be published.
References
Batmaz, A. U., Mutasim, A. K., Malekmakan, M., Sadr, E., & Stuerzlinger, W. (2020). Touch the wall: Comparison of virtual and augmented reality with conventional 2D screen eye-hand coordination training systems. IEEE Conference on Virtual Reality and 3D User Interfaces (VR). https://doi.org/10.1109/VR46266.2020.00037
Benjamini, Y., & Hochberg, Y. (1995). Controlling the false discovery rate: A practical and powerful approach to multiple testing. Journal of the Royal Statistical Society: Series B, 57(1), 289–300. https://doi.org/10.1111/j.2517-6161.1995.tb02031.x
Botvinick, M., & Cohen, J. (1998). Rubber hands ‘feel’ touch that eyes see. Nature, 391(6669), 756. https://doi.org/10.1038/35784
Culbertson, H., Schorr, S. B., & Okamura, A. M. (2018). Haptics: The present and future of artificial touch sensation. Annual Review of Control, Robotics, and Autonomous Systems, 1, 385–409. https://doi.org/10.1146/annurev-control-060117-105043
Faul, F., Erdfelder, E., Buchner, A., & Lang, A.-G. (2009). Statistical power analyses using G*Power 3.1: Tests for correlation and regression analyses. Behavior Research Methods, 41, 1149–1160.
Fitts, P. M. (1954). The information capacity of the human motor system in controlling the amplitude of movement. Journal of Experimental Psychology, 47, 381–391. https://doi.org/10.1037/h0055392
Fu, M. J., Hershberger, A. D., Sano, K., & Çavuşoğlu, M. C. (2011). Effect of visuo-haptic co-location on 3D Fitts’ task performance. In 2011 IEEE/RSJ International Conference on Intelligent Robots and Systems (pp. 3460–3467). https://doi.org/10.1109/IROS.2011.6094707
Fujisaki, W., & Nishida, S. Y. (2009). Audio–tactile superiority over visuo–tactile and audio–visual combinations in the temporal resolution of synchrony perception. Experimental Brain Research, 198(2–3), 245–259. https://doi.org/10.1007/s00221-009-1870-x
Harrar, V., & Harris, L. R. (2008). The effect of exposure to asynchronous audio, visual, and tactile stimulus combinations on the perception of simultaneity. Experimental Brain Research, 186(4), 517–524. https://doi.org/10.1007/s00221-007-1253-0
Hinckley, K., Pausch, R., Goble, J. C., & Kassell, N. F. (1994). Passive real-world interface props for neurosurgical visualization. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 452–458).
Howard, I. P., & Rogers, B. J. (2002). Seeing in depth, vol. 2: Depth perception. University of Toronto Press.
International Organization for Standardization. (2012). ISO/TS 9241-411 Ergonomics of human-system interaction — Part 411: Evaluation methods for the design of physical input devices. ISO/TS.
Joyce, R., & Robinson, S. K. (2019). Evaluation of a virtual reality environment for cockpit design. Proceedings of the Human Factors and Ergonomics Society Annual Meeting. https://doi.org/10.1177/1071181319631309
Kalckert, A., & Ehrsson, H. H. (2014). The moving rubber hand illusion revisited: Comparing movements and visuotactile stimulation to induce illusory ownership. Consciousness and Cognition, 26, 117–132. https://doi.org/10.1016/j.concog.2014.02.003
Kaplan, A. D., Cruit, J., Endsley, M., Beers, S. M., Sawyer, B. D., & Hancock, P. A. (2020). The effects of virtual reality, augmented reality, and mixed reality as training enhancement methods: A meta-analysis. Human Factors. Advance online publication, February 2020. https://doi.org/10.1177/0018720820904229
Kilteni, K., Groten, R., & Slater, M. (2012). The sense of embodiment in virtual reality. Presence: Teleoperators and Virtual Environments, 21(4), 373–387. https://doi.org/10.1162/PRES_a_00124
Ljubic, S., Glavinic, V., & Kukec, M. (2015). Finger-based pointing performance on mobile touchscreen devices: Fitts’ law fits. In M. Antona & C. Stephanidis (Eds.), UAHCI 2015, Part I, LNCS 9175 (pp. 318–329). Springer. https://doi.org/10.1007/978-3-319-20678-3_31
Long, B., Seah, S. A., Carter, T., & Subramanian, S. (2014). Rendering volumetric haptic shapes in mid-air using ultrasound. ACM Transactions on Graphics (TOG), 33(6), 1–10. https://doi.org/10.1145/2661229.2661257
Longo, M. R., Schüür, F., Kammers, M. P., Tsakiris, M., & Haggard, P. (2008). What is embodiment? A psychometric approach. Cognition, 107(3), 978–998. https://doi.org/10.1016/j.cognition.2007.12.004
MacKenzie, I. S. (2015). Fitts’ throughput and the remarkable case of touch-based target selection. In M. Kurosu (Ed.), Human-Computer Interaction, Part II, HCII 2015, LNCS 9170 (pp. 238–249). Springer. https://doi.org/10.1007/978-3-319-20916-6_23
Massie, T. H., & Salisbury, J. K. (1994, November). The PHANTOM haptic interface: A device for probing virtual objects [Paper presentation]. ASME Winter Annual Meeting, Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, Chicago, IL, United States. http://ai.stanford.edu/~jks/pubs/1994-Massie-salisbury-asme-sohi.pdf
McAnally, K., Wallwork, K. & Wallis, G. (2021). The efficiency of visually-guided movement in real and virtual space [Manuscript submitted for publication]. School of Human Movement and Nutrition Sciences, The University of Queensland.
Perret, J., & Vander Poorten, E. (2018, June 25-27). Touching virtual reality: a review of haptic gloves [Paper presentation]. ACTUATOR 2018; 16th International Conference on New Actuators, Bremen, Germany. https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=8470813
Sanchez-Vives, M. V., Spanlang, B., Frisoli, A., Bergamasco, M., & Slater, M. (2010). Virtual hand illusion induced by visuomotor correlations. PLoS One, 5(4), e10381. https://doi.org/10.1371/journal.pone.0010381
Scarfe, P., & Glennerster, A. (2019). The science behind virtual reality displays. Annual Review of Vision Science, 5, 529–547. https://doi.org/10.1146/annurev-vision-091718-014942
Schwind, V., Leusmann, J., & Henze, N. (2019). Understanding visual-haptic integration of avatar hands using a Fitts’ law task in virtual reality. In: Proceedings of MuC’19, Hamburg, Germany. https://doi.org/10.1145/3340764.3340769
Shapiro, L., & Spaulding, S. (2021). Embodied cognition. In E. N. Zalta (Ed.), The Stanford Encyclopedia of Philosophy (Fall 2021 ed.). https://plato.stanford.edu/archives/fall2021/entries/embodied-cognition/
Shapiro, L., & Stolz, S. A. (2019). Embodied cognition and its significance for education. Theory and Research in Education, 17(1), 19–39. https://doi.org/10.1177/1477878518822149
Slater, M., Pérez Marcos, D., Ehrsson, H., & Sanchez-Vives, M. V. (2009). Inducing illusory ownership of a virtual body. Frontiers in Neuroscience, 3, 29. https://doi.org/10.3389/neuro.01.029.2009
Soukoreff, R. W., & MacKenzie, I. S. (2004). Towards a standard for pointing device evaluation, perspectives on 27 years of Fitts’ law research in HCI. International Journal of Human-Computer Studies, 61, 751–789. https://doi.org/10.1016/j.ijhcs.2004.09.001
Spence, C., Shore, D. I., & Klein, R. M. (2001). Multisensory prior entry. Journal of Experimental Psychology: General, 130(4), 799. https://doi.org/10.1037/0096-3445.130.4.799
Teather, R. J., & Stuerzlinger, W. (2010, May 31-June 2). Target pointing in 3D user interfaces [Poster presentation]. Graphics Interface 2010. http://www.cas.mcmaster.ca/~teather/pdfs/GI2010.pdf
Teather, R. J., & Stuerzlinger, W. (2011, March 19–20). Pointing at 3D targets in a stereo head-tracked virtual environment. IEEE Symposium on 3D User Interfaces, Singapore. https://doi.org/10.1109/3DUI.2011.5759222
Toet, A., Kuling, I. A., Krom, B. N., & Van Erp, J. B. (2020). Toward enhanced teleoperation through embodiment. Frontiers Robotics AI, 7, 14. https://doi.org/10.3389/frobt.2020.00014
Viciana-Abad, R., Lecuona, A. R., & Poyade, M. (2010). The influence of passive haptic feedback and difference interaction metaphors on presence and task performance. Presence: Teleoperators and Virtual Environments, 19(3), 197–212. https://doi.org/10.1162/pres.19.3.197
Yuan, Y., & Steed, A. (2010). Is the rubber hand illusion induced by immersive virtual reality? In 2010 IEEE Virtual Reality Conference (VR) (pp. 95–102). IEEE. https://doi.org/10.1109/VR.2010.5444807
Funding
This research was supported by ARC Discovery project grant DP190100533 (to GW) and ARC Linkage grant LP180100377 (Industry Partner: Boeing) (to GW).
Ethics declarations
Conflict of interest
The authors declare that they have no conflicts of interest.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Cite this article
McAnally, K., Wallis, G. Visual–haptic integration, action and embodiment in virtual reality. Psychological Research 86, 1847–1857 (2022). https://doi.org/10.1007/s00426-021-01613-3