
Visual–haptic integration, action and embodiment in virtual reality

  • Original Article
  • Published in: Psychological Research

Abstract

The current generation of virtual reality (VR) technologies has improved substantially on legacy systems, particularly in the resolution and latency of their visual displays. The presentation of haptic cues remains challenging, however, because haptic systems do not generalise well over the range of stimuli (both tactile and proprioceptive) normally present when interacting with objects in the world. This study investigated whether veridical tactile and proprioceptive cues lead to more efficient interaction with a virtual environment. Interaction in the world results in spatial and temporal correlation of tactile, proprioceptive and visual cues. When cues in VR are similarly correlated, observers experience a sense of embodiment and agency over their avatars. We investigated whether sensorimotor performance was mediated by embodiment of the avatar hands. Participants performed a Fitts’ tapping task in different conditions (VR with no haptics, active haptics, passive haptics, and on a real touchscreen). The active-haptics condition provided abstract tactile cues and the passive-haptics condition provided veridical tactile and proprioceptive cues. An additional (hybrid-haptics) condition simulated an ideal passive-haptic system. Movement efficiency (throughput) and embodiment were higher for the passive condition than for the active and no-haptics conditions. However, components of embodiment (perceived agency and ownership) did not predict unique variance in throughput. Improved sensorimotor performance and higher ratings of presence and realism support the use of passive haptics in VR environments where objects are in known and stable locations, regardless of whether performance was mediated by the sense of embodiment.
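The throughput measure used in Fitts’ tapping tasks is conventionally the effective throughput of ISO 9241-9, which combines movement distance, endpoint spread and movement time into a single bits-per-second figure. As a minimal sketch (not the authors’ analysis code; the function name and inputs are illustrative), it can be computed per participant and condition like this:

```python
import math
from statistics import mean, stdev

def effective_throughput(target_distance, endpoint_errors, movement_times):
    """Effective throughput (bits/s) per ISO 9241-9.

    target_distance : nominal movement amplitude D (e.g. mm)
    endpoint_errors : signed deviations of tap endpoints from target centre
    movement_times  : movement times in seconds, one per tap
    """
    # Effective width: 4.133 * SD of endpoint deviations
    # (scales the spread to a 96% hit rate).
    w_e = 4.133 * stdev(endpoint_errors)
    # Effective index of difficulty, Shannon formulation.
    id_e = math.log2(target_distance / w_e + 1.0)
    # Throughput: difficulty (bits) over mean movement time (s).
    return id_e / mean(movement_times)

# Hypothetical data: 100 mm target distance, small endpoint scatter,
# mean movement time of 1 s.
tp = effective_throughput(100.0, [-1.0, 0.0, 1.0, -0.5, 0.5], [1.0, 0.9, 1.1, 1.0, 1.0])
```

Higher throughput indicates more efficient pointing; comparing this value across the haptic conditions is what "movement efficiency (throughput) was higher for the passive condition" refers to.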



Data availability

Data are available at https://doi.org/10.6084/m9.figshare.14745906.v1. The experiment was approved by the University of Queensland Health and Behavioural Sciences Low and Negligible Risk Ethics Sub-Committee. Participants gave informed consent to participate and for their anonymous results to be published.


Funding

This research was supported by ARC Discovery project grant DP190100533 (to GW) and ARC Linkage grant LP180100377 (Industry Partner: Boeing) (to GW).

Author information


Corresponding author

Correspondence to Ken McAnally.

Ethics declarations

Conflict of interest

The authors declare that they have no conflicts of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

McAnally, K., Wallis, G. Visual–haptic integration, action and embodiment in virtual reality. Psychological Research 86, 1847–1857 (2022). https://doi.org/10.1007/s00426-021-01613-3

