The Effectiveness of Multimodal Sensory Feedback on VR Users’ Behavior in an L-Collision Problem

  • Sumin Kim
  • Krzysztof Izdebski
  • Peter König
Conference paper
Part of the Lecture Notes in Mechanical Engineering book series (LNME)

Abstract

Virtual Reality (VR) is highly dependent on visual information, although it offers multimodal channels for sensory feedback. In this study, we compared the effectiveness of different sensory modalities in the context of collision avoidance in an industrial manufacturing process. Participants performed a pick-and-place task with L-shaped objects on a virtual workstation. In a between-subjects design, each participant performed the task in one of four conditions: Baseline, Auditory, Haptic, or Visual. We measured the timing and accuracy of the performed actions. Statistical testing with an ANOVA showed a significant main effect, i.e., a difference between the conditions. We observed the lowest number of collisions in the auditory condition, followed by the haptic, baseline, and visual conditions. Post hoc tests revealed a significant difference between the auditory condition, the most accurate, and the visual condition, the least accurate. This implies that providing additional feedback via the visual modality is not optimal and that a fully multimodal interface increases effectiveness.
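
The following snippet is a purely illustrative sketch of the kind of analysis named in the abstract: a one-way ANOVA over the four feedback conditions followed by pairwise post hoc comparisons. The collision counts are invented placeholder numbers, not the study's data, and the Bonferroni-corrected t-tests stand in for whichever post hoc procedure the paper actually used.

    # Illustrative sketch only: one-way ANOVA across four feedback conditions,
    # followed by Bonferroni-corrected pairwise t-tests. The collision counts
    # below are invented placeholder data, not the study's results.
    from itertools import combinations
    from scipy import stats

    collisions = {
        "Baseline": [12, 15, 11, 14, 13, 16],
        "Auditory": [6, 8, 5, 7, 9, 6],
        "Haptic":   [9, 10, 8, 11, 9, 10],
        "Visual":   [14, 17, 15, 16, 18, 15],
    }

    # Omnibus test: do mean collision counts differ between conditions?
    f_stat, p_value = stats.f_oneway(*collisions.values())
    print(f"ANOVA: F = {f_stat:.2f}, p = {p_value:.4f}")

    # Post hoc pairwise comparisons with a simple Bonferroni correction.
    pairs = list(combinations(collisions, 2))
    alpha = 0.05 / len(pairs)
    for a, b in pairs:
        t, p = stats.ttest_ind(collisions[a], collisions[b])
        flag = "significant" if p < alpha else "n.s."
        print(f"{a} vs {b}: t = {t:.2f}, p = {p:.4f} ({flag})")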

Keywords

VR · Multisensory feedback · Collision · Simulation

Notes

Acknowledgments

We gratefully acknowledge support by the project ErgoVR (BMBF, KMU Innovativ V5KMU17/221) and SALT AND PEPPER Software GmbH & Co. KG.

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Institute of Cognitive Science, Universität Osnabrück, Osnabrück, Germany
  2. Institut für Neurophysiologie und Pathophysiologie, Universitätsklinikum Hamburg-Eppendorf, Hamburg, Germany
  3. SALT AND PEPPER Software GmbH & Co. KG, Osnabrück, Germany