
Non-verbal Communication and Joint Attention Between People with and Without Visual Impairments: Deriving Guidelines for Inclusive Conversations in Virtual Realities

  • Conference paper
  • First Online:
Computers Helping People with Special Needs (ICCHP-AAATE 2022)

Part of the book series: Lecture Notes in Computer Science ((LNCS,volume 13341))

Abstract

With the emergence of mainstream virtual reality (VR) platforms for social interaction, non-verbal communicative cues are increasingly being transmitted into virtual environments. Since VR is primarily a visual medium, accessible VR solutions are required for people with visual impairments (PVI). However, existing approaches do not account for social interactions, and PVI are therefore excluded from this type of experience. To address this issue, we conducted semi-structured interviews with eleven participants, seven of whom were PVI and four of whom were partners or close friends without visual impairments, to explore how non-verbal cues and joint attention are used and perceived in everyday social situations and conversations. Our goal was to derive guidelines for inclusive conversations in virtual environments for PVI. Our findings suggest that gaze, head direction, head movements, and facial expressions are important to both groups in conversations but are often difficult for PVI to identify visually. Based on these findings, we provide concrete suggestions for the design of social VR spaces that are inclusive of PVI.


Notes

  1. www.vive.com/us/accessory/facial-tracker/.


Acknowledgments

This research was supported by the BMBF (project HIVE, grant no. 16SV8183). We would also like to thank the reviewers and ACs for their work and valuable feedback.

Author information

Corresponding author

Correspondence to Markus Wieland.


Copyright information

© 2022 Springer Nature Switzerland AG

About this paper


Cite this paper

Wieland, M., Thevin, L., Schmidt, A., Machulla, T. (2022). Non-verbal Communication and Joint Attention Between People with and Without Visual Impairments: Deriving Guidelines for Inclusive Conversations in Virtual Realities. In: Miesenberger, K., Kouroupetroglou, G., Mavrou, K., Manduchi, R., Covarrubias Rodriguez, M., Peňáz, P. (eds.) Computers Helping People with Special Needs. ICCHP-AAATE 2022. Lecture Notes in Computer Science, vol. 13341. Springer, Cham. https://doi.org/10.1007/978-3-031-08648-9_34

  • DOI: https://doi.org/10.1007/978-3-031-08648-9_34

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-08647-2

  • Online ISBN: 978-3-031-08648-9

  • eBook Packages: Computer Science, Computer Science (R0)
