Accessibility of Co-Located Meetings

Introduction to the Special Thematic Session

Conference paper, published in Computers Helping People with Special Needs (ICCHP-AAATE 2022)

Abstract

Non-verbal communication is an important carrier of information. Even though blind and visually impaired persons can hear the spoken word, up to 60% of the overall information still remains inaccessible to them because of its visual character [11]. However, the spectrum of non-verbal communication elements is wide, and not all of them are equally important. For group meetings in particular, facial expressions and pointing gestures are relevant; these need to be captured, interpreted, and output to the blind and visually impaired person.

This session first presents a systematic approach to gathering the accessibility requirements of blind and visually impaired persons, from which two typical requirements are selected and discussed in more detail. Solutions for capturing and interpreting the corresponding non-verbal cues are described, and finally the session introduces a concept for accessible user interfaces.
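To make the capture, interpret, and output steps mentioned above more concrete, the following is a minimal, hypothetical Python sketch of how detected cues (pointing gestures, facial expressions) might be turned into short textual notifications for a screen reader or Braille display. All class and function names (NonVerbalCue, interpret, output) are illustrative assumptions, not part of the systems presented in this session; the actual camera-based recognition pipelines are described in [1, 8, 9].

```python
# Illustrative sketch only: a hypothetical data model for routing non-verbal
# cues (pointing gestures, facial expressions) to a non-visual output channel.
from dataclasses import dataclass
from typing import Iterable, Optional


@dataclass
class NonVerbalCue:
    """A single non-verbal cue captured during a meeting, e.g. by an RGB-D camera."""
    speaker: str                   # who produced the cue
    kind: str                      # "pointing" or "facial_expression"
    label: str                     # e.g. "points at", "smiles", "frowns"
    target: Optional[str] = None   # artifact pointed at, if any


def interpret(cue: NonVerbalCue) -> str:
    """Turn a captured cue into a short textual description."""
    if cue.kind == "pointing" and cue.target:
        return f"{cue.speaker} {cue.label} {cue.target}."
    return f"{cue.speaker} {cue.label}."


def output(cues: Iterable[NonVerbalCue]) -> None:
    """Forward the descriptions to a non-visual channel (here: plain text)."""
    for cue in cues:
        print(interpret(cue))  # stand-in for a screen reader or Braille display


if __name__ == "__main__":
    output([
        NonVerbalCue("Alice", "pointing", "points at", "sticky note 3"),
        NonVerbalCue("Bob", "facial_expression", "smiles"),
    ])
```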

Notes

  1. https://cs.anu.edu.au/few/AFEW.html.

References

  1. Dhingra, N., Valli, E., Kunz, A.: Recognition and localisation of pointing gestures using a RGB-D camera. In: Stephanidis, C., Antona, M. (eds.) HCII 2020. CCIS, vol. 1224, pp. 205–212. Springer, Cham (2020). https://doi.org/10.1007/978-3-030-50726-8_27

  2. El-Gayyar, M., ElYamany, H.F., Gaber, T., Hassanien, A.E.: Social network framework for deaf and blind people based on cloud computing. In: 2013 Federated Conference on Computer Science and Information Systems, pp. 1313–1319. IEEE (2013)

  3. Gorobets, V., Merkle, C., Kunz, A.: Pointing, pairing and grouping gesture recognition in virtual reality. In: Computers Helping People with Special Needs, 18th International Conference; Joint Conference ICCHP-AAATE, Lecco, Italy, Proceedings. Springer (2022)

  4. Kane, S.K., Wobbrock, J.O., Ladner, R.E.: Usable gestures for blind people: understanding preference and performance. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 413–422 (2011)

  5. Kim, J.: VIVR: presence of immersive interaction for visual impairment virtual reality. IEEE Access 8, 196151–196159 (2020)

  6. Koutny, R., Miesenberger, K.: Accessible user interface concept for business meeting tool support including spatial and non-verbal information for blind and visually impaired people. In: Computers Helping People with Special Needs, 18th International Conference; Joint Conference ICCHP-AAATE, Lecco, Italy, Proceedings. Springer (2022)

  7. Kunz, A., et al.: Accessibility of brainstorming sessions for blind people. In: Miesenberger, K., Fels, D., Archambault, D., Peňáz, P., Zagler, W. (eds.) ICCHP 2014. LNCS, vol. 8547, pp. 237–244. Springer, Cham (2014). https://doi.org/10.1007/978-3-319-08596-8_38

  8. Liechti, S., Dhingra, N., Kunz, A.: Detection and localisation of pointing, pairing and grouping gestures for brainstorming meeting applications. In: Stephanidis, C., Antona, M., Ntoa, S. (eds.) HCII 2021. CCIS, vol. 1420, pp. 22–29. Springer, Cham (2021). https://doi.org/10.1007/978-3-030-78642-7_4

  9. Lutfallah, M., Käch, B., Hirt, C., Kunz, A.: Emotion recognition - a tool to improve meeting experience for visually impaired. In: Computers Helping People with Special Needs, 18th International Conference; Joint Conference ICCHP-AAATE, Lecco, Italy, Proceedings. Springer (2022)

  10. Marinoiu, E., Zanfir, M., Olaru, V., Sminchisescu, C.: 3D human sensing, action and emotion recognition in robot assisted therapy of children with autism. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 2158–2167 (2018)

  11. Mehrabian, A., Ferris, S.: Inference of attitudes from nonverbal communication in two channels. J. Consult. Clin. Psychol. 3, 248–252 (1967)

  12. Oh Kruzic, C., Kruzic, D., Herrera, F., Bailenson, J.: Facial expressions contribute more than body movements to conversational outcomes in avatar-mediated virtual environments. Sci. Rep. 10(1), 1–23 (2020)

  13. Roth, D., Kleinbeck, C., Feigl, T., Mutschler, C., Latoschik, M.E.: Beyond replication: augmenting social behaviors in multi-user virtual realities. In: 2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), pp. 215–222. IEEE (2018)

  14. Tu, J.: Meetings in the Metaverse: Exploring Online Meeting Spaces through Meaningful Interactions in Gather.Town. Master’s thesis, University of Waterloo (2022)

  15. Wieland, M., Thevin, L., Machulla, T.: Non-verbal communication and joint attention between people with and without visual impairments: guidelines for inclusive conversations in virtual realities. In: Computers Helping People with Special Needs, 18th International Conference; Joint Conference ICCHP-AAATE, Lecco, Italy, Proceedings. Springer (2022)

  16. Yildirim, S., et al.: An acoustic study of emotions expressed in speech. In: Proceedings of the Eighth International Conference on Spoken Language Processing (2004)

Author information

Correspondence to Andreas Kunz.

Copyright information

© 2022 Springer Nature Switzerland AG

About this paper

Cite this paper

Kunz, A., Koutny, R., Miesenberger, K. (2022). Accessibility of Co-Located Meetings. In: Miesenberger, K., Kouroupetroglou, G., Mavrou, K., Manduchi, R., Covarrubias Rodriguez, M., Peňáz, P. (eds) Computers Helping People with Special Needs. ICCHP-AAATE 2022. Lecture Notes in Computer Science, vol 13341. Springer, Cham. https://doi.org/10.1007/978-3-031-08648-9_33

  • DOI: https://doi.org/10.1007/978-3-031-08648-9_33

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-08647-2

  • Online ISBN: 978-3-031-08648-9

  • eBook Packages: Computer Science, Computer Science (R0)
