
3D Interaction Accessible to Visually Impaired Users: A Systematic Review

  • Erico de Souza Veriscimo
  • João Luiz Bernardes Jr.
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9738)

Abstract

There are currently a large number of visually impaired people in Brazil and worldwide. Like any citizens, they have rights, among them the right to education and to other services that accelerate the process of social inclusion. With the advent of technology, three-dimensional virtual environments are increasingly being used in many areas. Often, however, these environments are not accessible to visually impaired users, creating a digital divide. In this context, a review of three-dimensional interaction techniques accessible to visually impaired people may facilitate the work of researchers and developers building such accessible applications. This paper presents the results of such a systematic literature review.

Keywords

3D interaction · Visual impairment · Virtual environments

References

  1. IBGE, Diretoria de Pesquisas, Departamento de População e Indicadores Sociais. Rio de Janeiro (2010)
  2. OMS, Organização Mundial da Saúde: Global data on visual impairments 2010. Geneva, 17 p. (2010). Available at: <http://www.who.int/entity/blindness/GLOBALDATAFINALforweb.pdf>. Accessed 21 Nov 2014
  3. ONU: Declaração de Direitos das Pessoas Deficientes. In: Assembléia Geral da Organização das Nações Unidas, 9 Dec (1975)
  4. White, G., Fitzpatrick, G., McAllister, G.: Toward accessible 3D virtual environments for the blind and visually impaired. In: Proceedings of the 3rd International Conference on Digital Interactive Media in Entertainment and Arts - DIMEA 2008, vol. 349, pp. 134–141. ACM, New York (2008)
  5. Kitchenham, B., Brereton, O., Budgen, D., Turner, M., Bailey, J., Linkman, S.: Systematic literature reviews in software engineering - a systematic literature review. Inf. Softw. Technol. 51(1), 7–15 (2009)
  6. Schätzle, S., Weber, B.: Towards vibrotactile direction and distance information for virtual reality and workstations for blind people. In: Antona, M., Stephanidis, C. (eds.) UAHCI 2015. LNCS, vol. 9176, pp. 148–160. Springer, Heidelberg (2015)
  7. Jain, D.: Path-guided indoor navigation for the visually impaired using minimal building retrofitting. In: Proceedings of the 16th International ACM SIGACCESS Conference on Computers and Accessibility, pp. 225–232 (2014)
  8. Gallo, S., Chapuis, D., Santos-Carreras, L., Kim, Y., Retornaz, P., Bleuler, H., Gassert, R.: Augmented white cane with multimodal haptic feedback. In: 2010 3rd IEEE RAS & EMBS International Conference on Biomedical Robotics and Biomechatronics, pp. 149–155 (2010)
  9. Shangguan, L., Yang, Z., Zhou, Z.: CrossNavi: enabling real-time crossroad navigation for the blind with commodity phones. In: UbiComp 2014 - Proceedings of the 2014 ACM International Joint Conference on Pervasive and Ubiquitous Computing (2014)
  10. Amemiya, T., Yamashita, J., Hirota, K., Hirose, M.: Virtual leading blocks for the deaf-blind: a real-time way-finder by verbal-nonverbal hybrid interface and high-density RFID tag space. In: IEEE Virtual Reality, pp. 165–287 (2004)
  11. Berretta, L., Soares, F., Ferreira, D.J., Nascimento, H.A.D., Cardoso, A., Lamounier, E.: Virtual environment manipulated by recognition of poses using Kinect: a study to help blind locomotion in unfamiliar surroundings. In: 2013 XV Symposium on Virtual and Augmented Reality (SVR), pp. 10–16 (2013)
  12. Chuang, C., Hsieh, J., Fan, K.: A smart handheld device navigation system based on detecting visual code. In: 2013 International Conference on Machine Learning and Cybernetics, vol. 1, pp. 1407–1412 (2013)
  13. Fallah, N., Apostolopoulos, I., Bekris, K., Folmer, E.: The user as a sensor. In: Proceedings of the 2012 ACM Annual Conference on Human Factors in Computing Systems - CHI 2012, p. 425 (2012)
  14. Heller, F., Borchers, J.: AudioTorch: using a smartphone as directional microphone in virtual audio spaces. In: Proceedings of the 16th International Conference on Human-Computer Interaction with Mobile Devices & Services, pp. 483–488 (2014)
  15. Jain, D.: Pilot evaluation of a path-guided indoor navigation system for visually impaired in a public museum. In: Proceedings of the 16th International ACM SIGACCESS Conference on Computers and Accessibility, pp. 273–274 (2014)
  16. Joseph, S.L., Zhang, X., Dryanovski, I., Xiao, J., Yi, C., Tian, Y.: Semantic indoor navigation with a blind-user oriented augmented reality. In: 2013 IEEE International Conference on Systems, Man, and Cybernetics, pp. 3585–3591 (2013)
  17. Magnusson, C., Molina, M., Grohn, K.R., Szymczak, D.: Pointing for non-visual orientation and navigation. In: Proceedings of the 6th Nordic Conference on Human-Computer Interaction: Extending Boundaries - NordiCHI 2010, p. 735 (2010)
  18. Magnusson, C., Waern, A., Grohn, K.R., Bjernryd, A., Bernhardsson, H., Jakobsson, A., Salo, J., Wallon, M., Hedvall, P.O.: Navigating the world and learning to like it. In: Proceedings of the 13th International Conference on Human Computer Interaction with Mobile Devices and Services - MobileHCI 2011, p. 285 (2011)
  19. Paneels, S.A., Olmos, A., Blum, J.R., Cooperstock, J.R.: Listen to it yourself!: evaluating usability of what's around me? for the blind. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 2107–2116 (2013)
  20. Raposo, N., Rios, H., Lima, D., Gadelha, B., Castro, T.: An application of mobility aids for the visually impaired. In: Proceedings of the 13th International Conference on Mobile and Ubiquitous Multimedia - MUM 2014, pp. 180–189 (2014)
  21. Ribeiro, F., Florencio, D., Chou, P.A., Zhang, Z.: Auditory augmented reality: object sonification for the visually impaired. In: 2012 IEEE 14th International Workshop on Multimedia Signal Processing (MMSP), pp. 319–324 (2012)
  22. Schneider, J., Strothotte, T.: Constructive exploration of spatial information by blind users. In: Proceedings of the Fourth International ACM Conference on Assistive Technologies - ASSETS 2000 (2000)
  23. Soukaras, D.P., Chaniotis, I.K., Karagiannis, I.G., Stampologlou, I.S., Triantafyllou, C.A., Tselikas, N.D., Foukarakis, I.E., Boucouvalas, A.C.: Augmented audio reality mobile application specially designed for visually impaired people. In: 2012 16th Panhellenic Conference on Informatics, pp. 13–18 (2012)
  24. Zollner, M., Huber, S., Jetter, H.C., Reiterer, H.: NAVI: a proof-of-concept of a mobile navigational aid for visually impaired based on the Microsoft Kinect. In: Proceedings of the 13th IFIP TC 13 International Conference on Human-Computer Interaction - Volume Part IV, pp. 584–587 (2011)
  25. Rodriguez-Sanchez, M.C., Moreno-Alvarez, M.A., Martin, E., Borromeo, S., Hernandez-Tamames, J.A.: Accessible smartphones for blind users: a case study for a wayfinding system. Expert Systems with Applications (2014)
  26. Doush, I.A., Alshattnawi, S., Barhoush, M.: Non-visual navigation interface for completing tasks with a predefined order using mobile phone: a case study of pilgrimage. Int. J. Mobile Netw. Design Innov. 6(1), 1–13 (2015)
  27. Tang, T.J.J., Li, W.H.: An assistive EyeWear prototype that interactively converts 3D object locations into spatial audio. In: Proceedings of the 2014 ACM International Symposium on Wearable Computers - ISWC 2014, pp. 119–126 (2014)
  28. Vaananen-Vainio-Mattila, K., Suhonen, K., Laaksonen, J., Kildal, J., Tahiroglu, K.: User experience and usage scenarios of audio-tactile interaction with virtual objects in a physical environment. In: Proceedings of the 6th International Conference on Designing Pleasurable Products and Interfaces - DPPI 2013, p. 67 (2013)
  29. Deville, B., Bologna, G., Pun, T.: Detecting objects and obstacles for visually impaired individuals using visual saliency. In: Proceedings of the 12th International ACM SIGACCESS Conference on Computers and Accessibility - ASSETS 2010, p. 253 (2010)
  30. Dramas, F., Oriola, B., Katz, B.G., Thorpe, S.J., Jouffrais, C.: Designing an assistive device for the blind based on object localization and augmented auditory reality. In: Proceedings of the 10th International ACM SIGACCESS Conference on Computers and Accessibility - ASSETS 2008, p. 263 (2008)
  31. Al-Khalifa, A.S., Al-Khalifa, H.S.: Do-it-yourself object identification using augmented reality for visually impaired people. In: Miesenberger, K., Karshmer, A., Penaz, P., Zagler, W. (eds.) ICCHP 2012, Part II. LNCS, vol. 7383, pp. 560–565. Springer, Heidelberg (2012)
  32. Nanayakkara, S., Shilkrot, R.: EyeRing: a finger-worn input device for seamless interactions with our surroundings. In: AH 2013 - Proceedings of the 4th Augmented Human International Conference (2013)
  33. Nanayakkara, S., Shilkrot, R., Maes, P.: EyeRing: a finger-worn assistant. In: CHI 2012 Extended Abstracts on Human Factors in Computing Systems, pp. 1961–1966 (2012)
  34. Niinimaki, M., Tahiroglu, K.: AHNE: a novel interface for spatial interaction. In: CHI 2012 Extended Abstracts on Human Factors in Computing Systems, pp. 1031–1034 (2012)
  35. Ritterbusch, S., Constantinescu, A., Koch, V.: Hapto-acoustic scene representation. In: Miesenberger, K., Karshmer, A., Penaz, P., Zagler, W. (eds.) ICCHP 2012, Part II. LNCS, vol. 7383, pp. 644–650. Springer, Heidelberg (2012)
  36. Buonamici, F., Furferi, R., Governi, L., Volpe, Y.: Making blind people autonomous in the exploration of tactile models: a feasibility study. In: Antona, M., Stephanidis, C. (eds.) UAHCI 2015. LNCS, vol. 9176, pp. 82–93. Springer, Heidelberg (2015)
  37. Baldan, S., de Götzen, A., Serafin, S.: Mobile rhythmic interaction in a sonic tennis game. In: CHI 2013 Extended Abstracts on Human Factors in Computing Systems - CHI EA 2013, p. 2903 (2013)
  38. Ando, H., Miki, T., Inami, M., Maeda, T.: SmartFinger: nail-mounted tactile display. In: ACM SIGGRAPH 2002 Conference Abstracts and Applications - SIGGRAPH 2002, p. 78 (2002)
  39. Bau, O., Poupyrev, I., Le Goc, M., Galliot, L., Glisson, M.: REVEL: tactile feedback technology for augmented reality. In: ACM SIGGRAPH 2012 Emerging Technologies (2012)
  40. Khambadkar, V., Folmer, E.: GIST: a gestural interface for remote nonvisual spatial perception. In: Proceedings of the 26th Annual ACM Symposium on User Interface Software and Technology - UIST 2013, pp. 301–310 (2013)
  41. Hermann, T., Neumann, A., Zehe, S.: Head gesture sonification for supporting social interaction. In: Proceedings of the 7th Audio Mostly Conference: A Conference on Interaction with Sound - AM 2012, pp. 82–89 (2012)

Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  • Erico de Souza Veriscimo (1)
  • João Luiz Bernardes Jr. (1)

  1. School of Arts, Sciences and Humanities – EACH, University of São Paulo, São Paulo, Brazil
