
Virtual Reality, Volume 16, Issue 4, pp 253–269

NAVIG: augmented reality guidance system for the visually impaired

Combining object localization, GNSS, and spatial audio
  • Brian F. G. Katz (corresponding author)
  • Slim Kammoun
  • Gaëtan Parseihian
  • Olivier Gutierrez
  • Adrien Brilhault
  • Malika Auvray
  • Philippe Truillet
  • Michel Denis
  • Simon Thorpe
  • Christophe Jouffrais
Original Article

Abstract

Navigating complex routes and finding objects of interest are challenging tasks for the visually impaired. The project NAVIG (Navigation Assisted by artificial VIsion and GNSS) is directed toward increasing personal autonomy via a virtual augmented reality system. The system integrates an adapted geographic information system with different classes of objects useful for improving route selection and guidance. The database also includes models of important geolocated objects that may be detected by real-time embedded vision algorithms. Object localization (relative to the user) may serve both global positioning and sensorimotor actions such as heading, grasping, or piloting. The user is guided to his desired destination through spatialized semantic audio rendering, always maintained in the head-centered reference frame. This paper presents the overall project design and architecture of the NAVIG system. In addition, details of a new type of detection and localization device are presented. This approach combines a bio-inspired vision system that can recognize and locate objects very quickly and a 3D sound rendering system that is able to perceptually position a sound at the location of the recognized object. This system was developed in relation to guidance directives developed through participative design with potential users and educators for the visually impaired.
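The abstract describes two complementary uses of object localization: refining the user's global position estimate (a recognized, geolocated landmark constrains where the user must be) and driving the head-centered spatial audio rendering. As a rough illustration only, the following Python sketch shows both roles; the function names, the flat-ground 2D geometry, and the inverse-variance fusion rule are assumptions of this sketch, not the NAVIG implementation.

```python
import math

def to_head_frame(obj_range_m, obj_bearing_deg, head_yaw_deg):
    """Convert an object's world-bearing polar position into head-centered
    Cartesian coordinates (x: right of the head, y: straight ahead), the
    frame in which a spatial audio engine would render the cue."""
    rel = math.radians(obj_bearing_deg - head_yaw_deg)  # bearing relative to the nose
    return (obj_range_m * math.sin(rel), obj_range_m * math.cos(rel))

def fuse_fixes(gnss_xy, gnss_var, vision_xy, vision_var):
    """Inverse-variance weighted fusion (per coordinate) of a GNSS fix with
    a position fix inferred from a recognized, geolocated landmark."""
    wg, wv = 1.0 / gnss_var, 1.0 / vision_var
    return tuple((wg * g + wv * v) / (wg + wv) for g, v in zip(gnss_xy, vision_xy))

# A recognized shop sign 12 m away at world bearing 40°, with the head
# facing 25°, is rendered 15° to the right of straight ahead:
x, y = to_head_frame(12.0, 40.0, 25.0)

# Trust a landmark-derived fix (1 m² variance) more than raw GNSS (25 m²):
pos = fuse_fixes((100.0, 200.0), 25.0, (103.0, 198.0), 1.0)
```

A deployed system would of course work in 3D, take heading from an inertial or head-tracking sensor, and propagate uncertainty over time rather than fusing single fixes.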

Keywords

Assisted navigation · Guidance · Spatial audio · Visually impaired assistive device · Need analysis

Acknowledgments

The NAVIG consortium includes IRIT, LIMSI, CerCo, SpikeNet Technology, NAVOCAP, CESDV - Institute for Young Blind, and the community of Grand Toulouse. This work was supported by the French National Research Agency (ANR) through the TecSan program (project NAVIG ANR-08-TECS-011) and the Midi-Pyrénées region through the APRRTT program. This research program has been labeled by the cluster Aerospace Valley.

Copyright information

© Springer-Verlag London Limited 2012

Authors and Affiliations

  • Brian F. G. Katz (1) (corresponding author)
  • Slim Kammoun (2)
  • Gaëtan Parseihian (1)
  • Olivier Gutierrez (2)
  • Adrien Brilhault (2, 3)
  • Malika Auvray (1)
  • Philippe Truillet (2)
  • Michel Denis (1)
  • Simon Thorpe (3)
  • Christophe Jouffrais (2)

  1. LIMSI-CNRS, Université Paris Sud, Orsay, France
  2. IRIT, CNRS & Université Paul Sabatier, Toulouse, France
  3. CerCo, CNRS & Université Paul Sabatier, Toulouse, France