
Designing a Virtual Environment to Evaluate Multimodal Sensors for Assisting the Visually Impaired

  • Wai L. Khoo
  • Eric L. Seidel
  • Zhigang Zhu
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7383)

Abstract

We describe how to design a virtual environment using Microsoft Robotics Developer Studio (MRDS) in order to evaluate multimodal sensors for assisting visually impaired people in daily tasks such as navigation and orientation. The work focuses on the design of the interfaces of sensors and stimulators in the virtual environment for future subject experimentation. We discuss the types of sensors we have simulated and define some non-classical interfaces for interacting with the environment and receiving feedback from it. We also present preliminary feasibility results from experiments with volunteer test subjects, and conclude with a discussion of potential future directions.
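The paper itself details the simulated sensors and stimulator interfaces; as a rough illustration of the kind of sensor-to-stimulator mapping such a virtual environment must expose, the sketch below converts a simulated sonar range reading into a vibrotactile intensity. All names (`SonarReading`, `range_to_vibration`) and the linear distance-to-intensity mapping are hypothetical assumptions for illustration only, not the authors' implementation (which is built on MRDS, a C#/.NET platform).

```python
import math
from dataclasses import dataclass


@dataclass
class SonarReading:
    """A single simulated sonar return (hypothetical structure)."""
    bearing_deg: float  # beam direction relative to the user's heading
    range_m: float      # distance to the nearest obstacle along the beam


def range_to_vibration(reading: SonarReading, max_range_m: float = 4.0) -> float:
    """Map a range reading to a vibration intensity in [0, 1].

    Closer obstacles produce stronger vibration; anything at or beyond
    max_range_m produces none. The linear mapping is an assumed stand-in,
    not a calibrated transfer function from the paper.
    """
    if reading.range_m >= max_range_m:
        return 0.0
    return 1.0 - reading.range_m / max_range_m


if __name__ == "__main__":
    # Sweep three hypothetical beams (left, center, right) toward a flat
    # wall 1.5 m ahead and print the resulting stimulator intensities.
    for bearing in (-30.0, 0.0, 30.0):
        slant_range = 1.5 / math.cos(math.radians(bearing))
        reading = SonarReading(bearing_deg=bearing, range_m=slant_range)
        print(f"bearing {bearing:+5.1f} deg -> "
              f"intensity {range_to_vibration(reading):.2f}")
```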

Keywords

Virtual Reality · Virtual Environment · Obstacle Detection · Impaired People · Sonar Sensor


Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Wai L. Khoo ¹
  • Eric L. Seidel ¹
  • Zhigang Zhu ¹

  1. Department of Computer Science, CUNY City College, New York, USA
