Abstract

In this paper, we present a wearable tactile device called TAG (TActile Glasses) to help visually impaired individuals navigate complex environments. The TAG device provides vibrotactile feedback whenever an obstacle is detected in front of the user. In addition to the eyeglasses themselves, the prototype comprises an infrared proximity sensor, an ATmega128 microcontroller, a rechargeable battery, and a vibrotactile actuator attached to the right temple tip of the glasses. The TAG system is designed to be highly portable, fashionable yet cost-effective, and intuitive to use. An experimental study showed that the TAG system can help visually impaired individuals navigate an unfamiliar lab environment using vibrotactile feedback, without any previous training. Participants reported that the system is intuitive to use, quick to learn, and helpful.
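
The following is a minimal sketch of the obstacle-to-vibration loop described above, written in C for the ATmega128 with avr-gcc. The pin assignments (an analog-output IR proximity sensor on ADC channel 0, the vibration motor driver on PB0), the clock frequency, the polling rate, and the obstacle threshold are illustrative assumptions and are not specified in the paper.

    /*
     * Sketch of the TAG-style sensing loop: poll an IR proximity sensor,
     * drive a vibrotactile actuator when an obstacle is detected.
     * Pin mapping, clock, and threshold are assumptions for illustration.
     */
    #define F_CPU 8000000UL          /* assumed clock frequency */

    #include <avr/io.h>
    #include <util/delay.h>

    #define OBSTACLE_THRESHOLD 512   /* assumed ADC value meaning "obstacle ahead" */

    static void adc_init(void)
    {
        ADMUX  = (1 << REFS0);                    /* AVcc reference, ADC channel 0 */
        ADCSRA = (1 << ADEN) | (1 << ADPS2)
               | (1 << ADPS1) | (1 << ADPS0);     /* enable ADC, prescaler 128     */
    }

    static uint16_t adc_read(void)
    {
        ADCSRA |= (1 << ADSC);                    /* start a single conversion     */
        while (ADCSRA & (1 << ADSC))              /* wait until it completes       */
            ;
        return ADC;                               /* 10-bit result                 */
    }

    int main(void)
    {
        DDRB |= (1 << PB0);                       /* motor driver pin as output    */
        adc_init();

        for (;;) {
            uint16_t proximity = adc_read();      /* higher value = closer object  */

            if (proximity > OBSTACLE_THRESHOLD)
                PORTB |= (1 << PB0);              /* obstacle detected: vibrate    */
            else
                PORTB &= ~(1 << PB0);             /* clear path: stop vibration    */

            _delay_ms(50);                        /* simple 20 Hz polling loop     */
        }
    }

A reflective IR sensor typically reports larger values for closer objects, so the threshold comparison above stands in for whatever distance cutoff the actual TAG firmware applies.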

Keywords

Haptic user interface, Interaction design, Tangible user interfaces, User support systems

Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

  • Georgios Korres (1)
  • Ahmad El Issawi (2)
  • Mohamad Eid (1)
  1. Applied Interactive Multimedia Lab, Division of Engineering, New York University Abu Dhabi, United Arab Emirates
  2. Lebanese University, Nabatieh, Lebanon