SaSYS: A Swipe Gesture-Based System for Exploring Urban Environments for the Visually Impaired

  • Jee-Eun Kim
  • Masahiro Bessho
  • Noboru Koshizuka
  • Ken Sakamura
Part of the Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering book series (LNICST, volume 130)

Abstract

Exploring and learning an environment is a particularly challenging task for visually impaired people. Existing interaction techniques for learning an environment may not be useful while traveling because they often rely on dedicated hardware or require users to focus on tactile or auditory feedback. In this paper, we introduce an intuitive interaction technique for selecting areas of interest in urban environments by performing simple swipe gestures on a touchscreen. Based on this swipe-based interaction, we developed SaSYS, a location-aware system that enables users to discover points of interest (POIs) around them using off-the-shelf smartphones. Our approach can be implemented on handheld devices without any dedicated hardware and without requiring users to constantly focus on tactile or auditory feedback. SaSYS also provides fine-grained control over Text-to-Speech (TTS). Our user study shows that 9 of 11 users preferred swipe-based interaction to existing pointing-based interaction.
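The core idea described above (a swipe on the touchscreen selects a sector of the surrounding environment, and the POIs falling inside that sector are announced) can be illustrated with a minimal sketch. The class and method names, the 45° sector width, the 1 km range cut-off, and the haversine distance filter below are illustrative assumptions, not the authors' implementation.

```java
import java.util.ArrayList;
import java.util.List;

/**
 * Illustrative sketch (assumed design, not SaSYS itself): map a swipe
 * direction on the touchscreen to a compass sector around the user and
 * collect the points of interest (POIs) that fall inside that sector.
 */
public class SwipeSectorSketch {

    record Poi(String name, double lat, double lon) {}

    /** Great-circle distance in metres (haversine formula). */
    static double distanceMeters(double lat1, double lon1, double lat2, double lon2) {
        double r = 6371000.0;
        double dLat = Math.toRadians(lat2 - lat1);
        double dLon = Math.toRadians(lon2 - lon1);
        double a = Math.sin(dLat / 2) * Math.sin(dLat / 2)
                 + Math.cos(Math.toRadians(lat1)) * Math.cos(Math.toRadians(lat2))
                 * Math.sin(dLon / 2) * Math.sin(dLon / 2);
        return 2 * r * Math.atan2(Math.sqrt(a), Math.sqrt(1 - a));
    }

    /** Bearing from the user to a POI, in degrees clockwise from north. */
    static double bearingDegrees(double lat1, double lon1, double lat2, double lon2) {
        double phi1 = Math.toRadians(lat1), phi2 = Math.toRadians(lat2);
        double dLon = Math.toRadians(lon2 - lon1);
        double y = Math.sin(dLon) * Math.cos(phi2);
        double x = Math.cos(phi1) * Math.sin(phi2)
                 - Math.sin(phi1) * Math.cos(phi2) * Math.cos(dLon);
        return (Math.toDegrees(Math.atan2(y, x)) + 360.0) % 360.0;
    }

    /**
     * Select POIs lying in the sector the swipe points at.
     * swipeDx/swipeDy: swipe vector in screen pixels (y grows downwards);
     * headingDeg: device compass heading; sectorWidthDeg and maxMeters are
     * assumed tuning parameters.
     */
    static List<Poi> poisInSwipeDirection(double userLat, double userLon,
                                          double swipeDx, double swipeDy,
                                          double headingDeg, List<Poi> pois,
                                          double sectorWidthDeg, double maxMeters) {
        // Screen angle of the swipe, rotated so "up" on the screen = device heading.
        double swipeAngle = Math.toDegrees(Math.atan2(swipeDx, -swipeDy));
        double targetBearing = (headingDeg + swipeAngle + 360.0) % 360.0;

        List<Poi> result = new ArrayList<>();
        for (Poi p : pois) {
            double b = bearingDegrees(userLat, userLon, p.lat(), p.lon());
            double diff = Math.abs(((b - targetBearing + 540.0) % 360.0) - 180.0);
            if (diff <= sectorWidthDeg / 2
                    && distanceMeters(userLat, userLon, p.lat(), p.lon()) <= maxMeters) {
                result.add(p); // a real system would announce these via TTS
            }
        }
        return result;
    }

    public static void main(String[] args) {
        List<Poi> pois = List.of(
                new Poi("Station", 35.7135, 139.7605),
                new Poi("Museum", 35.7100, 139.7590));
        // Swipe straight up while facing north: select POIs roughly to the north.
        for (Poi p : poisInSwipeDirection(35.7110, 139.7600, 0, -80, 0, pois, 45, 1000)) {
            System.out.println(p.name());
        }
    }
}
```

In this sketch, a swipe straight up while facing north lists only the POI lying roughly north of the user ("Station"); a real system would hand the selected POIs to a TTS engine rather than print them.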

Keywords

Accessibility · Mobile devices · Visually impaired · Touchscreens · Location-based services

Copyright information

© ICST Institute for Computer Science, Social Informatics and Telecommunications Engineering 2014

Authors and Affiliations

  • Jee-Eun Kim¹
  • Masahiro Bessho¹
  • Noboru Koshizuka¹
  • Ken Sakamura¹

  1. Interfaculty Initiative in Information Studies, The University of Tokyo, Bunkyo-ku, Japan