GraVVITAS: Generic Multi-touch Presentation of Accessible Graphics

  • Cagatay Goncu
  • Kim Marriott
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6946)

Abstract

Access to graphics and other two-dimensional information remains severely limited for people who are blind. We present a new multimodal computer tool, GraVVITAS, for presenting accessible graphics. It uses a multi-touch display to track the position of the user's fingers, augmented with haptic feedback from small vibrating motors attached to the fingers, and audio feedback for navigation and for conveying non-geometric information about graphic elements. We believe GraVVITAS is the first practical, generic, low-cost approach to providing refreshable accessible graphics. We used a participatory design process with blind participants, and a final evaluation of the tool shows that they can use it to understand a variety of graphics: tables, line graphs, and floor plans.
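
The abstract describes the system's feedback channels only at a high level. As a purely illustrative sketch, not the paper's implementation, the Python fragment below shows one plausible way such a loop could map multi-touch finger positions onto graphic elements and drive per-finger vibration plus spoken labels; all names here (GraphicElement, feedback_step, set_vibration, speak) are assumptions made for this example.

```python
# Hypothetical sketch (not the authors' implementation) of the multimodal
# feedback loop the abstract describes: tracked finger positions are mapped
# onto graphic elements, driving per-finger vibration and spoken labels.
from dataclasses import dataclass
from typing import Callable, Dict, List, Tuple


@dataclass
class GraphicElement:
    label: str                                  # non-geometric info, spoken on contact
    bounds: Tuple[float, float, float, float]   # (x_min, y_min, x_max, y_max) in display coords

    def contains(self, x: float, y: float) -> bool:
        x0, y0, x1, y1 = self.bounds
        return x0 <= x <= x1 and y0 <= y <= y1


def feedback_step(
    touch_points: Dict[int, Tuple[float, float]],   # finger id -> (x, y) from the multi-touch display
    elements: List[GraphicElement],
    set_vibration: Callable[[int, bool], None],     # drive the vibrating motor worn on a given finger
    speak: Callable[[str], None],                   # speech/audio output
) -> None:
    """One iteration: vibrate a finger's motor while it rests on an element,
    and announce that element's label."""
    for finger_id, (x, y) in touch_points.items():
        hit = next((e for e in elements if e.contains(x, y)), None)
        set_vibration(finger_id, hit is not None)   # haptic channel
        if hit is not None:
            speak(hit.label)                        # audio channel
```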

Keywords

graphics, accessibility, multi-touch, audio, speech, haptic

Copyright information

© IFIP International Federation for Information Processing 2011

Authors and Affiliations

  • Cagatay Goncu (1)
  • Kim Marriott (1)
  1. Clayton School of Information Technology, Monash University, Australia