Touchscreen-Based Haptic Information Access for Assisting Blind and Visually-Impaired Users: Perceptual Parameters and Design Guidelines

Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 794)

Abstract

Touchscreen-based smart devices, such as smartphones and tablets, offer great promise for providing blind and visually-impaired (BVI) users with a means of accessing graphics non-visually. However, they also pose novel challenges, as they were developed primarily as visual interfaces. This paper investigates key usability parameters governing accurate rendering of haptically-perceivable graphical materials. Three psychophysically-motivated usability studies with 46 BVI participants identified three key parameters for accurate rendering of vibrotactile lines. Results suggested that the best performance and greatest perceptual salience are obtained with vibrotactile feedback based on: (1) a minimum width of 1 mm for detecting lines, (2) a minimum gap of 4 mm for discriminating lines rendered parallel to each other, and (3) a minimum angular separation (i.e., chord length) of 4 mm for discriminating oriented lines. These findings provide foundational guidelines for converting and rendering visual graphical materials on touchscreen-based interfaces to support haptic/vibrotactile information access.
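To illustrate how these thresholds might be applied when rendering lines on a touchscreen, the sketch below converts the millimetre values into device pixels for a given screen density and derives a minimum angular separation from the 4 mm chord criterion. It is a minimal sketch only: the threshold constants come from the results above, while the function names, the example 326 DPI density, and the chord-to-angle conversion (chord = 2r·sin(θ/2)) are illustrative assumptions rather than part of the paper.

```python
import math

# Perceptual thresholds reported above (all in millimetres).
MIN_LINE_WIDTH_MM = 1.0    # minimum vibrotactile line width for detection
MIN_PARALLEL_GAP_MM = 4.0  # minimum gap between parallel vibrotactile lines
MIN_CHORD_MM = 4.0         # minimum chord length for discriminating oriented lines

MM_PER_INCH = 25.4

def mm_to_px(mm: float, dpi: float) -> float:
    """Convert a physical length in millimetres to pixels for a screen of the given DPI."""
    return mm / MM_PER_INCH * dpi

def min_angular_separation_deg(radius_mm: float) -> float:
    """Smallest angle (degrees) between two oriented lines such that, at a distance
    radius_mm from their intersection, the separation reaches MIN_CHORD_MM.
    Uses the standard chord relation: chord = 2 * r * sin(theta / 2)."""
    return math.degrees(2.0 * math.asin(MIN_CHORD_MM / (2.0 * radius_mm)))

if __name__ == "__main__":
    dpi = 326.0  # example display density; substitute the target device's value
    print(f"Line width >= {mm_to_px(MIN_LINE_WIDTH_MM, dpi):.1f} px")
    print(f"Line gap   >= {mm_to_px(MIN_PARALLEL_GAP_MM, dpi):.1f} px")
    print(f"Angle at 30 mm from the vertex >= {min_angular_separation_deg(30.0):.1f} degrees")
```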

Keywords

Assistive technology · Haptic information access · Haptic interaction · Multimodal interface · Design guidelines

Acknowledgments

We acknowledge support for this project from NSF grants CHS-1425337 and ECR DCL Level 2 1644471.

Copyright information

© Springer International Publishing AG, part of Springer Nature 2019

Authors and Affiliations

  1. Spatial Informatics Program, School of Computing and Information Science, The University of Maine, Orono, USA
  2. VEMI Lab, The University of Maine, Orono, USA
  3. Department of Aerospace and Mechanical Engineering, Saint Louis University, St. Louis, USA