
Haptic Auditory Feedback for Enhanced Image Description: A Study of User Preferences and Performance

  • Conference paper
Human-Computer Interaction – INTERACT 2023 (INTERACT 2023)

Abstract

Our research focuses on improving the accessibility of mobile applications for blind or low vision (BLV) users, particularly with regard to images. Previous studies have shown that spatial interaction can help BLV users build a mental model of the positions of objects within an image. To address limited image accessibility, we developed three prototypes that use haptic feedback to reveal the positions of objects within an image; these prototypes bind audio to haptics to make images more accessible to BLV users. We also conducted the first user study evaluating the memorability, efficiency, preferences, and comfort with haptic feedback of our prototypes for BLV individuals locating multiple objects within an image. The results indicate that the prototype combining haptic feedback with both audio and caption components was more accessible and preferred over the other prototypes. Our work contributes to the advancement of digital image technologies that use haptic feedback to enhance the experience of BLV users.
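The paper's prototypes are not distributed with source code; purely as an illustration of the audio-haptic binding idea described above, the sketch below hit-tests a finger position against annotated object bounding boxes and fires a haptic pulse plus a spoken label when the finger enters a new region. All names here (`ObjectRegion`, `hit_test`, the example regions, and the print-based stand-ins for platform haptic and speech services) are hypothetical, not taken from the paper.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass(frozen=True)
class ObjectRegion:
    """An annotated object with a normalized [0, 1] bounding box."""
    label: str
    x: float
    y: float
    w: float
    h: float

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

def hit_test(regions: List[ObjectRegion], px: float, py: float) -> Optional[str]:
    """Return the label of the first region under the touch point, or None."""
    for region in regions:
        if region.contains(px, py):
            return region.label
    return None

# Stand-ins for platform services; on iOS these might wrap
# UIImpactFeedbackGenerator and AVSpeechSynthesizer respectively.
def trigger_haptic_pulse() -> None:
    print("haptic pulse")

def speak(text: str) -> None:
    print(f"speaking: {text}")

def on_touch_moved(regions: List[ObjectRegion], px: float, py: float, state: dict) -> None:
    """Fire one haptic pulse and speak the label when the finger enters a region."""
    label = hit_test(regions, px, py)
    if label is not None and label != state.get("last"):
        trigger_haptic_pulse()
        speak(label)
    state["last"] = label

# Hypothetical annotation of one image: a dog in the lower left, a ball upper right.
regions = [
    ObjectRegion("dog", 0.05, 0.50, 0.30, 0.40),
    ObjectRegion("ball", 0.70, 0.10, 0.15, 0.15),
]
state = {"last": None}
on_touch_moved(regions, 0.10, 0.60, state)  # finger enters the dog's box
```

Tracking the last-hit label ensures feedback fires once on entry rather than continuously while the finger rests inside a region, which matches the general pattern of audio-tactile exploration interfaces.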


Notes

  1. https://www.microsoft.com/en-us/ai/seeing-ai.

  2. We chose not to conduct a thematic analysis of the interview responses regarding ‘Experiences using entertainment system’ and ‘Using technology to read e-books and digital magazines’, as most of these responses fall outside the scope of the present study. We plan to publish them in a separate study.



Acknowledgments

We thank Imam Abdulrahman Bin Faisal University (IAU) and The Saudi Arabian Cultural Mission (SACM) for financially supporting Mallak Alkhathlan.

Author information

Correspondence to Mallak Alkhathlan, M. L. Tlachac, or Elke A. Rundensteiner.



Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Alkhathlan, M., Tlachac, M.L., Rundensteiner, E.A. (2023). Haptic Auditory Feedback for Enhanced Image Description: A Study of User Preferences and Performance. In: Abdelnour Nocera, J., Kristín Lárusdóttir, M., Petrie, H., Piccinno, A., Winckler, M. (eds) Human-Computer Interaction – INTERACT 2023. INTERACT 2023. Lecture Notes in Computer Science, vol 14142. Springer, Cham. https://doi.org/10.1007/978-3-031-42280-5_14


  • DOI: https://doi.org/10.1007/978-3-031-42280-5_14

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-42279-9

  • Online ISBN: 978-3-031-42280-5

  • eBook Packages: Computer Science, Computer Science (R0)
