
Automatic switching between speech and non-speech: adaptive auditory feedback in desktop assistance for the visually impaired

  • Long Paper
  • Universal Access in the Information Society

Abstract

Continual enrichment of the auditory interfaces of desktop applications allows visually impaired people to use computers successfully in education, employment, and social interaction. Designers face multiple challenges when producing sound for auditory interfaces. This paper presents a new method of adaptive auditory feedback that converts speech-only instructions into non-speech sounds (i.e., spearcons), based on the user's interaction with the application in the desktop environment. Fifteen visually impaired participants took part in a within-subjects study. The results demonstrate that the adaptive auditory feedback method is more efficient than non-speech feedback and more pleasant than repetitive speech-only instructions. Furthermore, adaptive auditory feedback improves task completion time and action awareness compared to speech-only feedback. These findings may encourage researchers and developers to adopt adaptive auditory feedback, instead of speech-only or non-speech feedback, when designing auditory interfaces in desktop environments for people with visual impairment.
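The switching behavior the abstract describes — full speech while an instruction is still unfamiliar, a spearcon once it has been heard repeatedly — can be sketched as a per-element exposure counter. This is a hypothetical illustration, not the authors' implementation: the class and method names (`AdaptiveFeedback`, `feedback_for`) and the fixed exposure threshold are assumptions, and the paper's actual switching criterion is derived from the user's interaction with the application.

```python
class AdaptiveFeedback:
    """Hypothetical sketch of threshold-based speech-to-spearcon switching."""

    def __init__(self, threshold=3):
        self.threshold = threshold  # assumed: exposures before switching to a spearcon
        self.exposure = {}          # per-UI-element interaction counts

    def feedback_for(self, element_id, speech_text):
        """Return the feedback mode and phrase for one interaction with a UI element."""
        count = self.exposure.get(element_id, 0) + 1
        self.exposure[element_id] = count
        if count <= self.threshold:
            # Unfamiliar element: play the full text-to-speech instruction.
            return ("speech", speech_text)
        # Familiar element: play the spearcon, i.e., a time-compressed
        # rendering of the same phrase (compression itself omitted here).
        return ("spearcon", speech_text)
```

For example, with `threshold=2`, the first two interactions with a "Save" button would yield full speech and the third would switch to its spearcon, while a not-yet-visited "Open" button would still receive speech.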






Funding

This work was supported by the Higher Education Commission, Pakistan (No. HRD-HEC-2014-833).

Author information


Corresponding author

Correspondence to Ibrar Hussain.

Ethics declarations

Conflict of interest

On behalf of all authors, the corresponding author states that there is no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Shoaib, M., Hussain, I. & Mirza, H.T. Automatic switching between speech and non-speech: adaptive auditory feedback in desktop assistance for the visually impaired. Univ Access Inf Soc 19, 813–823 (2020). https://doi.org/10.1007/s10209-019-00696-5


Keywords

Navigation