
Accelerometer & Spatial Audio Technology: Making Touch-Screen Mobile Devices Accessible

  • Flaithri Neff
  • Tracey J. Mehigan
  • Ian Pitt
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6179)

Abstract

As mobile-phone design moves toward the touch-screen form factor, visually impaired users face new accessibility challenges. The mainstream interaction model for touch-screen devices relies on the user's ability to see spatially arranged visual icons and to interact with them through a smooth glass screen. An inherent challenge for blind users is this interface's lack of tactile feedback. In this paper we explore the use of a combination of spatial audio and accelerometer technology to enable blind users to operate a touch-screen device effectively. We discuss the challenges involved in representing icons using sound, and we introduce a design framework that is helping us tease out some of these issues. We also outline a set of proposed user studies that will test the effectiveness of our design on a Nokia N97. The results of these studies will be presented at ICCHP 2010.
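To make the interaction concept concrete, the following is a minimal Java ME sketch of one possible tilt-to-audio-cursor mapping. It is not the authors' implementation: it reads the accelerometer through the Mobile Sensor API (JSR-256) and spatialises short earcons through the Advanced Multimedia Supplements (JSR-234), both available on N97-class Symbian handsets. The AudioIconGrid class, the 2x2 grid, the earcon file names, and the tilt thresholds are all illustrative assumptions.

import java.io.IOException;
import javax.microedition.amms.GlobalManager;
import javax.microedition.amms.SoundSource3D;
import javax.microedition.amms.control.audio3d.LocationControl;
import javax.microedition.io.Connector;
import javax.microedition.media.Manager;
import javax.microedition.media.MediaException;
import javax.microedition.media.Player;
import javax.microedition.sensor.Data;
import javax.microedition.sensor.DataListener;
import javax.microedition.sensor.SensorConnection;
import javax.microedition.sensor.SensorInfo;
import javax.microedition.sensor.SensorManager;

/* Hypothetical controller: device tilt steers an audio cursor over a
   2x2 icon grid; entering a cell plays that icon's earcon, panned to
   mirror the icon's on-screen position. */
public class AudioIconGrid implements DataListener {

    private final SoundSource3D source;       // spatialised output (JSR-234)
    private final LocationControl location;   // 3D position of the cue
    private final Player[] earcons = new Player[4];
    private int current = -1;                 // icon currently under the cursor

    public AudioIconGrid() throws IOException, MediaException {
        source = GlobalManager.createSoundSource3D();
        location = (LocationControl) source.getControl(
                "javax.microedition.amms.control.audio3d.LocationControl");
        for (int i = 0; i < earcons.length; i++) {
            // Earcon clips bundled in the MIDlet JAR (file names illustrative).
            earcons[i] = Manager.createPlayer(
                    getClass().getResourceAsStream("/earcon" + i + ".wav"),
                    "audio/x-wav");
            earcons[i].realize();
            source.addPlayer(earcons[i]);     // route through the 3D source
        }
        // Accelerometer via the Mobile Sensor API (JSR-256).
        SensorInfo[] infos = SensorManager.findSensors("acceleration", null);
        SensorConnection sensor =
                (SensorConnection) Connector.open(infos[0].getUrl());
        sensor.setDataListener(this, 1);      // deliver one sample at a time
    }

    public void dataReceived(SensorConnection sensor, Data[] data,
                             boolean dataLost) {
        // Channels 0 and 1 carry the x/y acceleration components, which
        // approximate left/right and forward/back tilt when held flat.
        double ax = data[0].getDoubleValues()[0];
        double ay = data[1].getDoubleValues()[0];
        int col = ax < 0 ? 0 : 1;             // quantise tilt to a grid cell
        int row = ay < 0 ? 0 : 1;
        int icon = row * 2 + col;
        if (icon != current) {                // cursor crossed into a new cell
            current = icon;
            announce(icon, col, row);
        }
    }

    private void announce(int icon, int col, int row) {
        try {
            // Offset the cue 0.5 m left/right and front/back of the listener
            // (JSR-234 coordinates are in millimetres, listener at origin).
            location.setCartesian(col == 0 ? -500 : 500, 0,
                                  row == 0 ? -500 : 500);
            earcons[icon].start();
        } catch (MediaException e) {
            // Device refused playback; a real system would fall back to
            // non-spatialised or speech output here.
        }
    }
}

A real design would use a finer grid, low-pass-filter the accelerometer stream to avoid jitter at cell boundaries, and select earcons according to the perceptual considerations discussed in the paper.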

Keywords

Spatial Audio · Accelerometers · Vision Impaired · Mobile Devices

Copyright information

© Springer-Verlag Berlin Heidelberg 2010

Authors and Affiliations

  • Flaithri Neff (1)
  • Tracey J. Mehigan (2)
  • Ian Pitt (2)
  1. Dept. of Electrical & Electronic Engineering, Limerick Institute of Technology, Ireland
  2. Dept. of Computer Science, University College Cork, Ireland
