Spoken Mathematics Using Prosody, Earcons and Spearcons
Printed notation provides a highly succinct and unambiguous description of the structure of mathematical formulae, in a manner that is difficult to replicate for the visually impaired. A number of approaches to the verbal presentation of mathematical material have been explored; however, these often ignore the fundamental differences between the two modalities of vision and audition. The use of additional lexical cues, spatial audio, or complex hierarchies of non-speech sounds to represent the structure and scope of equations can be cognitively demanding to process, which detracts from the perception of the mathematical content itself. In this paper, a new methodology is proposed that uses the prosodic component of spoken language, in conjunction with a limited set of spatialized earcons and spearcons, to disambiguate the structure of mathematical formulae. This system can potentially represent structural information in an intuitive and unambiguous manner that takes advantage of the specific strengths and capabilities of audition.
Keywords: Math, auditory interfaces, visual impairment, earcons, spearcons
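To make the approach concrete, the following is a minimal illustrative sketch (not the authors' implementation; all names and parameter values are assumptions) of how a parsed formula might be flattened into a sequence of audio segments, where prosody — a pitch shift and pause length derived from nesting depth — conveys scope, and short non-speech earcons mark the boundaries of grouped sub-expressions:

```python
# Illustrative sketch only: prosody and earcons encoding formula structure.
# The pitch/pause values and earcon names are hypothetical placeholders.
from dataclasses import dataclass
from typing import List, Union


@dataclass
class Term:
    text: str                    # a spoken word, e.g. "x" or "plus"


@dataclass
class Group:
    children: List[Union["Term", "Group"]]   # a bracketed sub-expression


@dataclass
class Segment:
    kind: str                    # "speech" or "earcon"
    text: str                    # word to speak, or earcon name
    pitch_shift: int             # semitones above the base speaking pitch
    pause_ms: int                # pause inserted after this segment


def render(node, depth: int = 0) -> List[Segment]:
    """Flatten an expression tree into annotated audio segments.

    Deeper nesting is spoken at a higher pitch; each grouped
    sub-expression is bounded by "open"/"close" earcons.
    """
    segs: List[Segment] = []
    if isinstance(node, Term):
        segs.append(Segment("speech", node.text,
                            pitch_shift=2 * depth, pause_ms=0))
    else:
        segs.append(Segment("earcon", "open",
                            pitch_shift=2 * depth, pause_ms=100))
        for child in node.children:
            segs.extend(render(child, depth + 1))
        segs.append(Segment("earcon", "close",
                            pitch_shift=2 * depth, pause_ms=150))
    return segs


# (x + y) / z: the numerator group is spoken one nesting level deeper,
# so its terms carry a higher pitch and are bracketed by earcons.
expr = Group([Group([Term("x"), Term("plus"), Term("y")]),
              Term("over"), Term("z")])
segments = render(expr)
```

A speech synthesizer would then realize each `speech` segment with the given prosodic offsets and play the earcons (or spearcons — compressed spoken labels) at the marked boundaries, so the listener hears the scope of the numerator without extra lexical cues such as "begin fraction".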