Synthesizing non-speech sound to support blind and visually impaired computer users

  • A. Darvishi
  • V. Guggiana
  • E. Munteanu
  • H. Schauer
  • M. Motavalli
  • M. Rauterberg
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 860)


This paper describes work in progress on the automatic generation of "impact sounds" based on physical modelling. Such sounds can serve as non-speech audio representations of objects and as interaction mechanisms in non-visual interfaces. In particular, we present the complete physical model for the impact sounds of spherical objects hitting flat plates or beams. We also discuss the results of analysing several recorded (digitised) impact sounds and compare them with theoretical predictions; these results will serve as input for the next phases of our audio framework project. The objective of this research project (a joint project of the University of Zurich and the Swiss Federal Institute of Technology) is to develop a concept, methods and a prototype for an audio framework that describes sounds on a highly abstract semantic level. Every sound is described as the result of one or several interactions between one or several objects at a certain place and in a certain environment.
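The physical-modelling approach sketched above is commonly realised as modal synthesis: an impact excites a set of vibration modes of the struck object, each heard as an exponentially decaying sinusoid whose frequency and damping depend on the object's geometry and material. A minimal illustrative sketch follows; the modal frequencies, amplitudes and damping coefficients are hypothetical placeholders, not measured values from this work.

```python
import math

def impact_sound(modes, sample_rate=44100, duration=1.0):
    """Synthesize an impact sound as a sum of exponentially
    decaying sinusoids, one per vibration mode.

    modes: list of (frequency_hz, initial_amplitude, damping_per_s)
    Returns a list of float samples.
    """
    n = int(sample_rate * duration)
    samples = []
    for i in range(n):
        t = i / sample_rate
        # Each mode contributes a damped sinusoid; material damping
        # controls how quickly the mode dies away.
        s = sum(a * math.exp(-d * t) * math.sin(2 * math.pi * f * t)
                for f, a, d in modes)
        samples.append(s)
    return samples

# Illustrative (not measured) modal data for a small struck plate:
modes = [(440.0, 1.0, 8.0), (1180.0, 0.5, 15.0), (2150.0, 0.25, 25.0)]
sound = impact_sound(modes, duration=0.5)
```

Higher damping coefficients make the sound more "dead" (as for wood or plastic), while low damping yields a ringing, metallic timbre; this is one way material properties can be mapped onto auditory parameters.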


Keywords: non-speech sound generation · visual impairment · auditory interfaces · physical modelling · auditive feedback · human-computer interaction · software ergonomics · usability engineering · material properties



Copyright information

© Springer-Verlag Berlin Heidelberg 1994

Authors and Affiliations

  • A. Darvishi (1)
  • V. Guggiana (1)
  • E. Munteanu (1)
  • H. Schauer (1)
  • M. Motavalli (2)
  • M. Rauterberg (3)
  1. Department of Computer Science (IfI), University of Zurich, Zurich, Switzerland
  2. Swiss Federal Laboratories for Material Testing and Research (EMPA), Dubendorf, Switzerland
  3. Usability Laboratory, Work and Organizational Psychology Unit, Swiss Federal Institute of Technology (ETH), Zurich, Switzerland
