Multimodal Interaction and People with Disabilities

  • A. D. N. Edwards
Chapter
Part of the Text, Speech and Language Technology book series (TLTB, volume 19)

Abstract

What is the connection between computers, multiple modalities and people with disabilities? A traditionally scientific chapter might start out with definitions of these terms. However, there is a problem in doing that in this case, which is that only one of them — ‘computer’ — is at all easy to define.

Keywords

Sign Language, American Sign Language, Blind People, Screen Reader, Blind User

Copyright information

© Springer Science+Business Media Dordrecht 2002

Authors and Affiliations

  • A. D. N. Edwards
    Department of Computer Science, University of York, York, UK