People with Motor and Mobility Impairment: Innovative Multimodal Interfaces to Wheelchairs

  • Andreas Holzinger
  • Alexander K. Nischelwitzer
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4061)

Abstract

Standard interfaces offer only limited accessibility. Multimodal user interfaces, a classical research area in Human-Computer Interaction, combine various input and output modalities (including seeing/vision, hearing/audition, haptic/tactile, taste/gustation, smell/olfaction, etc.). One advantage of multiple modalities is increased flexibility and usability: the weaknesses of one modality are offset by the strengths of another. For example, on a mobile device with a small visual display and keypad, a word may be quite difficult to read or type, yet very easy to speak or hear. Such interfaces, in combination with mobile technologies, have tremendous implications for accessibility and are consequently a potential benefit for people with a wide variety of impairments. Multimodal interfaces must be designed and developed to fit exactly the needs, requirements, abilities and knowledge levels of the targeted end users; different contexts of use must also be considered. However, in order to achieve advances in both research and development of such interfaces, it is essential to bring researchers and practitioners from Psychology and Computer Science together.

Introducing Statement:

Today, together for better interfaces of tomorrow!
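The complementarity described in the abstract, where a modality that is momentarily weak or unusable is covered by another, can be illustrated with a minimal sketch. The following Python fragment is purely illustrative and not taken from the paper; all class, method, and modality names (Modality, MultimodalDispatcher, joystick, speech) are assumptions made for this example.

```python
# Minimal sketch (hypothetical, not from the paper): a multimodal input
# dispatcher in which the weakness of one modality is offset by another.
from dataclasses import dataclass
from typing import Callable, Optional


@dataclass
class Modality:
    """One input channel, e.g. joystick, speech, or a single-switch scanner."""
    name: str
    available: Callable[[], bool]              # can the user operate this channel right now?
    read_command: Callable[[], Optional[str]]  # returns e.g. "forward", "stop", or None


class MultimodalDispatcher:
    """Polls modalities in priority order and takes the first usable command."""

    def __init__(self, modalities: list[Modality]) -> None:
        self.modalities = modalities

    def next_command(self) -> Optional[str]:
        for modality in self.modalities:
            if not modality.available():
                continue            # e.g. speech is unusable in a noisy corridor
            command = modality.read_command()
            if command is not None:
                return command      # first modality that yields a command wins
        return None                 # no modality produced input this cycle


# Usage: prefer the joystick, fall back to speech recognition.
if __name__ == "__main__":
    joystick = Modality("joystick", available=lambda: False, read_command=lambda: None)
    speech = Modality("speech", available=lambda: True, read_command=lambda: "forward")
    dispatcher = MultimodalDispatcher([joystick, speech])
    print(dispatcher.next_command())  # -> "forward"
```

The priority ordering stands in for a user- and context-specific configuration: which modality is preferred, and when a fallback is acceptable, would in practice be tailored to the individual user's abilities and context of use.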

Keywords

Human-Computer Interaction & Usability Engineering (HCI&UE) · Multimodal User Interfaces (MUI) · Auditive User Interfaces (AUI)



Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Andreas Holzinger¹
  • Alexander K. Nischelwitzer²
  1. Medical University of Graz
  2. University of Applied Sciences Joanneum
