Holzinger A., Nischelwitzer A.K. (2006) People with Motor and Mobility Impairment: Innovative Multimodal Interfaces to Wheelchairs. In: Miesenberger K., Klaus J., Zagler W.L., Karshmer A.I. (eds) Computers Helping People with Special Needs. ICCHP 2006. Lecture Notes in Computer Science, vol 4061. Springer, Berlin, Heidelberg
Standard interfaces have limited accessibility. Multimodal user interfaces combine various input and output modalities (including seeing/vision, hearing/audition, touch/haptics, taste/gustation, smell/olfaction, etc.) and are a classical research area in Human-Computer Interaction. One advantage of multiple modalities is increased flexibility in usability: the weaknesses of one modality are offset by the strengths of another. For example, on a mobile device with a small visual display and keypad, a word may be quite difficult to read or type, yet very easy to say or hear. Such interfaces, in combination with mobile technologies, can have tremendous implications for accessibility and are consequently a potential benefit for people with a wide variety of impairments. Multimodal interfaces must be designed and developed to fit exactly the needs, requirements, abilities, and knowledge levels of the targeted end users. It is also important to consider different contexts of use. However, in order to achieve advances in both research and development of such interfaces, it is essential to bring researchers and practitioners from Psychology and Computer Science together.
Today, together for better interfaces of tomorrow!
Keywords: Human-Computer Interaction & Usability Engineering (HCI&UE), Multimodal User Interfaces (MUI), Auditive User Interfaces (AUI)