People with Motor and Mobility Impairment: Innovative Multimodal Interfaces to Wheelchairs
Standard interfaces offer limited accessibility. Multimodal user interfaces combine several input and output modalities (vision, audition, haptics/touch, gustation/taste, olfaction/smell, etc.) and are a classic research area in Human-Computer Interaction. One advantage of multiple modalities is increased flexibility and usability: the weaknesses of one modality are offset by the strengths of another. For example, on a mobile device with a small display and keypad, a word may be difficult to read or type, yet easy to speak or hear. Such interfaces, in combination with mobile technologies, have tremendous implications for accessibility and are consequently a potential benefit for people with a wide variety of impairments. Multimodal interfaces must be designed and developed to fit precisely the needs, requirements, abilities, and knowledge levels of the targeted end users, and different contexts of use must also be considered. However, to achieve advances in both research and development of such interfaces, it is essential to bring researchers and practitioners from psychology and computer science together.
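As a concrete illustration of how one modality's strength can compensate for another's weakness, consider confidence-based fallback across redundant inputs. The following minimal sketch is not from the paper; all names (`ModalityInput`, `interpretCommand`, the command strings, the confidence threshold) are hypothetical, chosen only to make the idea executable.

```typescript
// Hypothetical sketch: redundant multimodal input with confidence-based fallback.
// All identifiers are illustrative, not an API described in the paper.

interface ModalityInput {
  modality: "speech" | "keypad" | "switch" | "eye-gaze";
  command: string;    // recognized command, e.g. "wheelchair:forward"
  confidence: number; // recognizer confidence in [0, 1]
}

// Pick the most trustworthy interpretation across modalities.
// If speech recognition is noisy (low confidence), a keypad or
// switch input carrying the same command wins instead.
function interpretCommand(inputs: ModalityInput[], threshold = 0.6): string | null {
  const usable = inputs.filter((i) => i.confidence >= threshold);
  if (usable.length === 0) return null; // no reliable input: ask the user to repeat
  usable.sort((a, b) => b.confidence - a.confidence);
  return usable[0].command;
}

// Example: noisy speech is offset by a reliable switch press.
const result = interpretCommand([
  { modality: "speech", command: "wheelchair:forward", confidence: 0.42 },
  { modality: "switch", command: "wheelchair:forward", confidence: 0.98 },
]);
console.log(result); // "wheelchair:forward"
```

A real system would of course need temporal fusion (deciding which inputs belong to the same user action) and per-user calibration of the threshold, but the fallback principle is the same.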
Today, together for better interfaces of tomorrow!
Keywords: Human-Computer Interaction & Usability Engineering (HCI&UE), Multimodal User Interfaces (MUI), Auditive User Interfaces (AUI)