Invited Paper: Multimodal Interface for an Intelligent Wheelchair

  • Luís Paulo Reis
  • Brígida Mónica Faria
  • Sérgio Vasconcelos
  • Nuno Lau
Conference paper
Part of the Lecture Notes in Electrical Engineering book series (LNEE, volume 325)


As the age structure of the population continues to shift, politicians and scientists are paying increasing attention to the needs of senior citizens. The well-being and needs of disabled individuals are likewise gaining recognition in political and entrepreneurial circles. Intelligent wheelchairs are adapted electric wheelchairs equipped with environmental perception, semi-autonomous behaviour and flexible human-machine interaction. This paper presents the specification and development of a user-friendly multimodal interface, a component of the IntellWheels Platform project. The developed prototype combines several input modules, allowing the wheelchair to be controlled through flexible, user-defined input sequences of distinct types (speech, facial expressions, head movements and joystick). To validate the effectiveness of the prototype, two experiments were performed: in the first, a number of individuals tested the system by driving a simulated wheelchair in a virtual environment; the second was performed using the real IntellWheels wheelchair prototype. The results show that the multimodal interface can be used successfully, owing to the interaction flexibility it provides.
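The full text is not reproduced here, but the abstract's central mechanism, mapping user-defined sequences of multimodal input events to wheelchair commands, can be illustrated with a small sketch. All class names, modality labels and command strings below are hypothetical illustrations, not the IntellWheels API:

```python
# Hypothetical sketch (not the IntellWheels implementation): matching
# a rolling buffer of multimodal input events against user-defined
# sequences and emitting the bound command when a sequence completes.
from typing import Dict, List, Optional, Tuple

# An input event is a (modality, token) pair, e.g. ("speech", "go")
# or ("head", "tilt_left"). Modality names are illustrative only.
Event = Tuple[str, str]


class SequenceMapper:
    """Maps user-defined multimodal input sequences to commands."""

    def __init__(self, bindings: Dict[Tuple[Event, ...], str]):
        self.bindings = bindings
        self.buffer: List[Event] = []
        # The longest bound sequence bounds how much history we keep.
        self.max_len = max(len(seq) for seq in bindings)

    def feed(self, event: Event) -> Optional[str]:
        """Add one event; return a command if a bound sequence just
        completed (longest match wins), else None."""
        self.buffer.append(event)
        self.buffer = self.buffer[-self.max_len:]
        # Check suffixes of the buffer, longest first.
        for n in range(len(self.buffer), 0, -1):
            suffix = tuple(self.buffer[-n:])
            if suffix in self.bindings:
                self.buffer.clear()  # consume the matched events
                return self.bindings[suffix]
        return None


# Example: saying "go" and then tilting the head left turns the chair;
# a single joystick push forward drives forward.
mapper = SequenceMapper({
    (("speech", "go"), ("head", "tilt_left")): "TURN_LEFT",
    (("joystick", "forward"),): "MOVE_FORWARD",
})
print(mapper.feed(("speech", "go")))       # None (sequence incomplete)
print(mapper.feed(("head", "tilt_left")))  # TURN_LEFT
```

Keeping each modality as a plain (modality, token) event lets new input devices be added without changing the matcher, which is one way to obtain the "flexible user defined input sequences" the abstract describes.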


Keywords: Multimodal interface · Intelligent robotics · Intelligent wheelchair · IntellWheels



The authors would like to thank FCT—Portuguese Science and Technology Foundation—for funding the INTELLWHEELS project (RIPD/ADA/109636/2009) and the Ph.D. Scholarship FCT/SFRH/BD/44541/2008, as well as LIACC—Laboratório de Inteligência Artificial e de Ciência de Computadores da Universidade do Porto, DETI/UA—Dep. Electrónica, Telecomunicações e Informática, IEETA—Instituto de Engenharia Electrónica e Telemática de Aveiro and ESTSP/IPP—Escola Superior de Tecnologia da Saúde do Porto—IPP.



Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  • Luís Paulo Reis (1, 2) (email author)
  • Brígida Mónica Faria (2, 3)
  • Sérgio Vasconcelos (2, 4)
  • Nuno Lau (5, 6)

  1. Departamento de Sistemas de Informação, Escola de Engenharia da Universidade do Minho (DSI/EEUM), Guimarães, Portugal
  2. Laboratório de Inteligência Artificial e Ciência de Computadores (LIACC), Porto, Portugal
  3. Escola Superior de Tecnologia da Saúde do Porto, Instituto Politécnico do Porto (ESTSP/IPP), Porto, Portugal
  4. Departamento de Engenharia Informática, Faculdade de Engenharia da Universidade do Porto (DEI/FEUP), Porto, Portugal
  5. Departamento de Engenharia Eletrónica, Telecomunicações e Informática, Universidade de Aveiro (DETI/UA), Aveiro, Portugal
  6. Instituto de Engenharia Eletrónica e Telemática de Aveiro (IEETA), Aveiro, Portugal
