Head-Computer Interface: A Multimodal Approach to Navigate through Real and Virtual Worlds

  • Francesco Carrino
  • Julien Tscherrig
  • Elena Mugellini
  • Omar Abou Khaled
  • Rolf Ingold
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6762)


This paper presents a novel approach to multimodal interaction that combines the user's mental activity (thoughts and emotions), facial expressions and head movements. To avoid problems typical of computer vision (sensitivity to lighting changes, dependence on camera position, etc.), the proposed approach does not use optical techniques. Furthermore, to keep human communication and control smooth and to avoid other environmental artifacts, only non-verbal information is used. Head movements (rotations) are detected by a bi-axial gyroscope; facial expressions and gaze are identified by electromyography and electrooculography; emotions and thoughts are monitored by electroencephalography. To validate the proposed approach, we developed an application in which the user navigates through a virtual world using only head movements; we chose Google Street View as the virtual world. The application was designed for later integration with an electric wheelchair, replacing the virtual world with the real one. A first evaluation of the system is provided.
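For illustration, the mapping from bi-axial gyroscope readings to navigation commands described above could be sketched as follows. The thresholds, axis conventions, and all names here are assumptions for the sketch, not details taken from the paper:

```python
from dataclasses import dataclass

# Hypothetical thresholds (not from the paper): angular velocities above
# these values are treated as deliberate head gestures rather than noise.
YAW_THRESHOLD = 30.0    # deg/s, left/right head rotation
PITCH_THRESHOLD = 25.0  # deg/s, up/down head rotation


@dataclass
class GyroSample:
    """One reading from a bi-axial gyroscope."""
    yaw: float    # deg/s around the vertical axis
    pitch: float  # deg/s around the lateral axis


def classify_head_gesture(sample: GyroSample) -> str:
    """Map a gyroscope sample to a navigation command for the virtual world."""
    # Prefer the dominant axis when both exceed their thresholds.
    if abs(sample.yaw) >= YAW_THRESHOLD and abs(sample.yaw) >= abs(sample.pitch):
        return "turn_right" if sample.yaw > 0 else "turn_left"
    if abs(sample.pitch) >= PITCH_THRESHOLD:
        return "move_forward" if sample.pitch > 0 else "move_backward"
    return "idle"
```

In a full system such commands would be fused with the EMG/EOG and EEG channels before being sent to the Street View client or, later, the wheelchair controller.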


Keywords: Gesture recognition · Brain-Computer Interface · multimodality · navigation through real and virtual worlds · human-computer interaction · psycho-physiological signals





Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  • Francesco Carrino (1, 2)
  • Julien Tscherrig (1)
  • Elena Mugellini (1)
  • Omar Abou Khaled (1)
  • Rolf Ingold (2)
  1. College of Engineering and Architecture of Fribourg, Switzerland
  2. University of Fribourg, Switzerland
