FaceMouse: A Human-Computer Interface for Tetraplegic People

  • Emanuele Perini
  • Simone Soria
  • Andrea Prati
  • Rita Cucchiara
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3979)


This paper proposes a new human-machine interface conceived specifically for people with severe disabilities (in particular, tetraplegic people), which allows them to interact with a computer in everyday life by means of the mouse pointer. In this system, called FaceMouse, instead of the classical "pointer paradigm", which requires the user to look at the point the pointer should move to, we propose a "derivative paradigm", in which the user indicates not the precise position but the direction along which the mouse pointer must move. The proposed system consists of a common, low-cost webcam and a set of computer vision techniques developed to identify parts of the user's face (often the only body part a tetraplegic person can move) and exploit them to move the pointer. Specifically, the implemented algorithm uses template matching to track the user's nose, with cross-correlation used to score the best match. Finally, several real applications of the system are described, and experimental results obtained with disabled users are reported.
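The two ingredients named in the abstract — template matching scored by cross-correlation to track the nose, and a "derivative paradigm" that turns the nose's offset into pointer *velocity* rather than pointer *position* — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names, the normalized form of the correlation, and the `gain`, `dead_zone`, and `search_radius` parameters are all assumptions introduced here.

```python
import numpy as np

def normalized_cross_correlation(patch, template):
    """Zero-mean normalized cross-correlation between two equal-sized patches.

    Returns a score in [-1, 1]; 1.0 means a perfect match (up to brightness/contrast).
    """
    p = patch - patch.mean()
    t = template - template.mean()
    denom = np.sqrt((p * p).sum() * (t * t).sum())
    return 0.0 if denom == 0 else float((p * t).sum() / denom)

def track_nose(frame, template, prev_xy, search_radius=12):
    """Slide the nose template over a window around its previous position
    and return the best-matching top-left corner and its score."""
    th, tw = template.shape
    px, py = prev_xy
    best_score, best_xy = -1.0, prev_xy
    for y in range(max(0, py - search_radius), min(frame.shape[0] - th, py + search_radius) + 1):
        for x in range(max(0, px - search_radius), min(frame.shape[1] - tw, px + search_radius) + 1):
            score = normalized_cross_correlation(frame[y:y + th, x:x + tw], template)
            if score > best_score:
                best_score, best_xy = score, (x, y)
    return best_xy, best_score

def update_pointer(pointer_xy, nose_xy, rest_xy, gain=0.5, dead_zone=3):
    """Derivative paradigm: the nose's offset from its rest position sets the
    pointer's velocity, so holding the head off-center keeps the pointer moving."""
    dx = nose_xy[0] - rest_xy[0]
    dy = nose_xy[1] - rest_xy[1]
    if abs(dx) < dead_zone:  # small dead zone absorbs involuntary tremor
        dx = 0
    if abs(dy) < dead_zone:
        dy = 0
    return (pointer_xy[0] + gain * dx, pointer_xy[1] + gain * dy)
```

In a capture loop, `track_nose` would run once per webcam frame and `update_pointer` would accumulate the pointer position; the dead zone illustrates one plausible way to keep the pointer still when the head is near its rest pose.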


Keywords: Disabled People · Template Matching · Face Tracking · Mouse Pointer · Computer Vision Techniques





Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Emanuele Perini (1)
  • Simone Soria (1)
  • Andrea Prati (1)
  • Rita Cucchiara (1)

  1. Dipartimento di Ingegneria dell'Informazione, University of Modena and Reggio Emilia, Modena, Italy
