Towards Hands-Free Interfaces Based on Real-Time Robust Facial Gesture Recognition

  • Cristina Manresa-Yee
  • Javier Varona
  • Francisco J. Perales
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4069)

Abstract

Perceptual user interfaces are becoming increasingly important because they offer more natural interaction with the computer through speech recognition, haptics, computer vision techniques, and so on. In this paper we present a visual-based interface (VBI) that analyzes the user's facial gestures and motion. The interface works in real time on images captured by a conventional webcam, so it must recognize gestures robustly in standard webcam-quality images. The system automatically finds the user's face and tracks it over time in order to recognize gestures within the face region. We then propose a new information fusion procedure that combines the outputs of several computer vision algorithms, and its results are used to carry out a robust recognition process. Finally, we show how the system can replace a conventional mouse for human-computer interaction: the head's motion controls the pointer, and detected eye winks trigger mouse events.
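The interaction loop summarized above (head motion drives the pointer; a sustained eye wink fires a click) can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the `gain`, `dead_zone`, and `threshold` values are assumed for the example, and the tracked nose point is taken as given (the paper obtains it from face detection and tracking).

```python
def cursor_delta(prev, curr, gain=8.0, dead_zone=2.0):
    """Map nose-point motion between two frames to a relative cursor move.

    prev, curr: (x, y) image coordinates of the tracked nose point.
    gain: cursor pixels per pixel of head motion (illustrative value).
    dead_zone: motions shorter than this are treated as tracker jitter,
               which matters at webcam image quality.
    """
    dx = curr[0] - prev[0]
    dy = curr[1] - prev[1]
    if (dx * dx + dy * dy) ** 0.5 < dead_zone:
        return (0, 0)  # suppress noise: no cursor motion
    return (int(round(gain * dx)), int(round(gain * dy)))


def wink_event(closed_frames, threshold=4):
    """Report a click once one eye has stayed closed for `threshold`
    consecutive frames (illustrative value), so that deliberate winks
    are distinguished from involuntary blinks."""
    return closed_frames >= threshold
```

In a full system these two functions would sit inside the per-frame loop: the tracker updates the nose position, `cursor_delta` moves the pointer, and a per-eye closed-frame counter feeds `wink_event` to generate mouse-button events.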

Keywords

Gesture Recognition · Facial Gesture Recognition · Computer Vision Technique · Nose Region · Nose Point



Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Cristina Manresa-Yee (1)
  • Javier Varona (1)
  • Francisco J. Perales (1)
  1. Departament de Matemàtiques i Informàtica, Universitat de les Illes Balears, Palma
