FaceMouse: A Human-Computer Interface for Tetraplegic People
This paper proposes a new human-machine interface conceived for people with severe disabilities (specifically tetraplegic people) that allows them to interact with a computer in everyday life by means of the mouse pointer. In this system, called FaceMouse, instead of the classical "pointer paradigm", which requires the user to look at the point to move to, we propose a "derivative paradigm", in which the user does not indicate a precise position but rather the direction along which the mouse pointer must be moved. The proposed system consists of a common, low-cost webcam and a set of computer vision techniques developed to identify the parts of the user's face (the only body part that a tetraplegic person can move) and exploit them to move the pointer. Specifically, the implemented algorithm is based on template matching to track the user's nose and on cross-correlation to compute the best match. Finally, several real applications of the system are described, and experimental results obtained by disabled people are reported.
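The two ingredients mentioned in the abstract, template matching with cross-correlation for nose tracking and the derivative paradigm for pointer control, can be sketched roughly as follows. This is a minimal illustration only, assuming grayscale frames as NumPy arrays, a brute-force normalized cross-correlation search, and hypothetical `gain` and `dead_zone` parameters; none of these implementation details are specified in the paper.

```python
import numpy as np

def track_nose(frame, template):
    """Slide the nose template over the frame and return the (row, col)
    of the best match, scored by normalized cross-correlation."""
    th, tw = template.shape
    fh, fw = frame.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    best_score, best_pos = -np.inf, (0, 0)
    for r in range(fh - th + 1):
        for c in range(fw - tw + 1):
            patch = frame[r:r + th, c:c + tw]
            p = patch - patch.mean()
            denom = np.sqrt((p ** 2).sum()) * t_norm
            if denom == 0:          # flat patch: correlation undefined, skip
                continue
            score = (p * t).sum() / denom
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos, best_score

def pointer_step(nose_pos, rest_pos, gain=2, dead_zone=3):
    """Derivative paradigm: the nose offset from its rest position sets the
    pointer's *velocity* (direction of motion), not its absolute position."""
    dy = nose_pos[0] - rest_pos[0]
    dx = nose_pos[1] - rest_pos[1]
    step = lambda d: 0 if abs(d) < dead_zone else gain * int(np.sign(d))
    return step(dx), step(dy)   # (vx, vy) added to the pointer each frame
```

In practice the quadratic brute-force search would be replaced by a frequency-domain correlation or restricted to a small window around the previous nose position, but the scoring function is the same.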
Keywords: Disabled People · Template Matching · Face Tracking · Mouse Pointer · Computer Vision Techniques