Eye Tracking Using Neural Network and Mean-Shift
In this paper, an eye tracking method is presented that uses a neural network (NN) and the mean-shift algorithm to accurately detect and track a user's eyes against a cluttered background. In the proposed method, to cope with rigid head motion, the facial region is first obtained using a skin-color model and connected-component analysis. The eye regions are then localized using an NN-based texture classifier that discriminates the facial region into eye and non-eye classes, which enables the method to accurately detect users' eyes even when they wear glasses. Once the eye regions are localized, they are continuously and correctly tracked by the mean-shift algorithm. To assess the validity of the proposed method, it is applied to an interface system driven by eye movement and is tested with a group of 25 users playing 'aligns' games. The results show that the system processes more than 30 frames/sec on a PC for 320×240 input images and provides user-friendly, convenient access to a computer in real time.
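The mean-shift tracking step described above can be sketched as follows. This is a minimal pure-Python illustration, not the authors' implementation: it assumes a precomputed 2-D eye-likelihood map (e.g. the NN classifier's per-pixel eye scores) and iteratively moves a fixed-size search window to the centroid of the likelihood mass under it until the shift falls below a threshold. All names and parameters here are illustrative.

```python
def mean_shift(prob, cx, cy, w, h, max_iter=20, eps=1.0):
    """Shift a w x h window over a 2-D likelihood map `prob`
    (list of rows) until its centroid converges.

    Returns the converged window center (cx, cy)."""
    rows, cols = len(prob), len(prob[0])
    for _ in range(max_iter):
        # Clamp the current window to the image bounds.
        x0 = max(0, int(cx - w // 2)); x1 = min(cols, x0 + w)
        y0 = max(0, int(cy - h // 2)); y1 = min(rows, y0 + h)
        # Zeroth and first moments of the likelihood inside the window.
        m00 = m10 = m01 = 0.0
        for y in range(y0, y1):
            for x in range(x0, x1):
                p = prob[y][x]
                m00 += p; m10 += p * x; m01 += p * y
        if m00 == 0:          # no likelihood mass under the window
            break
        nx, ny = m10 / m00, m01 / m00   # centroid of the window contents
        converged = abs(nx - cx) < eps and abs(ny - cy) < eps
        cx, cy = nx, ny
        if converged:
            break
    return cx, cy


# Usage: a 20x20 map with a uniform 5x5 "eye" blob centered at (14, 12);
# the window starts off-target and converges onto the blob centroid.
prob = [[0.0] * 20 for _ in range(20)]
for y in range(10, 15):
    for x in range(12, 17):
        prob[y][x] = 1.0
cx, cy = mean_shift(prob, 10, 8, 12, 12)
print(cx, cy)  # converges to (14.0, 12.0)
```

Because each iteration only sums likelihood values inside a small window, the update is cheap enough to run per frame, which is consistent with the real-time frame rates reported above.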
Keywords: Facial Region; Search Window; Cluttered Background; Shift Algorithm; Camera Mouse