Computer Interface Using Eye Tracking for Handicapped People
In this paper, a computer interface for handicapped people is proposed, in which input signals are given by the user's eye movements. Eye movement is detected by a neural network (NN)-based texture classifier, which frees the system from requiring a constrained environment. To be robust to the natural motion of a user, we first detect the user's face using skin-color information, and then detect his or her eyes using the NN-based texture classifier. After the eyes are detected, tracking is performed using the mean-shift algorithm. We use this eye-tracking system as an interface to control surrounding devices such as audio equipment, a TV, lights, a phone, and so on. The experimental results verify the feasibility and validity of the proposed eye-tracking system as an interface for handicapped people.
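The pipeline described above (skin-color face detection followed by mean-shift tracking of the detected region) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the NN-based texture classifier is replaced by simple Cr/Cb thresholding, and the threshold values and image layout are assumptions chosen for the demonstration.

```python
import numpy as np

def skin_mask(img_ycrcb):
    """Segment skin-colored pixels by thresholding the Cr and Cb channels.

    The threshold range used here is a commonly cited heuristic, not the
    paper's classifier, and is assumed for illustration only.
    """
    cr, cb = img_ycrcb[..., 1], img_ycrcb[..., 2]
    return (cr > 133) & (cr < 173) & (cb > 77) & (cb < 127)

def mean_shift(prob, window, n_iter=10):
    """Shift an (x, y, w, h) search window toward the centroid of the
    probability mass inside it, iterating until it stops moving."""
    x, y, w, h = window
    H, W = prob.shape
    for _ in range(n_iter):
        roi = prob[y:y + h, x:x + w]
        m = roi.sum()
        if m == 0:          # no mass in the window: nothing to track
            break
        ys, xs = np.mgrid[0:roi.shape[0], 0:roi.shape[1]]
        cx = int((xs * roi).sum() / m)   # centroid within the window
        cy = int((ys * roi).sum() / m)
        # Re-center the window on the centroid, clamped to image bounds.
        x = min(max(x + cx - w // 2, 0), W - w)
        y = min(max(y + cy - h // 2, 0), H - h)
    return x, y, w, h

# Synthetic demo: a 20x20 "skin" patch in an otherwise non-skin image.
img = np.zeros((100, 100, 3))
img[60:80, 60:80, 1] = 150   # Cr inside the skin range
img[60:80, 60:80, 2] = 100   # Cb inside the skin range
mask = skin_mask(img)
win = mean_shift(mask.astype(float), (50, 50, 30, 30))
print(win)
```

In a real system the probability map would come from per-frame skin or texture classification, and the converged window from one frame would seed the search in the next, which is what makes mean-shift suitable for tracking.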
Keywords: Handicapped People · Computer Interface · Facial Region · Search Window · Menu Item