Zelinsky A. (1999) Visual Human-Machine Interaction. In: Foo N. (eds) Advanced Topics in Artificial Intelligence. AI 1999. Lecture Notes in Computer Science, vol 1747. Springer, Berlin, Heidelberg
It is envisaged that computers of the future will have smart interfaces such as speech and vision, which will facilitate natural and easy human-machine interaction. Gestures of the face and hands could become a natural way to control the operations of a computer or a machine, such as a robot. In this paper, we present a vision-based interface that tracks a person's facial features and eye-gaze point in real time. The system tracks facial features robustly, detects tracking failures and recovers from them automatically. It is insensitive to lighting changes and to occlusions or distortions of the facial features, and it is user independent, calibrating automatically for each new user. An application of this technology to driver fatigue detection and to the evaluation of the ergonomic design of motor vehicles has been developed. Our human-machine interface has enormous potential in other applications that control machines and processes or measure human performance; for example, product possibilities exist in assistance for the disabled and in video game entertainment.
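To make the abstract's notion of failure-aware feature tracking concrete, the following is a minimal, illustrative sketch (not the paper's actual implementation) of template-based tracking with a match-quality test: a feature template is located in a frame by exhaustive sum-of-squared-differences search, and a poor best score flags a tracking failure so a recovery step (e.g. re-detection) can be triggered. Frames are plain 2-D lists of grey values; all names (`track_feature`, `fail_thresh`) and the threshold value are hypothetical.

```python
def ssd(patch_a, patch_b):
    """Sum of squared differences between two equally sized patches."""
    return sum(
        (a - b) ** 2
        for row_a, row_b in zip(patch_a, patch_b)
        for a, b in zip(row_a, row_b)
    )

def extract(frame, top, left, h, w):
    """Cut an h-by-w patch out of the frame at (top, left)."""
    return [row[left:left + w] for row in frame[top:top + h]]

def track_feature(frame, template, fail_thresh=50.0):
    """Locate `template` in `frame` by exhaustive SSD search.

    Returns (row, col, ok). `ok` is False when even the best match is
    poor, signalling a tracking failure so that the caller can run an
    automatic recovery step instead of trusting the position.
    """
    h, w = len(template), len(template[0])
    best = None
    for top in range(len(frame) - h + 1):
        for left in range(len(frame[0]) - w + 1):
            score = ssd(extract(frame, top, left, h, w), template)
            if best is None or score < best[0]:
                best = (score, top, left)
    score, top, left = best
    per_pixel = score / (h * w)  # normalise by patch size
    return top, left, per_pixel < fail_thresh
```

In a real-time tracker, the search would be restricted to a small window around the previous position and the template periodically updated; the key idea sketched here is that the match score doubles as a failure detector.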