Visual Human-Machine Interaction

  • Alexander Zelinsky
Conference paper

DOI: 10.1007/3-540-46695-9_37

Part of the Lecture Notes in Computer Science book series (LNCS, volume 1747)
Cite this paper as:
Zelinsky A. (1999) Visual Human-Machine Interaction. In: Foo N. (eds) Advanced Topics in Artificial Intelligence. AI 1999. Lecture Notes in Computer Science, vol 1747. Springer, Berlin, Heidelberg


It is envisaged that computers of the future will have smart interfaces such as speech and vision, which will facilitate natural and easy human-machine interaction. Gestures of the face and hands could become a natural way to control the operations of a computer or a machine, such as a robot. In this paper, we present a vision-based interface that tracks, in real time, a person's facial features and the gaze point of the eyes. The system tracks facial features robustly, detects tracking failures, and recovers from errors automatically. It is insensitive to lighting changes and to occlusions or distortion of the facial features. The system is user independent and calibrates automatically for each user. An application of this technology to driver fatigue detection and to the ergonomic evaluation of motor vehicle design has been developed. Our human-machine interface has enormous potential in other applications that control machines and processes or measure human performance; for example, product possibilities exist in assistive technology for the disabled and in video game entertainment.
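The abstract does not detail the tracking algorithm, but trackers of this kind commonly match small templates of each facial feature against a search window in the new frame, treating a low match score as a tracking failure that triggers re-detection. The sketch below illustrates that idea using zero-mean normalized cross-correlation; the function names, window size, and failure threshold are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def ncc(patch, template):
    """Zero-mean normalized cross-correlation between two equal-size patches."""
    p = patch - patch.mean()
    t = template - template.mean()
    denom = np.sqrt((p * p).sum() * (t * t).sum())
    return float((p * t).sum() / denom) if denom > 0 else 0.0

def track_feature(frame, template, prev_xy, search=5, fail_thresh=0.6):
    """Search a window around the previous location for the best template match.

    Returns (best_xy, matched); matched=False signals a tracking failure,
    which a full system would handle by re-detecting the feature
    (automatic error recovery). All names here are hypothetical.
    """
    h, w = template.shape
    x0, y0 = prev_xy
    best_score, best_xy = -1.0, prev_xy
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            x, y = x0 + dx, y0 + dy
            if x < 0 or y < 0 or y + h > frame.shape[0] or x + w > frame.shape[1]:
                continue  # candidate window falls outside the frame
            score = ncc(frame[y:y + h, x:x + w], template)
            if score > best_score:
                best_score, best_xy = score, (x, y)
    return best_xy, best_score >= fail_thresh
```

In a real-time system this per-feature search would run on every video frame, with the templates refreshed after a successful recovery to cope with lighting changes.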



Copyright information

© Springer-Verlag Berlin Heidelberg 1999

Authors and Affiliations

  • Alexander Zelinsky
    1. Research School of Information Sciences and Engineering, Australian National University, Canberra, Australia
