Visual Human-Machine Interaction
- Alexander Zelinsky
It is envisaged that computers of the future will have smart interfaces, such as speech and vision, that facilitate natural and easy human-machine interaction. Gestures of the face and hands could become a natural way to control the operations of a computer or a machine, such as a robot. In this paper, we present a vision-based interface that tracks, in real time, a person’s facial features and the gaze point of the eyes. The system tracks facial features robustly, detects tracking failures, and recovers from errors automatically. It is insensitive to lighting changes and to occlusions or distortions of the facial features. The system is user independent and calibrates automatically for each new user. Using this technology, we have developed an application for driver fatigue detection and for evaluating the ergonomic design of motor vehicles. Our human-machine interface has enormous potential in other applications that control machines and processes or measure human performance; for example, product possibilities exist in assistive technology for the disabled and in video game entertainment.
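The paper does not give implementation details in the abstract, but the combination described above (robust feature tracking plus failure detection and recovery) can be illustrated with a minimal sketch. The following is an assumption-laden toy version, not the authors' method: it tracks one facial feature by normalized cross-correlation template matching over a small search window, and flags a tracking failure whenever the best correlation score falls below a threshold, at which point a re-detection step could be triggered. The function names, window size, and threshold are all illustrative choices.

```python
import numpy as np

def ncc(a, b):
    # Normalized cross-correlation between two equal-size patches;
    # returns 0.0 for flat (zero-variance) patches, e.g. full occlusion.
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a ** 2).sum() * (b ** 2).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def track_feature(frame, template, prev_xy, search=8, fail_thresh=0.7):
    """Search a window around the feature's previous position for the
    best template match. Returns the new (x, y), the match score, and
    an `ok` flag; `ok=False` signals a tracking failure, after which a
    caller would re-detect the feature rather than keep tracking."""
    th, tw = template.shape
    px, py = prev_xy
    H, W = frame.shape
    best_score, best_xy = -1.0, prev_xy
    for y in range(max(0, py - search), min(H - th, py + search) + 1):
        for x in range(max(0, px - search), min(W - tw, px + search) + 1):
            s = ncc(frame[y:y + th, x:x + tw], template)
            if s > best_score:
                best_score, best_xy = s, (x, y)
    ok = best_score >= fail_thresh
    return best_xy, best_score, ok
```

In a full system one tracker instance per facial feature would run every frame, with the failure flags of all features feeding an error-recovery stage; libraries such as OpenCV provide optimized equivalents of this matching step.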
- Book Title: Advanced Topics in Artificial Intelligence
- Book Subtitle: 12th Australian Joint Conference on Artificial Intelligence, AI’99, Sydney, Australia, December 6–10, 1999, Proceedings
- Pages: 440–452
- Series Title: Lecture Notes in Computer Science
- Publisher: Springer Berlin Heidelberg
- Copyright Holder: Springer-Verlag Berlin Heidelberg