Assessment of human head pose in human-computer interaction
A system is described that estimates a user's head pose from a video image of the face. The system comprises three base algorithms: segmentation, detection of facial markers, and assessment of the direction of motion. The direction of head motion is determined from the dynamics of the changing geometrical relations between facial markers across the image sequence. It is shown that the angle formed by the straight lines connecting the corners of the eyes and the tip of the nose changes with dynamics similar to those of head yaw. The system operates in real time (7 fps) and assesses the direction of motion with high precision (p = 0.95).
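The yaw cue described above can be illustrated with a small geometric sketch. The function names, landmark coordinates, and thresholds below are hypothetical, assumed only for this example; the abstract does not specify the actual marker-tracking implementation.

```python
import math

def apex_angle(left_eye, right_eye, nose_tip):
    """Angle (degrees) at the nose tip between the straight lines to the
    two eye corners; its frame-to-frame change tracks head yaw."""
    v1 = (left_eye[0] - nose_tip[0], left_eye[1] - nose_tip[1])
    v2 = (right_eye[0] - nose_tip[0], right_eye[1] - nose_tip[1])
    cos_a = (v1[0] * v2[0] + v1[1] * v2[1]) / (math.hypot(*v1) * math.hypot(*v2))
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))

def yaw_direction(frames, eps=1.0):
    """Classify yaw from a sequence of (left_eye, right_eye, nose_tip)
    landmark triples in image coordinates.  A change in the apex angle
    beyond `eps` degrees signals a turn; the nose tip's horizontal offset
    from the eye-line midpoint gives the side.  Thresholds are illustrative."""
    a_first = apex_angle(*frames[0])
    a_last = apex_angle(*frames[-1])
    if abs(a_last - a_first) < eps:
        return "stationary"
    left_eye, right_eye, nose_tip = frames[-1]
    mid_x = (left_eye[0] + right_eye[0]) / 2.0
    return "left" if nose_tip[0] < mid_x else "right"

# Frontal pose vs. a head turned to the left (projected landmarks shift):
frontal = ((-30, 0), (30, 0), (0, 40))
turned = ((-25, 0), (28, 0), (-10, 40))
print(yaw_direction([frontal, turned]))  # → left
```

A full pipeline would obtain the landmark coordinates per frame from the segmentation and marker-detection stages; the sketch only covers the final direction-assessment step.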
Keywords: video image, interface, video snapshots