Mouth tracking for hands-free robot control systems
In this paper, we propose a mouth-tracking method for remote robot control systems. The aim of the work is to enable people with disabilities who cannot operate a keyboard or joystick to control a robot without using their hands. The mouth-tracking method is based primarily on the AdaBoost feature-detection approach. By adding new Haar-like features for detecting the corners of the mouth, both the speed and the accuracy of detection are improved. Combining AdaBoost feature detection with a Kalman filter yields continuous and accurate mouth tracking. In addition, gripping commands for the robot manipulator are obtained by recognizing mouth shapes, such as a pouting or a grinning mouth. To assess the validity of the method, mouth-detection experiments and robot cargo-transport experiments were conducted. The results indicate that the proposed method achieves quick, accurate mouth tracking and robot operation, retrieving items successfully.
Keywords: AdaBoost, Haar-like features, human-computer interaction (HCI), region of interest (ROI)
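The tracking stage described in the abstract, which smooths per-frame AdaBoost/Haar detections with a Kalman filter, can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes a constant-velocity motion model over the 2D mouth-center coordinate, with the function name `kalman_track` and the noise parameters chosen here for demonstration only. Frames where the detector fails can be passed as `None`, in which case the filter coasts on its prediction.

```python
import numpy as np

def kalman_track(measurements, dt=1.0, process_var=1e-2, meas_var=4.0):
    """Constant-velocity Kalman filter over 2D (x, y) detections.

    Smooths per-frame mouth-center detections (e.g. from a Haar
    cascade) into a continuous track; `None` entries denote frames
    with no detection and are bridged by the motion model.
    """
    # State vector: [x, y, vx, vy]; measurement vector: [x, y]
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1,  0],
                  [0, 0, 0,  1]], dtype=float)   # state transition
    H = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]], dtype=float)    # observation model
    Q = process_var * np.eye(4)                  # process noise covariance
    R = meas_var * np.eye(2)                     # measurement noise covariance

    # Initialize at the first detection with zero velocity.
    x = np.array([*measurements[0], 0.0, 0.0])
    P = np.eye(4) * 10.0
    track = []
    for z in measurements:
        # Predict step
        x = F @ x
        P = F @ P @ F.T + Q
        if z is not None:
            # Update step with the new detection
            innovation = np.asarray(z, dtype=float) - H @ x
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
            x = x + K @ innovation
            P = (np.eye(4) - K @ H) @ P
        track.append(x[:2].copy())
    return np.array(track)
```

For roughly linear mouth motion between frames, the filtered track has a lower error against the true trajectory than the raw noisy detections, which is the practical benefit of pairing the detector with a Kalman filter.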