
Mouth tracking for hands-free robot control systems

  • Regular Papers
  • Robotics and Automation

International Journal of Control, Automation and Systems

Abstract

In this paper, we propose a mouth tracking method for remote robot control systems. The aim of the work is to enable disabled people who cannot operate a keyboard or joystick to control a robot without the use of their hands. The mouth tracking method is based mainly on the AdaBoost feature detection approach. By adding new Haar-like features for detecting the corners of the mouth, both the speed and the accuracy of detection are improved. Combining AdaBoost feature detection with a Kalman filter achieves continuous and accurate mouth tracking. Meanwhile, gripping commands for the robot manipulator are obtained by recognizing mouth shapes, such as a pouting or grinning mouth. To assess the validity of the method, mouth detection experiments and robot cargo transport experiments were conducted. The results indicate that the proposed method achieves quick, accurate mouth tracking and robot operation, and retrieves items successfully.
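The full method is behind the paywall, so as a rough illustration of the pipeline the abstract describes (per-frame Haar/AdaBoost detections smoothed by a Kalman filter), here is a minimal Python sketch. The `Kalman1D` class, the noise parameters, and the simulated detections are illustrative assumptions, not the authors' implementation; a real system would feed in detector output per video frame.

```python
# Sketch: smoothing noisy per-frame mouth-centre detections with a
# scalar Kalman filter per axis (constant-position motion model).
# All names and parameter values here are illustrative assumptions.

class Kalman1D:
    """One-dimensional Kalman filter with a constant-position model."""

    def __init__(self, process_var=1e-2, meas_var=9.0):
        self.x = None          # current state estimate (pixel coordinate)
        self.p = 1.0           # estimate variance
        self.q = process_var   # process noise variance
        self.r = meas_var      # measurement noise variance

    def update(self, z):
        """Fold in one noisy detection z and return the smoothed estimate."""
        if self.x is None:                  # initialise from first detection
            self.x = float(z)
            return self.x
        self.p += self.q                    # predict: uncertainty grows
        k = self.p / (self.p + self.r)      # Kalman gain
        self.x += k * (z - self.x)          # correct toward the measurement
        self.p *= (1.0 - k)                 # uncertainty shrinks after update
        return self.x


def smooth_track(detections):
    """Smooth a sequence of (x, y) mouth-centre detections."""
    fx, fy = Kalman1D(), Kalman1D()
    return [(fx.update(x), fy.update(y)) for x, y in detections]


if __name__ == "__main__":
    # Simulated detector output: true centre (100, 60) plus alternating jitter.
    raw = [(100 + (3 if i % 2 else -3), 60 + (2 if i % 2 else -2))
           for i in range(50)]
    smoothed = smooth_track(raw)
    print(smoothed[-1])   # settles close to the true centre (100, 60)
```

In practice the detections would come from a cascade detector (e.g. OpenCV's `CascadeClassifier.detectMultiScale` on a mouth cascade), and the filter's prediction can also bound the search window for the next frame, which is one common way detection and Kalman tracking are combined.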



Author information


Corresponding author

Correspondence to Phill Kyu Rhee.

Additional information

Recommended by Editor Hyouk Ryeol Choi.

This research was partially supported by an Inha University research grant.

Miyoung Nam received her B.Sc. and M.Sc. degrees in Computer Science from Silla University, Busan, Korea, and her Ph.D. degree in Computer Science & Engineering from Inha University, Korea, in 1995, 2001, and 2006, respectively. Currently, she is the Vice President of R&D at YM-Naeultech, Korea. Her research interests include biometrics, pattern recognition, computer vision, and image processing.

Minhaz Uddin Ahmed received his B.S. and M.S. degrees in Computer Science from the National University, Bangladesh, in 2006 and 2010. He is currently a Ph.D. student in the Intelligent Technology Laboratory, Inha University, Korea. His research interests include visual tracking and surveillance, object detection, image processing, machine intelligence, and cloud computing.

Yan Shen received her B.S. degree in Computer Science from Inha University, Incheon, Korea in 2011. She is currently pursuing a master's degree at Inha University, majoring in computer science and engineering. Her research interests include recommender systems, computer vision, intelligent computers, and HCI.

Phill Kyu Rhee received his B.S. degree in Electrical Engineering from Seoul National University, Seoul, Korea, in 1982, an M.S. degree in Computer Science from East Texas State University, Commerce, Texas, in 1986, and a Ph.D. degree in Computer Science from the University of Louisiana, Lafayette, Louisiana, in 1990. From 1982 to 1985, he worked as a research scientist at the Systems Engineering Research Institute, Seoul, South Korea. In 1991, he joined the Electronics and Telecommunications Research Institute, Seoul, South Korea, as a senior research staff member. From 1992 to 2001, he was an associate professor in the Department of Computer Science and Information Engineering of Inha University, Incheon, South Korea, and since 2001, he has been a professor in the same department. His current research interests are pattern recognition, machine intelligence, and autonomic cloud computing. Dr. Rhee is a member of the IEEE Computer Society and the Korea Information Science Society (KISS).


About this article


Cite this article

Nam, M., Ahmed, M.U., Shen, Y. et al. Mouth tracking for hands-free robot control systems. Int. J. Control Autom. Syst. 12, 628–636 (2014). https://doi.org/10.1007/s12555-012-0473-7
