Mouth tracking for hands-free robot control systems

  • Miyoung Nam
  • Minhaz Uddin Ahmed
  • Yan Shen
  • Phill Kyu Rhee
Regular Papers: Robotics and Automation

Abstract

In this paper, we propose a mouth tracking method for remote robot control systems. The main idea behind the work is to help disabled people who cannot operate a keyboard or joystick to control a robot without using their hands. The mouth tracking method is based mainly on the AdaBoost feature detection approach. By adding new Haar-like features for detecting the corners of the mouth, the speed and accuracy of detection are improved. Combining the AdaBoost feature detection with a Kalman filter achieves continuous and accurate mouth tracking. Gripping commands for the robot manipulator are obtained by recognizing the mouth shape, such as a pouting or a grinning mouth. To assess the validity of the method, mouth detection experiments and robot cargo transport experiments were conducted. The results indicate that the proposed method achieves quick and accurate mouth tracking and robot operation, retrieving items successfully.
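
The abstract outlines a three-stage pipeline: AdaBoost-boosted Haar-like features detect the mouth, a Kalman filter keeps the track continuous across frames, and the recognized mouth shape is mapped to gripper commands. The sketch below illustrates such a detect-then-track loop in Python with OpenCV; it is not the authors' implementation. The stock haarcascade_frontalface_default.xml and haarcascade_smile.xml cascades are assumptions standing in for the paper's custom corner-of-mouth features, and the filter uses an assumed constant-velocity motion model.

    # Minimal sketch of a Haar-cascade + Kalman mouth tracker (not the
    # authors' code). Stock OpenCV cascades stand in for the paper's
    # custom corner-of-mouth Haar-like features.
    import cv2
    import numpy as np

    # Constant-velocity Kalman model: state (x, y, dx, dy), measurement (x, y).
    kf = cv2.KalmanFilter(4, 2)
    kf.measurementMatrix = np.array([[1, 0, 0, 0],
                                     [0, 1, 0, 0]], np.float32)
    kf.transitionMatrix = np.array([[1, 0, 1, 0],
                                    [0, 1, 0, 1],
                                    [0, 0, 1, 0],
                                    [0, 0, 0, 1]], np.float32)
    kf.processNoiseCov = np.eye(4, dtype=np.float32) * 1e-3

    face_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    mouth_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_smile.xml")  # stand-in cascade

    cap = cv2.VideoCapture(0)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        pred = kf.predict()  # predicted mouth centre; carries the track
                             # through frames where detection fails

        faces = face_cascade.detectMultiScale(gray, 1.3, 5)
        if len(faces) > 0:
            x, y, w, h = faces[0]
            # Search only the lower half of the face (the ROI idea from
            # the keywords) to speed up and stabilise mouth detection.
            roi = gray[y + h // 2: y + h, x: x + w]
            mouths = mouth_cascade.detectMultiScale(roi, 1.3, 10)
            if len(mouths) > 0:
                mx, my, mw, mh = mouths[0]
                cx = np.float32(x + mx + mw / 2)
                cy = np.float32(y + h // 2 + my + mh / 2)
                kf.correct(np.array([[cx], [cy]]))  # fuse detection into track

        px, py = int(pred[0, 0]), int(pred[1, 0])
        cv2.circle(frame, (px, py), 5, (0, 255, 0), -1)
        cv2.imshow("mouth tracking", frame)
        if cv2.waitKey(1) & 0xFF == 27:  # Esc quits
            break

    cap.release()
    cv2.destroyAllWindows()

Mapping mouth shape to gripping commands could, under the same assumptions, be approximated by thresholding the width-to-height ratio of the detected mouth box (a grin is wide and flat, a pout narrow and tall), although the abstract does not specify the authors' shape classifier.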

Keywords

AdaBoost, Haar-like features, human-computer interaction (HCI), region of interest (ROI)

Copyright information

© Institute of Control, Robotics and Systems and The Korean Institute of Electrical Engineers and Springer-Verlag Berlin Heidelberg 2014

Authors and Affiliations

  • Miyoung Nam (1)
  • Minhaz Uddin Ahmed (1)
  • Yan Shen (1)
  • Phill Kyu Rhee (1)
  1. ITLab, Department of Computer and Information Engineering, Inha University, Incheon, Korea
