Journal of Real-Time Image Processing

Volume 11, Issue 2, pp. 301–314

An optimized real-time hands gesture recognition based interface for individuals with upper-level spinal cord injuries

  • Hairong Jiang
  • Juan P. Wachs
  • Bradley S. Duerstock
Special Issue Paper

Abstract

This paper presents a hand gesture-based interface that enables individuals with upper-level spinal cord injuries to interact with actuated devices, offering an alternative way to perform “hands-on” laboratory tasks. The system consists of four modules: hand detection, tracking, trajectory recognition, and actuated device control. A 3D particle filter framework based on color and depth information is proposed to provide an efficient solution to the problem of tracking the face and both hands independently. Specifically, an interaction model exploiting spatial and motion information was integrated into the particle filter framework to tackle the “false merge” and “false labeling” problems that arise when the hands interact or occlude one another. A neighborhood search algorithm was employed to obtain an optimal parameter set for the interaction model; applying this parameter set to the tracking module yielded an accuracy of 98.81 %. Once the hands were tracked successfully, the acquired gesture trajectories were compared with stored motion models. The dynamic time warping method was used for temporal alignment of the trajectories, and the aligned trajectories were classified by a CONDENSATION algorithm with a recognition accuracy of 97.5 %. In a validation experiment, the decoded gestures were passed as commands to a mobile service robot and a robotic arm to perform simulated laboratory tasks. Control policies for the gestural control were studied, and the policies yielding the best performance were selected. The computational cost of each system module confirmed real-time performance.
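For readers unfamiliar with the time-alignment step, the following is a minimal sketch of dynamic time warping (DTW) over 2D hand trajectories. It is illustrative only: the trajectory format, function names, and the nearest-model decision rule in the usage example are assumptions made for this sketch; the paper itself classifies the aligned trajectories with a CONDENSATION algorithm rather than a nearest-neighbor rule.

```python
import numpy as np

def dtw_distance(traj_a, traj_b):
    """DTW distance between two gesture trajectories, each given as an
    (N, 2) array of hand-centroid image coordinates (assumed format)."""
    n, m = len(traj_a), len(traj_b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(traj_a[i - 1] - traj_b[j - 1])  # local Euclidean cost
            cost[i, j] = d + min(cost[i - 1, j],       # advance along traj_a only
                                 cost[i, j - 1],       # advance along traj_b only
                                 cost[i - 1, j - 1])   # advance along both
    return cost[n, m]

# Hypothetical usage: pick the stored motion model closest to an
# observed trajectory under the DTW distance.
if __name__ == "__main__":
    t = np.linspace(0.0, 2.0 * np.pi, 40)
    models = {
        "circle": np.column_stack((np.cos(t), np.sin(t))),
        "line": np.column_stack((t, t)),
    }
    observed = np.column_stack((np.cos(t[::2]), np.sin(t[::2])))  # subsampled circle
    best = min(models, key=lambda name: dtw_distance(observed, models[name]))
    print("closest motion model:", best)  # -> circle
```

The appeal of DTW here is that it absorbs variation in execution speed: the same gesture performed quickly or slowly warps onto the same motion model before classification.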

Keywords

Gesture recognition · 3D particle filter · Neighborhood search · Dynamic time warping (DTW) · CONDENSATION

Acknowledgments

This work was partially funded by the National Institutes of Health through the NIH Director’s Pathfinder Award to Promote Diversity in the Scientific Workforce, Grant number DP4-GM096842-01.


Copyright information

© Springer-Verlag Berlin Heidelberg 2013

Authors and Affiliations

  • Hairong Jiang (1)
  • Juan P. Wachs (1)
  • Bradley S. Duerstock (2)
  1. School of Industrial Engineering, Purdue University, West Lafayette, USA
  2. School of Industrial Engineering and Weldon School of Biomedical Engineering, Purdue University, West Lafayette, USA
