
An optimized real-time hands gesture recognition based interface for individuals with upper-level spinal cord injuries

  • Special Issue Paper
  • Published in Journal of Real-Time Image Processing

Abstract

This paper presents a hand gesture-based interface that facilitates computer interaction for individuals with upper-level spinal cord injuries and offers an alternative way to perform “hands-on” laboratory tasks. The system consists of four modules: hand detection, tracking, trajectory recognition, and actuated device control. A 3D particle filter framework based on color and depth information is proposed to provide a more efficient solution to the problem of tracking the face and hands independently. More specifically, an interaction model utilizing spatial and motion information was integrated into the particle filter framework to tackle the “false merge” and “false labeling” problems that arise during hand interaction and occlusion. A neighborhood search algorithm was employed to obtain an optimal parameter set for the interaction model; applying this set to the tracking module yielded an accuracy of 98.81%. Once the hands were tracked successfully, the acquired gesture trajectories were compared with motion models. The dynamic time warping method was used to align the signals in time, and they were classified by a CONDENSATION algorithm with a recognition accuracy of 97.5%. In a validation experiment, the decoded gestures were passed as commands to a mobile service robot and a robotic arm to perform simulated laboratory tasks. Control policies for gestural control were studied, and the policies achieving the best performance were selected. The computational cost of each system module demonstrated real-time performance.
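The trajectory-recognition step described in the abstract relies on dynamic time warping to align gesture signals performed at different speeds before they are classified. The sketch below shows the standard DTW cumulative-cost recurrence on 1-D trajectories; the function name, the scalar trajectories, and the absolute-difference local cost are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic time warping distance between two 1-D trajectories.

    D[i, j] holds the minimal cost of aligning a[:i] with b[:j];
    each cell extends the cheapest of the three predecessor
    alignments (match, insertion, deletion).
    """
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]
```

Because the warping path may repeat samples, a trajectory and a slowed-down copy of itself align at zero cost; the full cost matrix makes the computation O(nm) in the two trajectory lengths.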





Acknowledgments

This work was partially funded by the National Institutes of Health through the NIH Director’s Pathfinder Award to Promote Diversity in the Scientific Workforce, Grant number DP4-GM096842-01.

Author information


Corresponding author

Correspondence to Hairong Jiang.


About this article

Cite this article

Jiang, H., Wachs, J.P. & Duerstock, B.S. An optimized real-time hands gesture recognition based interface for individuals with upper-level spinal cord injuries. J Real-Time Image Proc 11, 301–314 (2016). https://doi.org/10.1007/s11554-013-0352-3
