Real-Time Single Camera Hand Gesture Recognition System for Remote Deaf-Blind Communication

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8853)


This paper presents a fast approach to marker-less full-DOF hand tracking that relies only on depth information from a single depth camera. The system can serve many applications, ranging from tele-presence to remote control of robotic actuators and interaction with 3D virtual environments. We apply the proposed technology to the remote transmission of signs from Tactile Sign Languages (i.e., Sign Languages conveyed through tactile feedback), enabling non-invasive remote communication not only among deaf-blind users but also with deaf, blind, and hearing people proficient in Sign Languages. We show that our approach paves the way to fluid, natural remote communication for deaf-blind people, which has been impossible until now. This system is a first prototype for the PARLOMA project, which aims to design a remote communication system for deaf-blind people.


Real-time · Markerless Hand Tracking · Hand Gesture Recognition · Tactile Sign-Language Communication · Haptic Interface





Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

  1. Politecnico di Torino, Turin, Italy
  2. Institute of Electronics, Computer and Telecommunication Engineering, National Research Council of Italy, Padova, Italy
  3. Politecnico di Milano, Milano, Italy
  4. The BioRobotics Institute, Scuola Superiore Sant’Anna, Pisa, Italy
