Gesture Recognition and Machine Learning Applied to Sign Language Translation

  • Luis A. Estrada Jiménez
  • Marco E. Benalcázar
  • Nelson Sotomayor
Conference paper
Part of the IFMBE Proceedings book series (IFMBE, volume 60)


In this paper we propose an intelligent system for translating sign language into text. The approach comprises hardware and software: the hardware is a polyester-nylon glove fitted with flex, contact, and inertial sensors, and the software is a classification algorithm based on k-nearest neighbors, decision trees, and dynamic time warping. The system recognizes both static and dynamic gestures, and it can learn to classify the specific gesture patterns of any person. We tested the system at translating 61 letters, numbers, and words from the Ecuadorian sign language. Experimental results show a classification accuracy of 91.55%, a significant improvement over the results obtained in previous related works.
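The classification pipeline described above can be illustrated with a minimal sketch of its dynamic-time-warping component: a nearest-template classifier that labels a gesture by its DTW distance to stored examples. The paper does not publish code, so the function names (`dtw_distance`, `classify_1nn_dtw`) and the use of 1-D feature sequences are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic time warping distance between two 1-D numeric sequences.

    Fills the classic (n+1) x (m+1) cumulative-cost matrix, where each
    cell adds the local distance |a[i]-b[j]| to the cheapest of the
    three allowed predecessor moves (match, insertion, deletion).
    """
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

def classify_1nn_dtw(query, templates):
    """Return the label of the template closest to `query` under DTW.

    `templates` is a list of (label, sequence) pairs recorded per user,
    which is how a system like this one could adapt to the specific
    gesture patterns of each person.
    """
    return min(templates, key=lambda t: dtw_distance(query, t[1]))[0]
```

Because DTW warps the time axis, a gesture performed slightly faster or slower than its stored template still matches; this is why DTW is a common choice for dynamic gestures, while static hand shapes can be handled by simpler classifiers such as k-NN or decision trees over the raw sensor values.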


Keywords: Sign language translation · Gesture recognition · Machine learning · Pattern classification





Copyright information

© Springer Nature Singapore Pte Ltd. 2017

Authors and Affiliations

  • Luis A. Estrada Jiménez (1)
  • Marco E. Benalcázar (2)
  • Nelson Sotomayor (1)
  1. Departamento de Automatización y Control Industrial, Escuela Politécnica Nacional, Quito, Ecuador
  2. Departamento de Informática y Ciencias de la Computación, Escuela Politécnica Nacional, Quito, Ecuador
