Automatic recognition of the American sign language fingerspelling alphabet to assist people living with speech or hearing impairments

Original Research

Abstract

Sign languages are natural languages used mostly by deaf and hard-of-hearing people. Development opportunities for people with these disabilities are often limited by communication barriers. Advances in sign and gesture recognition technology are making computer-supported interpretation of sign languages possible. There are more than 137 different sign languages around the world; therefore, a system that interprets them could be beneficial to all, especially to the Deaf community. This paper presents a sign recognition system based on hand tracking devices (Leap Motion and Intel RealSense). The system uses a Support Vector Machine for sign classification. The system was evaluated with over 50 individuals, and remarkable recognition accuracy was achieved with selected signs (100% accuracy was achieved in recognizing some signs). Furthermore, the potential of the Leap Motion and the Intel RealSense as hand tracking devices for sign language recognition was explored using the American Sign Language fingerspelling alphabet.
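To illustrate the classification approach described above, the following is a minimal sketch of training a Support Vector Machine on hand-tracking feature vectors with scikit-learn's `SVC`. The feature layout (five fingertip positions, three coordinates each) and the synthetic clustered data are illustrative assumptions, not the paper's actual feature set or recordings.

```python
# Hypothetical sketch: classifying fingerspelled signs from hand-tracking
# feature vectors with an SVM. The 15-dimensional feature layout
# (5 fingertips x 3 coordinates) is an assumption for illustration only.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Simulate 3 signs with 40 samples each: one cluster center per sign,
# plus small per-sample noise (stands in for real device readings).
n_signs, n_samples, n_features = 3, 40, 15
centers = rng.normal(0.0, 1.0, size=(n_signs, n_features))
X = np.vstack([c + 0.05 * rng.normal(size=(n_samples, n_features))
               for c in centers])
y = np.repeat(np.arange(n_signs), n_samples)

# Hold out 25% of the samples for evaluation.
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

# RBF-kernel SVM; scikit-learn handles multiclass via one-vs-one.
clf = SVC(kernel="rbf", C=10.0, gamma="scale")
clf.fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)
print(f"held-out accuracy: {accuracy:.2f}")
```

In a real pipeline the rows of `X` would come from the tracking device's per-frame hand skeleton rather than synthetic clusters, and the labels from recorded fingerspelling sessions.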

Keywords

American sign language · Leap Motion · Intel RealSense · Support vector machine · Automatic sign language recognition · Natural user interfaces

Acknowledgements

This work was partially supported by the Escuela de Ciencias de la Computación e Informática at Universidad de Costa Rica (ECCI-UCR) grant No. 320-B5-291, by Centro de Investigaciones en Tecnologías de la Información y Comunicación de la Universidad de Costa Rica (CITIC-UCR), and by Ministerio de Ciencia, Tecnología y Telecomunicaciones (MICITT) and Consejo Nacional para Investigaciones Científicas y Tecnológicas (CONICIT) of the Government of Costa Rica.


Copyright information

© Springer-Verlag Berlin Heidelberg 2017

Authors and Affiliations

  1. Computer Science and Informatics Department (ECCI), University of Costa Rica, Montes de Oca, Costa Rica
  2. Research Center for Communication and Information Technologies (CITIC), University of Costa Rica, Montes de Oca, Costa Rica
