Towards EMG Based Gesture Recognition for Indian Sign Language Interpretation Using Artificial Neural Networks

  • Abhiroop Kaginalkar
  • Anita Agrawal
Conference paper
Part of the Communications in Computer and Information Science book series (CCIS, volume 528)


There are several data-acquisition techniques for gesture recognition, with applications ranging from prosthetic and autonomous control to human-computer interfacing. Most typical techniques depend on image processing and can face portability hurdles. This paper discusses a method for classifying gestures from surface EMG (sEMG) readings, thereby allowing user portability. The sEMG readings, acquired from the upper forearm, provide a direction towards gesture recognition for Indian Sign Language (ISL) interpretation. An Artificial Neural Network (ANN) trained with the Scaled Conjugate Gradient (SCG) algorithm is used to process the data and classify gestures with an accuracy of 97.5%. The training involved 120 samples corresponding to four distinct wrist gestures. Additionally, this paper lays the foundations for user-independent adaptability.
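The pipeline the abstract describes — features extracted from sEMG windows and fed to a small feed-forward network — can be sketched as follows. This is a minimal illustration on synthetic data, not the authors' implementation: the number of channels, the window length, and the use of per-channel RMS amplitude as the feature are assumptions, and since scikit-learn does not implement the Scaled Conjugate Gradient optimizer used in the paper, L-BFGS is used as a stand-in solver.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

N_GESTURES = 4   # four wrist gestures, as in the paper
N_SAMPLES = 120  # 120 samples in total, as in the paper
N_CHANNELS = 2   # assumed number of sEMG electrodes
WINDOW = 256     # assumed window length per sample

def rms_features(window):
    """Root-mean-square amplitude per channel (a common sEMG feature)."""
    return np.sqrt(np.mean(window ** 2, axis=0))

# Synthetic stand-in for recorded sEMG: each gesture modulates the
# channel amplitudes differently, so RMS features separate the classes.
X, y = [], []
for label in range(N_GESTURES):
    for _ in range(N_SAMPLES // N_GESTURES):
        scale = 1.0 + 0.5 * label + 0.1 * rng.standard_normal(N_CHANNELS)
        window = rng.standard_normal((WINDOW, N_CHANNELS)) * scale
        X.append(rms_features(window))
        y.append(label)
X, y = np.array(X), np.array(y)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0)

# Small feed-forward network; 'lbfgs' stands in for the paper's
# Scaled Conjugate Gradient, which scikit-learn does not provide.
clf = MLPClassifier(hidden_layer_sizes=(10,), solver="lbfgs",
                    max_iter=2000, random_state=0)
clf.fit(X_train, y_train)
print(f"test accuracy: {clf.score(X_test, y_test):.2f}")
```

On real recordings, the feature step would operate on band-pass-filtered, rectified sEMG windows rather than white noise, and the reported 97.5% figure applies only to the authors' own data and network configuration.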


Keywords: Human-computer interaction · Biomedical electronics · Artificial neural networks · Sign language interpretation · EMG



Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  1. Birla Institute of Technology and Science – Pilani, K.K. Birla Goa Campus, Goa, India
