A Review on Hand Gesture and Sign Language Techniques for Hearing Impaired Person

  • Chapter
  • First Online:
Machine Learning Techniques for Smart City Applications: Trends and Solutions

Abstract

Sign language has been used by deaf communities for over three centuries. It serves as a platform for messaging and communication, and new sign languages continue to emerge in deaf communities all over the world. In sign languages, the movement and orientation of the hands, arms, and body, together with facial expressions, are used to represent one's thoughts. However, only a small percentage of the public is familiar with sign language. As a result, those who use sign language for everyday communication may have difficulty communicating with hearing people. Remarkable technological advancements have produced hearing aid devices that assist the hearing-impaired community in communicating with others. Hearing aids, however, only help individuals who have not entirely lost their hearing; those with profound hearing loss must rely on sign language to communicate. This chapter reviews studies on hand gestures and the techniques used in the hand gesture and sign language recognition process. It is hoped that this review will give readers direction in the field of gesture and sign language recognition for future work concerning hearing-impaired subjects.
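To make the recognition process concrete, the sketch below shows one of the simplest classifiers covered by such reviews, a k-nearest-neighbour (KNN) classifier applied to hand-feature vectors. This is an illustrative example only, not a method from the chapter: the data are synthetic stand-ins for extracted hand landmarks, and the two "signs" are hypothetical labels.

```python
# Illustrative sketch (not from the chapter): classifying static hand-gesture
# feature vectors with k-nearest neighbour (KNN). The "landmark" vectors and
# sign labels below are synthetic, for demonstration only.
import numpy as np

def knn_predict(train_X, train_y, x, k=3):
    """Return the majority label among the k training vectors nearest to x."""
    dists = np.linalg.norm(train_X - x, axis=1)   # Euclidean distance to each sample
    nearest = np.argsort(dists)[:k]               # indices of the k closest samples
    labels, counts = np.unique(train_y[nearest], return_counts=True)
    return labels[np.argmax(counts)]              # majority vote

# Synthetic feature vectors: two well-separated gesture classes.
rng = np.random.default_rng(0)
class_a = rng.normal(loc=0.0, scale=0.1, size=(20, 6))   # hypothetical sign "A"
class_b = rng.normal(loc=1.0, scale=0.1, size=(20, 6))   # hypothetical sign "B"
X = np.vstack([class_a, class_b])
y = np.array(["A"] * 20 + ["B"] * 20)

print(knn_predict(X, y, np.full(6, 0.95)))  # query near class "B"
```

Real systems differ mainly in how the feature vector is obtained (camera images, Leap Motion Controller, sEMG, or IMU gloves) and in the classifier used (SVM, MLP, CNN, or LSTM in place of KNN), but the train-then-classify structure is the same.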


Abbreviations

2D: 2 Dimensional

3D: 3 Dimensional

API: Application Programming Interface

ARM: Advanced RISC Machine

ASL: American Sign Language

BLE: Bluetooth Low Energy

CNN: Convolutional Neural Network

Faster R-CNN: Faster Region-based Convolutional Neural Network

FRF: Random Forest Regression

IMU: Inertial Measurement Unit

IR 4.0: Industrial Revolution 4.0

KCF: Kernelized Correlation Filters

KNN: K-nearest Neighbor

LMC: Leap Motion Controller

LSTM: Long Short-Term Memory

MLP: Multilayer Perceptron

R-CNN: Region-based Convolutional Neural Network

RISC: Reduced Instruction Set Computer

RNN: Recurrent Neural Network

ROI: Region of Interest

RPN: Region Proposal Network

SDK: Software Development Kit

sEMG: Surface Electromyography

SL: Sign Language

SLR: Sign Language Recognition

SVM: Support Vector Machine

VGG: Visual Geometry Group


Acknowledgements

This research was supported by Universiti Tun Hussein Onn Malaysia (UTHM) and Ministry of Higher Education (MOHE) through Fundamental Research Grant Scheme for Research Acculturation of Early Career (FRGS-RACER) (RACER/1/2019/TK04/UTHM//5).

Author information

Correspondence to Muhammad Mahadi Abdul Jamil.


Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this chapter


Cite this chapter

Salim, S., Jamil, M.M.A., Ambar, R., Wahab, M.H.A. (2022). A Review on Hand Gesture and Sign Language Techniques for Hearing Impaired Person. In: Hemanth, D.J. (eds) Machine Learning Techniques for Smart City Applications: Trends and Solutions. Advances in Science, Technology & Innovation. Springer, Cham. https://doi.org/10.1007/978-3-031-08859-9_4
