Abstract
This study proposes a deep learning-based wearable glove for sign language expression. Wearable technology has advanced in many fields, such as medicine and education, and research on recognizing the sign language of deaf people using wearable technology is actively underway. It is difficult for a deaf person learning sign language for the first time, or for someone who has recently become deaf, to express themselves in sign language. Therefore, we design a wearable glove and build a prototype based on this design to confirm that it can control the fingers. The proposed wearable glove drives the movement of an exoskeleton with a DC motor. For sign language recognition and expression with the wearable glove, a deep learning model designed to express 20 Korean words is trained. Because sign language conveys meaning through movements that change over time, the deep learning model for sign language recognition must be capable of learning temporal patterns. Therefore, in this study, three recurrent deep learning models, the Simple Recurrent Neural Network (SimpleRNN), Long Short-Term Memory (LSTM), and Gated Recurrent Unit (GRU), were used for sign language recognition with the wearable glove. The training results of the three models are compared, and training-performance experiments are conducted according to the sequence length of the training data. Based on the experimental results, the GRU is the most effective sign language recognition model for the proposed wearable glove.
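The recurrent models compared in the abstract all process the glove's sensor readings frame by frame, carrying a hidden state across time steps; the GRU, which performed best, does this with an update gate and a reset gate. The following is a minimal NumPy sketch of a single GRU cell applied to a hypothetical sensor sequence. The dimensions (6 input channels, 8 hidden units, 30 frames) and the random weights are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def gru_step(x, h, W, U, b):
    """One GRU time step: x is the current input frame, h the previous
    hidden state. W, U, b hold weights for the update (z), reset (r),
    and candidate (n) computations."""
    def sig(a):
        return 1.0 / (1.0 + np.exp(-a))
    z = sig(W["z"] @ x + U["z"] @ h + b["z"])             # update gate
    r = sig(W["r"] @ x + U["r"] @ h + b["r"])             # reset gate
    n = np.tanh(W["n"] @ x + U["n"] @ (r * h) + b["n"])   # candidate state
    return (1.0 - z) * n + z * h                          # blend old/new state

rng = np.random.default_rng(0)
in_dim, hid = 6, 8   # hypothetical: 6 flex/motion channels per frame
W = {k: rng.standard_normal((hid, in_dim)) * 0.1 for k in "zrn"}
U = {k: rng.standard_normal((hid, hid)) * 0.1 for k in "zrn"}
b = {k: np.zeros(hid) for k in "zrn"}

h = np.zeros(hid)
for t in range(30):  # a 30-frame sensor sequence
    h = gru_step(rng.standard_normal(in_dim), h, W, U, b)
# h now summarizes the whole sequence and could feed a 20-way
# softmax classifier, one class per Korean sign language word.
```

The gating is what lets the model handle the varying sequence lengths examined in the experiments: the update gate decides how much of the previous state to keep at each frame, so relevant early movements are not washed out over long sequences.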
Data availability
The data sets generated during the current study are available from the corresponding author on reasonable request.
Acknowledgements
This research was funded by a 2021 Research Grant from Sangmyung University (2022-A000-0391).
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Kim, HJ., Baek, SW. Implementation of wearable glove for sign language expression based on deep learning. Microsyst Technol 29, 1147–1163 (2023). https://doi.org/10.1007/s00542-023-05454-5