
Implementation of wearable glove for sign language expression based on deep learning

  • Technical Paper
  • Published:
Microsystem Technologies

Abstract

This study proposes a wearable glove for sign language expression based on deep learning. Wearable technology has advanced considerably in fields such as medicine and education, and research on recognizing the sign language of deaf people using wearable technology is actively underway. It is difficult for a deaf person learning sign language for the first time, or for someone who has recently become deaf, to express themselves in sign language. We therefore design a wearable glove and manufacture a prototype to confirm that it can control the fingers. The proposed wearable glove drives the movement of its exoskeleton with a DC motor. For sign language recognition and expression, a deep learning model designed to express 20 Korean words is trained. Because sign language conveys meaning through movements that change over time, the recognition model must be capable of learning temporal patterns. Therefore, three deep learning models, a Simple Recurrent Neural Network (SimpleRNN), Long Short-Term Memory (LSTM), and a Gated Recurrent Unit (GRU), were used for sign language recognition by the wearable glove. The training results of the three models are compared, and training-performance experiments are conducted for different sequence lengths of the training data. Based on the experimental results, the GRU is the most effective sign language recognition model for the proposed wearable glove.
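The paper's implementation is not reproduced in this preview, but the GRU recurrence it identifies as the best-performing recognizer can be sketched briefly. The snippet below is a minimal NumPy illustration, not the authors' code: the input dimension (e.g. one flex-sensor reading per finger per time step), the hidden size, and the random untrained weights are all assumptions; only the 20-class output matches the paper's 20 Korean words.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions -- the preview does not publish the architecture.
INPUT_DIM = 5     # assumed: one flex-sensor value per finger per time step
HIDDEN_DIM = 16   # assumed GRU hidden-state size
NUM_CLASSES = 20  # the 20 Korean sign words from the paper

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def init_gru(in_dim, hid_dim):
    """Random GRU weights: W (input), U (recurrent), b (bias) per gate."""
    def w(rows, cols):
        return rng.normal(0.0, 0.1, size=(rows, cols))
    return {g: (w(in_dim, hid_dim), w(hid_dim, hid_dim), np.zeros(hid_dim))
            for g in ("z", "r", "h")}

def gru_step(x, h, p):
    """One GRU update: z = update gate, r = reset gate, h_tilde = candidate."""
    Wz, Uz, bz = p["z"]
    Wr, Ur, br = p["r"]
    Wh, Uh, bh = p["h"]
    z = sigmoid(x @ Wz + h @ Uz + bz)
    r = sigmoid(x @ Wr + h @ Ur + br)
    h_tilde = np.tanh(x @ Wh + (r * h) @ Uh + bh)
    return (1.0 - z) * h + z * h_tilde

def classify(seq, p, Wo, bo):
    """Run the GRU over a sensor sequence, softmax the final hidden state."""
    h = np.zeros(HIDDEN_DIM)
    for x in seq:
        h = gru_step(x, h, p)
    logits = h @ Wo + bo
    e = np.exp(logits - logits.max())
    return e / e.sum()

params = init_gru(INPUT_DIM, HIDDEN_DIM)
Wo = rng.normal(0.0, 0.1, size=(HIDDEN_DIM, NUM_CLASSES))
bo = np.zeros(NUM_CLASSES)

seq = rng.normal(size=(30, INPUT_DIM))   # 30 time steps of synthetic sensor data
probs = classify(seq, params, Wo, bo)    # probability over the 20 word classes
```

The update gate z lets the GRU carry state across long movements with fewer parameters than an LSTM, which is one plausible reason it trained best on short glove-sensor sequences.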


Data availability

The data sets generated during the current study are available from the corresponding author on reasonable request.


Acknowledgements

This research was funded by a 2021 Research Grant from Sangmyung University (2022-A000-0391).

Author information


Corresponding author

Correspondence to Soo-Whang Baek.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Kim, HJ., Baek, SW. Implementation of wearable glove for sign language expression based on deep learning. Microsyst Technol 29, 1147–1163 (2023). https://doi.org/10.1007/s00542-023-05454-5

