
A Portable Finger Language Translator Based on Deep Learning with Leap Motion

  • Dahye Jin
  • Yumi Lee
  • Ermal Elbasani
  • Jae Sung Choi
  • Hyun Lee (corresponding author)
Conference paper
Part of the Lecture Notes in Electrical Engineering book series (LNEE, volume 536)

Abstract

Hearing-impaired people face greater communication barriers than hearing people and experience many inconveniences in everyday life, in hospitals, and in government offices. To address this problem, various finger language translators and sign language translators are being developed. However, most wearable translators have the disadvantage that the accuracy of their output data is low. Thus, we try to solve this problem by using a Leap Motion hand-tracking device together with a smartphone instead of a wearable translator. In particular, we tried to increase the recognition rate of finger language translation by applying a deep learning model based on a multilayer perceptron. Experimental results show that the average recognition rate is approximately 94.9%.
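The paper itself does not include code; the following is a minimal sketch, not the authors' implementation, of the kind of multilayer perceptron classifier the abstract describes: a small network that maps a flattened Leap Motion hand-feature vector to one of a fixed set of finger language signs. The feature length (63), layer widths, and class count (26) are illustrative assumptions, not values taken from the paper.

```python
# Illustrative multilayer perceptron for finger language recognition.
# Assumptions (not from the paper): 21 tracked hand points x (x, y, z)
# per Leap Motion frame, and one output class per fingerspelling sign.
import torch
import torch.nn as nn

NUM_FEATURES = 63   # assumed: flattened per-frame hand feature vector
NUM_CLASSES = 26    # assumed: number of finger language signs

class FingerLanguageMLP(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(NUM_FEATURES, 128),
            nn.ReLU(),
            nn.Linear(128, 64),
            nn.ReLU(),
            nn.Linear(64, NUM_CLASSES),  # logits over the sign classes
        )

    def forward(self, x):
        return self.net(x)

model = FingerLanguageMLP()
frame = torch.randn(1, NUM_FEATURES)    # stand-in for one Leap Motion frame
predicted = model(frame).argmax(dim=1)  # index of the recognized sign
print(predicted.item())
```

In a deployment like the one described, the Leap Motion frames would be captured on the device, featurized, and classified on (or relayed through) the smartphone; the sketch above covers only the classification step.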

Keywords

Portable language translator · Finger language translator · Leap Motion · Deep learning · Smartphone

Notes

Acknowledgments

This work is the result of a study on the “Leader in Industry-University Cooperation +” Project, which is supported by the Korean Ministry of Education.


Copyright information

© Springer Nature Singapore Pte Ltd. 2020

Authors and Affiliations

  • Dahye Jin (1)
  • Yumi Lee (1)
  • Ermal Elbasani (1)
  • Jae Sung Choi (1)
  • Hyun Lee (1) (corresponding author)

  1. Division of Computer Science and Engineering, Sun Moon University, Asan, Republic of Korea
