Abstract
A commercial product for sign language translation is still not available. This paper presents our latest results towards this goal through a functional prototype called Talking Hands. Talking Hands uses a data glove to detect the hand movements of the user and a smartphone application that gathers the sensor data and translates it into speech via a speech synthesizer. Talking Hands adopts the solutions best suited to mass production without compromising reliability. This paper presents the improvements of the latest prototype in terms of hardware, software and design, together with a preliminary analysis of the translation of dynamic gestures with this device.
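As a rough illustration of the pipeline the abstract describes (glove sensors → smartphone application → speech synthesizer), the Python sketch below pairs a nearest-template gesture classifier with an off-the-shelf text-to-speech engine. It is a minimal sketch, not the authors' implementation: the sensor layout, the template values, and names such as GESTURE_TEMPLATES and classify_static_gesture are all hypothetical, and pyttsx3 merely stands in for whatever synthesizer the smartphone application actually uses.

```python
# Illustrative sketch only: glove frame -> static-gesture match -> speech.
# All names and values are hypothetical; the paper does not disclose the
# actual classifier or firmware of Talking Hands.
from typing import Optional

import numpy as np
import pyttsx3  # generic offline text-to-speech engine

# Hypothetical calibration templates: one mean sensor vector per static
# gesture (e.g., 5 flex-sensor readings + 4 IMU quaternion components).
GESTURE_TEMPLATES = {
    "hello": np.array([0.1, 0.9, 0.9, 0.9, 0.9, 1.0, 0.0, 0.0, 0.0]),
    "thanks": np.array([0.8, 0.8, 0.8, 0.8, 0.8, 0.7, 0.1, 0.1, 0.7]),
}


def classify_static_gesture(frame: np.ndarray,
                            threshold: float = 0.5) -> Optional[str]:
    """Nearest-template matching; returns None if nothing is close enough."""
    best_label, best_dist = None, float("inf")
    for label, template in GESTURE_TEMPLATES.items():
        dist = float(np.linalg.norm(frame - template))
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label if best_dist < threshold else None


def speak(text: str) -> None:
    """Voice the recognized word through the speech synthesizer."""
    engine = pyttsx3.init()
    engine.say(text)
    engine.runAndWait()


if __name__ == "__main__":
    # A fake frame standing in for one sensor packet streamed by the glove.
    frame = np.array([0.12, 0.88, 0.91, 0.90, 0.92, 1.0, 0.02, 0.01, 0.0])
    word = classify_static_gesture(frame)
    if word is not None:
        speak(word)
```

Note that a template matcher of this kind only covers static postures; translating dynamic gestures, as discussed in the paper, additionally requires modeling the temporal trajectory of the sensor readings.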
Acknowledgments
This work is supported by Limix S.r.l. (www.limix.it), an Italian start-up and spin-off of the University of Camerino. The intellectual property of Talking Hands and of its parts (hardware, software, design) belongs to Limix S.r.l.
Copyright information
© 2020 Springer Nature Switzerland AG
Cite this paper
Pezzuoli, F., Corona, D., Corradini, M.L. (2020). Improvements in a Wearable Device for Sign Language Translation. In: Ahram, T. (ed.) Advances in Human Factors in Wearable Technologies and Game Design. AHFE 2019. Advances in Intelligent Systems and Computing, vol. 973. Springer, Cham. https://doi.org/10.1007/978-3-030-20476-1_9
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-20475-4
Online ISBN: 978-3-030-20476-1