HCI Using Gestural Recognition for Symbol-Based Communication Methodologies
Sign language is the primary means of communication for people who cannot hear or speak: through it they can express their emotions and thoughts and convey what they want to say. However, few people outside this community understand sign language, so hearing- and speech-impaired people often need an interpreter to translate their language into a common tongue; this is the main reason sign language recognition has become such a crucial task. Because sign language consists of distinct hand movements and positions, recognition accuracy depends on how accurately the machine can identify each gesture. We propose such a system, which we call a translating HCI for sign language. The user performs sign gestures in front of a webcam, and the system reads the hand gesture in real time and displays the corresponding character of the alphabet on the screen. Using the proposed system, hearing people can understand sign language and communicate easily with hearing-impaired people.
Keywords: Deep learning · Image processing · HCI · Machine learning · Convolutional neural network
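The pipeline described in the abstract (webcam frame → preprocessing → CNN classification → character output) can be sketched as below. This is a minimal illustration, not the authors' implementation: the webcam capture and the trained CNN are stubbed with NumPy (a simulated frame and a linear scorer) so the data flow is clear; a real system would use OpenCV for capture and a trained convolutional network for classification.

```python
# Hedged sketch of a translating-HCI pipeline for sign recognition.
# Assumptions: 64x64 grayscale inputs, one class per alphabet letter,
# a stub linear classifier standing in for the CNN.
import string

import numpy as np

LABELS = list(string.ascii_uppercase)  # one class per letter A..Z


def preprocess(frame: np.ndarray, size: int = 64) -> np.ndarray:
    """Convert an RGB frame to a normalized grayscale square input."""
    gray = frame.mean(axis=2)                     # naive grayscale
    h, w = gray.shape
    ys = np.linspace(0, h - 1, size).astype(int)  # nearest-neighbour resize
    xs = np.linspace(0, w - 1, size).astype(int)
    small = gray[np.ix_(ys, xs)]
    return small / 255.0                          # scale pixels to [0, 1]


def classify(x: np.ndarray, weights: np.ndarray) -> str:
    """Stand-in for the CNN: a linear scorer over flattened pixels."""
    scores = weights @ x.ravel()
    return LABELS[int(np.argmax(scores))]


# Simulated webcam frame (480x640 RGB) and untrained stub "model" weights.
rng = np.random.default_rng(0)
frame = rng.integers(0, 256, size=(480, 640, 3)).astype(np.float64)
weights = rng.normal(size=(len(LABELS), 64 * 64))

letter = classify(preprocess(frame), weights)
print(letter)  # some letter in A..Z (untrained, so effectively arbitrary)
```

In a real deployment the `frame` would come from `cv2.VideoCapture(0).read()` inside a loop, and `classify` would be replaced by a CNN's forward pass, but the preprocess → classify → display structure stays the same.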