Abstract:
The deaf and dumb population across the Indian subcontinent predominantly communicates with each other and with other people using the Indian Sign Language (ISL). ISL employs signs consisting of hand gestures, facial expressions, and body postures to convey the intended message and emotion, and it is a full-fledged natural language with its own grammar and lexicon. To lessen the significant communication gap between the hearing- and speech-impaired community and the rest of the population, translation systems are needed. We propose an end-to-end human interface framework that recognizes and interprets spoken language and then acts out the corresponding ISL gestures, enabling a convenient, real-time form of conversation between the disabled community and the rest of the population. We used the Microsoft Xbox Kinect 360's depth-sensing and motion-capture capabilities to record motion data for the different ISL gestures, used Unity3D to set up the animations, and finally bundled everything into an Android application.
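As a rough illustration of the speech-to-gesture pipeline described in the abstract, the Kotlin sketch below (not taken from the paper; the activity name, GameObject name, and handler method are hypothetical) shows one way an Android front end could capture spoken input with the platform speech recognizer and forward each recognized word to an embedded Unity scene, where the corresponding Kinect-captured ISL animation clip would be played.

// Hypothetical sketch of the Android side of a speech-to-ISL pipeline.
// Class, GameObject, and method names are placeholders, not from the paper.
import android.app.Activity
import android.content.Intent
import android.os.Bundle
import android.speech.RecognizerIntent
import com.unity3d.player.UnityPlayer

class SpeechToSignActivity : Activity() {

    private val speechRequestCode = 100

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        // Launch the built-in Android speech recognizer for free-form input.
        val intent = Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH).apply {
            putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
                     RecognizerIntent.LANGUAGE_MODEL_FREE_FORM)
            putExtra(RecognizerIntent.EXTRA_LANGUAGE, "en-IN")
        }
        startActivityForResult(intent, speechRequestCode)
    }

    override fun onActivityResult(requestCode: Int, resultCode: Int, data: Intent?) {
        super.onActivityResult(requestCode, resultCode, data)
        if (requestCode != speechRequestCode || data == null) return
        // Take the top recognition hypothesis and send each word to Unity,
        // which would map it to the matching ISL gesture animation.
        val spoken = data.getStringArrayListExtra(RecognizerIntent.EXTRA_RESULTS)
            ?.firstOrNull() ?: return
        spoken.lowercase().split(" ").forEach { word ->
            // "SignAvatar" and "PlayGesture" are assumed names for the Unity
            // GameObject and the message handler that triggers the clip.
            UnityPlayer.UnitySendMessage("SignAvatar", "PlayGesture", word)
        }
    }
}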
Published in: 2021 International Conference on Computing, Communication, and Intelligent Systems (ICCCIS)
Date of Conference: 19-20 February 2021
Date Added to IEEE Xplore: 12 April 2021