Indian Student Creates AI Model to Translate American Sign Language

Education International

A fourth-year computer science student in India has created an AI model that can translate American Sign Language (ASL) into English. Priyanjali Gupta, who is studying at the Vellore Institute of Technology, was inspired to create the model after her mother challenged her to use her engineering degree to help others.

Gupta’s model uses the TensorFlow Object Detection API to translate a few basic ASL signs, including “hello,” “I love you,” “thank you,” “please,” “yes,” and “no.” She created the model by manually collecting images of herself signing these words, then applying transfer learning from a pre-trained model created by Nicholas Renotte.
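For readers curious what the data-preparation side of such a workflow looks like, here is a minimal sketch of one standard step: generating a label map for the six signs. The pbtxt format shown is the one the TensorFlow Object Detection API expects; the sign list comes from the article, but everything else (function names, layout) is illustrative and not Gupta's actual code.

```python
# Sketch: building a pbtxt-style label map for the six signs the model recognizes.
# The TensorFlow Object Detection API reads a file in this format to map
# numeric class ids to human-readable labels; names here are illustrative.

SIGNS = ["hello", "I love you", "thank you", "please", "yes", "no"]

def make_label_map(labels):
    """Return a pbtxt-style label map string, with ids starting at 1."""
    entries = []
    for i, name in enumerate(labels, start=1):
        entries.append(
            "item {\n"
            f"  name: '{name}'\n"
            f"  id: {i}\n"
            "}\n"
        )
    return "\n".join(entries)

if __name__ == "__main__":
    print(make_label_map(SIGNS))
```

In the usual transfer-learning setup, a detector pre-trained on a large generic dataset is fine-tuned on a small custom dataset like Gupta's self-collected images, with this label map telling the model which classes to predict.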

Renotte, an Australian accountant turned data scientist and AI specialist, has spent several years developing AI models that translate sign language. His models have achieved up to 99% accuracy on certain signs.

Gupta and Renotte believe that their work has the potential to improve accessibility for deaf and hard-of-hearing people. “This is only scratching the surface of what’s possible,” Renotte said. “There is so much more that’s possible with these technologies. That is why I’m so passionate about it.”

Gupta’s model is still in the early stages of development, but she hopes to eventually make it available to the public. She believes that it could be a valuable tool for people who are learning ASL, as well as for deaf and hard-of-hearing people who want to communicate with hearing people.

“I think it’s really important for people to be able to communicate with each other,” Gupta said. “And I think this model can help to break down some of the barriers that exist between hearing and deaf people.”

