“Now that you are studying engineering, it is time to do something useful.” You could say she took her mother's words to heart.
The 20-year-old student from India says she spent a year working on an artificial intelligence (AI) model for sign language translation.
As her video shows, when someone signs in front of a webcam, the software recognizes the gesture and translates it into English.
“Standardize Sign Languages”
Priyanjali Gupta shared her creation on LinkedIn at the beginning of the month. Her post now has over 63,000 likes and 1,200 comments.
“I created this model using the TensorFlow Object Detection API. It translates a few American Sign Language signs into English using a specific algorithm,” the student told HuffPost.
“In my opinion, researchers and developers are doing everything they can to find a workable solution. However, I think the first step would be to standardize sign languages and other modes of communication and try to bridge the communication gap,” she added on the site Interesting Engineering.
There are many sign languages around the world. The system proposed by Priyanjali Gupta currently translates American Sign Language into English.
For now, the software handles only a few words: “hello”, “I love you”, “yes”, “no”, “thank you”, and “please”. But the tool is still in development, and the student intends to expand its vocabulary.
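Framed as object detection, each sign is a detection class, and the highest-scoring detection in a webcam frame is mapped to an English word. A minimal sketch of that final lookup step, in Python, using the six words mentioned above (the class IDs, score threshold, and detection format are illustrative assumptions, not details from Gupta's actual model):

```python
# Illustrative sketch: mapping object-detection output to English words.
# The label map mirrors the six signs the article mentions; the detection
# dicts below are made-up stand-ins for a real detector's per-frame output.

LABEL_MAP = {
    1: "hello",
    2: "I love you",
    3: "yes",
    4: "no",
    5: "thank you",
    6: "please",
}
SCORE_THRESHOLD = 0.5  # assumed confidence cut-off


def translate(detections):
    """Return the English word for the best detection above threshold, or None."""
    best = max(detections, key=lambda d: d["score"], default=None)
    if best is None or best["score"] < SCORE_THRESHOLD:
        return None
    return LABEL_MAP.get(best["class_id"])


# Example: fake detections for one frame, as a detector might emit them.
frame_detections = [
    {"class_id": 1, "score": 0.92},  # strong "hello" detection
    {"class_id": 3, "score": 0.40},  # weak "yes" detection, below threshold
]
print(translate(frame_detections))  # → hello
```

In a real pipeline, a trained model (such as one built with the TensorFlow Object Detection API) would produce the class IDs and scores from each webcam frame; the lookup above only shows the last step.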
Faced with the buzz generated by the young woman's video, some have cried foul. According to one Internet user, such a system already exists, and the student simply copied it.