A Class 12 student has developed SignFlow, an AI tool that enables two-way communication between sign language and text
RAMANDEEP KAUR | NT KURIOCITY
In 2023, a brief exchange outside his home left Harsh Marathe with a question he could not shake off.
“I met a deaf person trying to communicate using sign language, but no one could understand him, including me,” says the 16-year-old. “He seemed anxious, and I didn’t know how to respond.”
Then a Class 9 student preparing for the Western India Science Fair, he was still searching for a project idea when the encounter set his course. It also made him consider what daily life is like when communication is difficult. He looked for tools to translate between sign language and speech, but found none that worked well for everyday use. He decided to build one and called it SignFlow.
The first version, developed in 2023, was a basic HTML-based website. It converted typed text into sign language through pre-recorded videos and used a simple gesture recognition model, trained with Google Teachable Machine, to translate signs into text. “It was basic, but it showed that the idea could work,” says Marathe, who began coding at the age of nine before moving on to AI, machine learning and accessibility projects.
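Teachable Machine exports image classifiers in standard formats such as Keras, so a first version like this can be wired together with very little code. As a rough illustration only, not Marathe's actual implementation, a webcam loop around such an export might look like the following Python sketch (the file names `keras_model.h5` and `labels.txt` are the exporter's defaults; the rest is assumed):

```python
import cv2
import numpy as np
from tensorflow.keras.models import load_model

# Teachable Machine's Keras export ships a model file plus a label list.
model = load_model("keras_model.h5", compile=False)
labels = [line.strip().split(maxsplit=1)[-1] for line in open("labels.txt")]

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # The exported model expects 224x224 RGB input scaled to [-1, 1].
    img = cv2.cvtColor(cv2.resize(frame, (224, 224)), cv2.COLOR_BGR2RGB)
    batch = (img.astype(np.float32) / 127.5) - 1.0
    probs = model.predict(batch[np.newaxis, ...], verbose=0)[0]
    sign = labels[int(np.argmax(probs))]
    cv2.putText(frame, sign, (10, 30),
                cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 255, 0), 2)
    cv2.imshow("gesture classifier (sketch)", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```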
That version placed fourth at the state level of the Western India Science Fair and qualified for the national round, though the Goa team could not participate further that year.
Over the next two years, he rebuilt SignFlow into a more capable system. By 2025, he had developed a version in Python, using MediaPipe and OpenCV to detect hand landmarks and track finger movements. “The system matches these inputs with gesture patterns to interpret sign language, with machine learning improving accuracy,” he explains.
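SignFlow's source is not published, but the pipeline he describes is a common one: MediaPipe returns 21 (x, y, z) landmarks per detected hand, with OpenCV supplying the camera frames. A minimal sketch under those assumptions, where the hypothetical `classify_gesture` stands in for his pattern-matching step:

```python
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands
mp_draw = mp.solutions.drawing_utils

def classify_gesture(landmarks):
    """Hypothetical stand-in for SignFlow's matching step: compare
    the 21 (x, y, z) landmark coordinates against stored gesture
    templates and return the closest label."""
    return "?"

cap = cv2.VideoCapture(0)
with mp_hands.Hands(max_num_hands=2, min_detection_confidence=0.7) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB; OpenCV captures BGR.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            for hand in results.multi_hand_landmarks:
                mp_draw.draw_landmarks(frame, hand, mp_hands.HAND_CONNECTIONS)
                coords = [(lm.x, lm.y, lm.z) for lm in hand.landmark]
                cv2.putText(frame, classify_gesture(coords), (10, 30),
                            cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 255, 0), 2)
        cv2.imshow("hand landmarks (sketch)", frame)
        if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
            break
cap.release()
cv2.destroyAllWindows()
```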
He also worked on the reverse process. Using Blender, he designed a 3D character, and with Unity he built an application that generates sign language animations from typed text in real time. The tool now works both ways, converting sign language into text and text into sign language.
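The Unity application itself would be written in C#, but the underlying text-to-sign logic is engine-agnostic: tokenise the typed sentence, look up a pre-built animation clip for each word, and fall back to fingerspelling when no clip exists. A hypothetical Python sketch of that mapping (the `ISL_CLIPS` table and clip names are invented for illustration):

```python
# Hypothetical lookup from words to pre-built animation clip names.
ISL_CLIPS = {"hello": "clip_hello", "thank": "clip_thank", "you": "clip_you"}

def text_to_clip_sequence(text: str) -> list[str]:
    """Turn typed text into an ordered list of animation clips.
    Unknown words fall back to letter-by-letter fingerspelling,
    the usual degradation path for word-level sign systems."""
    clips = []
    for word in text.lower().split():
        word = word.strip(".,!?")
        if word in ISL_CLIPS:
            clips.append(ISL_CLIPS[word])
        else:
            clips.extend(f"clip_letter_{ch}" for ch in word if ch.isalpha())
    return clips

print(text_to_clip_sequence("Hello, thank you!"))
# ['clip_hello', 'clip_thank', 'clip_you']
```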
In 2024, he visited Sanjay School with his teammate Aatmik Pilgaonkar, supported by his teacher Nishad Hatle, to interact with deaf and mute students. “We used the application, and it translated our words into sign language animations. That helped the students understand us.”
At that stage, the tool handled only individual words and short sentences. Students and teachers pointed out errors in some signs and guided him to more accurate Indian Sign Language resources. He used this to refine the system.
The improved version returned to the Western India Science Fair in 2025, securing third place at the state level and going on to win first place at the zonal round, where it was judged the best project across Western India. He also led a team that secured fourth place in Asia at the Made to Move Communities challenge organised by Otis Elevator Company.
Now in Class 12 at Dr K B Hedgewar High School, he credits his teachers, including Saishwar Asolkar, for their support. He sees SignFlow being used in hospitals, government offices, schools and customer service settings. He says, “It doesn’t require any special hardware, just a camera and a device. I want to keep it free to use.”
“Building a large dataset for Indian Sign Language and handling variations in gestures are major challenges,” he says. He is also working to improve real-time performance on mobile devices.
SignFlow currently runs on a computer using a standard camera, converting trained gestures into text and generating sign language animations from typed input. Its vocabulary is still evolving, and performance can vary with lighting, camera quality and individual signing styles.
Ahead of further user testing, he is focusing on improving accuracy, expanding gesture recognition and adding support for more languages, and he is considering patenting the system.