We are big fans of CHAGEE, and at NUS there is actually a CHAGEE signing store, where the staff are deaf.
But when we went there, we realised that the interactions were actually minimal. Most of the time we ordered through the CHAGEE app, our number was called on the app, and we barely interacted with the staff at all.
If they needed to communicate with us, they had to whip out their phones and type to us. It works, but it doesn’t feel very natural, and honestly it doesn’t feel very inclusive either.
So that got us thinking — if this is a signing store, shouldn’t communication feel more natural?
That’s why we came up with the idea of a sign language support app.
The idea is that you can sign to a camera, and the system will detect the signed words and then help generate sentence suggestions so communication becomes easier.
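The suggestion step can be sketched roughly like this. The function name and prompt wording below are illustrative assumptions, not the app's exact code: detected sign keywords are packed into a prompt, and a language model returns a few natural-sentence options for the signer to pick from.

```python
def build_suggestion_prompt(keywords, num_suggestions=3):
    """Turn detected sign keywords into a prompt asking for full sentences."""
    joined = ", ".join(keywords)
    return (
        f"A deaf customer signed these keywords: {joined}. "
        f"Suggest {num_suggestions} short, natural English sentences "
        "they might be trying to say, one per line."
    )

prompt = build_suggestion_prompt(["want", "tea", "less", "sugar"])

# The prompt would then be sent to a model, e.g. via the OpenAI client
# (sketched here, not executed):
#
#   from openai import OpenAI
#   client = OpenAI()
#   resp = client.chat.completions.create(
#       model="gpt-4o-mini",  # model choice is an assumption
#       messages=[{"role": "user", "content": prompt}],
#   )
#   suggestions = resp.choices[0].message.content.splitlines()
```

Keeping the prompt construction separate from the API call makes the keyword-to-suggestion step easy to tweak and test on its own.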
Now, one might ask — why don’t we just do full sign-to-text translation?
The problem is that sign language is actually very complex. It has its own grammar; meaning depends heavily on facial expression, speed, and movement; and there are many different sign languages and regional variations. Because of this, high-accuracy, real-time translation is very hard to achieve right now, and we think that accuracy and speed are exactly what make an interaction feel natural. This is still an open research problem.
So instead of forcing full translation, this is our workaround.
How we built it
For computer vision, we used OpenCV and Google MediaPipe to process the camera feed and extract the hand and pose landmarks. For the backend server, we used Flask to handle the pipeline and send results through an API. For the frontend, we used React to display the detected words and sentence suggestions in a simple UI. And for the sentence suggestions, we used the OpenAI API to generate natural sentence options based on the detected keywords.
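As a rough illustration of the landmark step: MediaPipe Hands returns 21 landmarks per detected hand, and before classifying them into sign keywords it helps to normalise them so the features do not depend on where the hand sits in the frame or how close it is to the camera. The sketch below is a minimal, self-contained version of that idea (the function is hypothetical, not our production code; landmark index 0 being the wrist matches the MediaPipe Hands convention).

```python
import math

WRIST = 0  # MediaPipe Hands indexes the wrist landmark as 0


def normalize_landmarks(landmarks):
    """Make hand landmarks translation- and scale-invariant.

    `landmarks` is a list of 21 (x, y) pairs in normalised image
    coordinates, as MediaPipe Hands would provide. We shift so the
    wrist is the origin, then divide by the largest distance from
    the wrist so the hand always fits in a unit circle.
    """
    wx, wy = landmarks[WRIST]
    shifted = [(x - wx, y - wy) for x, y in landmarks]
    scale = max(math.hypot(x, y) for x, y in shifted) or 1.0
    return [(x / scale, y / scale) for x, y in shifted]
```

In the real pipeline, frames from `cv2.VideoCapture` would be fed to MediaPipe, and the normalised landmark vectors would go to the sign-keyword classifier before the results are sent out over the Flask API.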
What we’re trying to achieve with this is basically convenience.
You can sign directly to your phone camera or a webcam, there’s no special setup, everything runs in real time, and it helps deaf users have a more natural conversation with someone who doesn’t know sign language.
And ultimately, we hope this can help better integrate the deaf community into our society, by making everyday interactions more natural, respectful, and inclusive.