We noticed that no ASL app translates the language in real time; the apps that exist are dictionaries and learning tools. So we set out to design the interface for an app that pairs real-time ASL translation with AI-driven emotion and tone detection, using personalized data to improve the accuracy of each user's tone.

ExpressHands captures ASL gestures in real time, detects the user's emotional tone, and translates the signing into text that reflects the signer's actual feelings and communication style. The result is a more personal and authentic translation experience.
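The flow described above could be sketched roughly as below. This is a minimal, hypothetical illustration, not our actual implementation: all names (`recognize_gesture`, `detect_tone`, `Translation`) are invented for this sketch, and the stub dictionaries stand in for real gesture- and emotion-recognition models.

```python
from dataclasses import dataclass

@dataclass
class Translation:
    text: str   # translated English text
    tone: str   # detected emotional tone, e.g. "joyful"

def recognize_gesture(frame: str) -> str:
    # Stub: a real model would map video frames to ASL glosses.
    glosses = {"frame_hello": "HELLO", "frame_happy": "HAPPY"}
    return glosses.get(frame, "UNKNOWN")

def detect_tone(frames: list[str]) -> str:
    # Stub: a real model would read facial expression and signing dynamics.
    return "joyful" if "frame_happy" in frames else "neutral"

def translate(frames: list[str]) -> Translation:
    """Combine gesture recognition and tone detection into tone-tagged text."""
    glosses = [recognize_gesture(f) for f in frames]
    tone = detect_tone(frames)
    return Translation(text=" ".join(glosses).capitalize(), tone=tone)
```

The key idea the sketch shows is that tone detection runs alongside translation, so every piece of output text carries an emotional tag rather than being flattened to neutral prose.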

We designed a mobile interface that simulates real-time video and voice capture along with emotional tagging. The UI prioritizes accessibility, emotional richness, and a seamless user experience. We also outlined how AI would learn each user's signing style over time.
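One way the over-time personalization could work is to keep a running profile of each user's tone distribution and weight recent sessions more heavily. The sketch below is a hypothetical illustration of that idea (the `ToneProfile` class and its exponential-moving-average scheme are assumptions for this example, not part of the app we outlined).

```python
from collections import defaultdict

class ToneProfile:
    """Hypothetical sketch: track a user's baseline tone distribution
    over time so translations can be calibrated to their style."""

    def __init__(self, smoothing: float = 0.1):
        self.smoothing = smoothing
        self.weights: dict[str, float] = defaultdict(float)

    def update(self, tone: str) -> None:
        # Exponential moving average: recent sessions count more
        # than older ones, so the profile adapts as the user changes.
        for t in self.weights:
            self.weights[t] *= (1 - self.smoothing)
        self.weights[tone] += self.smoothing

    def dominant_tone(self) -> str:
        # Fall back to "neutral" before any data has been collected.
        if not self.weights:
            return "neutral"
        return max(self.weights, key=self.weights.get)
```

A profile like this could bias ambiguous readings toward the tones a particular user actually tends to express, which is the personalization step that improves accuracy over time.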

ExpressHands reimagines accessibility by recognizing the depth behind every conversation. One of the biggest challenges we faced was implementing AI features in the app. We are proud of building an interface that goes beyond simply translating words to capture human connection through emotion and tone. We learned that ASL is more than hand signs: facial expressions, emotion, and the individual signer all deeply shape meaning.
