Inspiration: When we were discussing project ideas, we talked about applications of the Myo armband and were inspired to combine the armband with sign language, using recognition of each sign to "translate" ASL into written text. On the day of the hackathon, we decided to switch to the Leap Motion, since the armband would limit us to a single arm, and many signs require both hands or even full arm motions.
How it works: In progress.
Challenges I ran into: Leap Motion makes it extremely difficult to define the custom gestures we would need in order to build our database of signs. We spent 2-3 hours searching for a way to do this with no result.
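Since the SDK's built-in gesture recognition doesn't cover custom signs, one workaround we considered is representing each sign ourselves as a feature vector built from the raw hand tracking data. The sketch below is only an illustration of that idea; the input layout (a palm position and five fingertip positions as [x, y, z] triples) is an assumption about what we would pull out of a Leap Motion tracking frame, not the SDK's actual API.

```python
# Rough sketch: turn one hand snapshot into a "custom gesture" representation.
# The palm/fingertip layout is a hypothetical stand-in for Leap tracking data.

def hand_to_feature_vector(palm, fingertips):
    """Turn one hand snapshot into a palm-relative, scale-normalized vector."""
    # Express each fingertip relative to the palm so the sign doesn't
    # depend on where the hand happens to sit above the controller.
    relative = [[tip[i] - palm[i] for i in range(3)] for tip in fingertips]

    # Normalize by hand size (distance from palm to the farthest fingertip)
    # so the same sign made by different-sized hands looks similar.
    scale = max(sum(c * c for c in tip) ** 0.5 for tip in relative) or 1.0
    return [c / scale for tip in relative for c in tip]


# Hypothetical example: one palm position and five made-up fingertip positions.
example = hand_to_feature_vector(
    [0.0, 200.0, 0.0],
    [[-40, 230, 10], [-15, 260, 5], [0, 270, 0], [15, 260, 5], [35, 240, 10]],
)
print(len(example))  # 15 numbers: 5 fingertips x 3 coordinates
```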
Accomplishments that I'm proud of: Mapping hand movements with the Leap Motion (still in progress).
What I learned: I learned about Leap Motion.
What's next for Sign Translator: If we can find a way to build a collection of signs – the alphabet and certain common words, for example – and then run any input sign through filters to determine which of these it matches, we can identify signs and effectively "translate" them. If this could be achieved, there could be bigger and better applications – after all, what good is Voice Control on a smart TV if you can't speak? It could serve as an accessibility feature for people who rely on sign language to communicate.
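As a first pass at the "filters" idea, matching an input sign against the closest stored template might look something like the sketch below. The feature vectors, the recorded templates ("A" and "B"), and the threshold are made-up placeholders, not real data; a real collection would be recorded sign by sign with the Leap Motion and fed through something like the feature-vector step above.

```python
# Minimal sketch of matching an incoming sign against a recorded collection.
# Templates and threshold are placeholders, not real recorded signs.
import math

SIGN_DATABASE = {
    # sign label -> list of recorded feature vectors for that sign
    "A": [[0.1, 0.9, 0.2, 0.8, 0.1, 0.7]],
    "B": [[0.9, 0.1, 0.8, 0.2, 0.7, 0.1]],
}

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify(features, threshold=0.5):
    """Return the closest known sign, or None if nothing is close enough."""
    best_label, best_dist = None, float("inf")
    for label, templates in SIGN_DATABASE.items():
        for template in templates:
            d = distance(features, template)
            if d < best_dist:
                best_label, best_dist = label, d
    return best_label if best_dist <= threshold else None

# Example: an input that sits close to the recorded "A" template.
print(classify([0.12, 0.88, 0.22, 0.79, 0.1, 0.71]))  # -> "A"
```

A simple nearest-neighbour lookup like this would probably be enough for a small alphabet; with more signs or motion-based signs, it would need per-frame sequences and a smarter comparison.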