Inspiration
To make ASL more accessible!
What it does
It currently recognizes the fingerspelled letters of the English alphabet.
How we built it
We collected data from the Leap Motion and processed it so that the absolute position of the hand doesn't affect the result. We then tested different machine learning models, such as SVMs and neural networks, to evaluate their performance. In the end we chose the SVM, as it gave us the best balance of performance and accuracy.
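A minimal sketch of that pipeline, assuming a feature layout of palm-relative fingertip positions (the real features came from the Leap Motion SDK; the data here is synthetic and the function names are our own illustration):

```python
# Hypothetical sketch: position-invariant features + SVM, on synthetic data.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def normalize(palm, fingertips):
    """Express fingertip positions relative to the palm, so the hand's
    absolute position in the sensor's field doesn't affect the features."""
    return (fingertips - palm).ravel()

# Synthetic stand-in data: 200 frames, 5 fingertips x 3 coords, 3 letter classes.
n_frames, n_classes = 200, 3
palms = rng.normal(0.0, 50.0, size=(n_frames, 3))        # hand wanders around
shapes = rng.normal(0.0, 1.0, size=(n_classes, 5, 3))    # one hand shape per letter
y = rng.integers(0, n_classes, size=n_frames)
X = np.array([
    normalize(palms[i], palms[i] + shapes[y[i]] + rng.normal(0.0, 0.1, (5, 3)))
    for i in range(n_frames)
])

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = SVC(kernel="rbf").fit(X_train, y_train)
print(clf.score(X_test, y_test))  # high accuracy on this easy synthetic data
```

Because `normalize` subtracts the palm position, translating the whole hand leaves the feature vector unchanged, which is the invariance described above.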
Challenges we ran into
The Leap Motion is still quite buggy: it takes a lot of training data to get even one letter recognized reliably.
Accomplishments that we're proud of
Getting it to recognize whatever it does at the moment!
What we learned
Online demos of these devices often send the wrong message: they make it look like you can accomplish anything, when in practice you may not be able to.
What's next for Leap Interpreter
We will be improving it to recognize gestures, and we may incorporate other devices, such as the Myo armband, to make it more robust.