Inspiration
Having a hearing impairment can impede many aspects of life. Even basic pleasures like dining out can be ruined by the lack of accessible services in place. We saw an example of a deaf couple in Dublin who were unable to order at a drive-thru. Despite the couple's efforts to write down their order (which had worked in the past), the overwhelmed employee was unwilling to accommodate them, citing busy hours as the reason. They tried to explain multiple times that they were deaf, but the worker made no attempt to communicate with them and spoke with her face mask pulled up. The interaction quickly turned hostile, with swear words thrown around. Companies always claim to train their employees, but the general public's reluctance to learn a new skill, and the stigma surrounding efforts to cater to deaf and hard-of-hearing people, make this impossible to sustain.
What it does
We wanted to create a live sign-translation service that could be fitted into service providers and retail stores. The product works both ways: the customer can sign to the employee and have their words translated into speech, and the employee can speak and have their words signed back visually to the customer.
How we built it
To implement this, we built a machine learning pipeline with three stages:
- Data collection: we captured 100 samples of each letter of the alphabet. The dataset-creation code was adapted from code provided by Felipe (Computer Vision Engineer on YouTube).
- We then trained the model with a Random Forest Classifier, using 80% of the data for training and 20% for testing.
- Finally, we ran the model in real time to translate sentences and output speech using gTTS.
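The training stage above can be sketched as follows. This is a minimal illustration, not our exact code: it assumes each sample is a flattened vector of 21 MediaPipe hand landmarks (x, y) = 42 features, and it substitutes synthetic landmark vectors for the real 100-samples-per-letter dataset so the snippet runs standalone.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Synthetic stand-in for the real dataset: 100 samples per class,
# 42 features per sample (21 hand landmarks x two coordinates).
rng = np.random.default_rng(42)
n_classes = 24            # static ASL letters (motion letters excluded)
samples_per_class = 100
X = np.vstack([
    rng.normal(loc=i, scale=0.3, size=(samples_per_class, 42))
    for i in range(n_classes)
])
y = np.repeat(np.arange(n_classes), samples_per_class)

# 80/20 train/test split, stratified so every letter appears in both sets.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42)

clf = RandomForestClassifier(random_state=42)
clf.fit(X_train, y_train)
acc = accuracy_score(y_test, clf.predict(X_test))
print(f"test accuracy: {acc:.2f}")
```

At inference time, the same 42-feature vector would be extracted from each webcam frame and fed to `clf.predict`, with the predicted letters accumulated into a sentence and handed to gTTS for speech output.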
Challenges we ran into
We couldn't translate full ASL words or any letter that requires motion (like Z); the model only handles static signs, i.e. individual letters of the alphabet. In the future we want to expand this into a dictionary of words rather than spelling everything out letter by letter. We also chose ASL because its signs are a bit simpler, but we would of course want to develop for BSL as well.
Accomplishments that we're proud of
We're proud of our efforts to set a new standard for accessibility services and to challenge people's viewpoints on the matter. We're trying to normalise the social model of disability and the use of sign language like any other language.
What we learned
We learnt a lot about the hardships that deaf and hard-of-hearing people go through. On the technical side, we learnt a lot about image classification and got to brush up our machine learning skills.
What's next for Rise 'n' Sign
Address our shortcomings, such as adding video-based recognition and supporting all types of sign language. And of course, make this a viable product for people to use! <3
Built With
- cv2
- gtts
- mediapipe
- python
- scikit-learn