Inspiration
An estimated 300 million people worldwide are deaf, a number projected to grow to 900 million by 2050. Meanwhile, recent international studies report high levels of dissatisfaction among deaf people receiving health services.
What it does
It converts hand gestures from deaf patients into text and extracts useful information to fill in their records in the database, helping doctors save time and analyze their reports efficiently. It also enables doctors to talk directly to deaf patients.
How we built it
We used Figma to design the interface. We tried to build the gesture-recognition pipeline using TensorFlow, OpenCV, and MediaPipe.
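MediaPipe Hands detects 21 landmarks per hand from a webcam frame. As a minimal sketch of how such landmarks could feed a gesture classifier (the function names, thresholds, and synthetic data below are our own illustration, not the project's actual back-end, which was left unfinished):

```python
# Hypothetical sketch: classify a simple gesture from MediaPipe-style hand
# landmarks, i.e. 21 (x, y) points in normalized image coordinates where
# y grows downward. Helper names here are ours, not part of MediaPipe.

FINGER_TIPS = [8, 12, 16, 20]   # index, middle, ring, pinky fingertips
FINGER_PIPS = [6, 10, 14, 18]   # the corresponding PIP joints

def count_extended_fingers(landmarks):
    """Count non-thumb fingers whose tip sits above its PIP joint."""
    return sum(
        1
        for tip, pip in zip(FINGER_TIPS, FINGER_PIPS)
        if landmarks[tip][1] < landmarks[pip][1]  # smaller y = higher up
    )

# Synthetic "open palm": all four fingertips raised above their PIP joints.
open_palm = [(0.5, 0.9)] * 21
for tip, pip in zip(FINGER_TIPS, FINGER_PIPS):
    open_palm[tip] = (0.5, 0.2)
    open_palm[pip] = (0.5, 0.5)

print(count_extended_fingers(open_palm))  # → 4
```

In a full pipeline, OpenCV would supply camera frames, MediaPipe would produce the landmarks, and a TensorFlow model (rather than this hand-written rule) would map landmark sequences to sign-language tokens.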
Challenges we ran into
We were not able to finish coding the back-end, but we are still pitching our idea to raise awareness!
What we learned
We learned about some machine learning algorithms and how the back-end actually works, which was very cool! This project also enhanced our skills in UI/UX design!
Built With
- brain
- figma
- mediapipe
- python
- sleeping-hours
- tears
- tensorflow