[Gallery: code for camera input/output of movement, hand-movement tracking through the camera, camera permissions, and SQLite database management; screenshots of database management; logo base]
Inspiration: We've always noticed a social barrier between those who are mute and the rest of society. We have all experienced moments when we wanted to communicate with someone who was unable to speak. Unfortunately, not everybody knows sign language, and that divides people with mutism from everyone else. We set out to break down these barriers.
What it does: Our application translates sign language into either text or speech. The camera records sign language live, and the app processes the video frame by frame so it can translate in near real time. The app also offers a tutorial for those who would like to learn ASL (American Sign Language).
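A minimal sketch of what that frame-by-frame loop can look like using CameraX's ImageAnalysis use case (CameraX and the `onFrame` callback are illustrative assumptions here, not a reproduction of our actual camera code):

```kotlin
import androidx.camera.core.ImageAnalysis
import androidx.camera.core.ImageProxy
import java.util.concurrent.Executors

// Build an analysis use case that delivers one camera frame at a time and
// drops stale frames so translation keeps pace with the live feed.
fun buildFrameAnalyzer(onFrame: (ImageProxy) -> Unit): ImageAnalysis {
    val analysis = ImageAnalysis.Builder()
        .setBackpressureStrategy(ImageAnalysis.STRATEGY_KEEP_ONLY_LATEST)
        .build()
    analysis.setAnalyzer(Executors.newSingleThreadExecutor()) { frame ->
        onFrame(frame) // e.g. feed the frame to the hand-tracking model
        frame.close()  // each frame must be closed before the next is delivered
    }
    return analysis
}
```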
How we built it: We used Android Studio to create the application. We added Google's pre-trained TensorFlow Lite models, which are trained to track hand movements and return position data for each joint in the hand.
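As a rough sketch of that inference step (the model file name and the 21-landmark output shape below are assumptions modeled on Google's MediaPipe hand models, not our exact integration):

```kotlin
import android.content.Context
import org.tensorflow.lite.Interpreter
import org.tensorflow.lite.support.common.FileUtil
import java.nio.ByteBuffer

// Runs a pre-trained hand-landmark model on an already-preprocessed frame.
class HandTracker(context: Context) {
    private val interpreter =
        Interpreter(FileUtil.loadMappedFile(context, "hand_landmark.tflite"))

    // input: a camera frame resized and normalized into the buffer layout the
    // model expects; the output shape must match the actual model's output.
    fun landmarks(input: ByteBuffer): Array<Array<FloatArray>> {
        val output = Array(1) { Array(21) { FloatArray(3) } } // 21 joints, (x, y, z)
        interpreter.run(input, output)
        return output
    }
}
```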
Challenges we ran into: Our biggest challenge was integrating the TensorFlow Lite models into our existing application, which proved more difficult than we had expected. We were able to use SQLite to store images in a database, but we had trouble retrieving the values from it.
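For reference, here is a sketch of both halves of that round trip, storing a bitmap as a BLOB and decoding it back out (the table and column names are illustrative, not our actual schema):

```kotlin
import android.content.ContentValues
import android.content.Context
import android.database.sqlite.SQLiteDatabase
import android.database.sqlite.SQLiteOpenHelper
import android.graphics.Bitmap
import android.graphics.BitmapFactory
import java.io.ByteArrayOutputStream

// Stores sign images as SQLite BLOBs and reads them back as Bitmaps.
class SignDbHelper(context: Context) :
    SQLiteOpenHelper(context, "signs.db", null, 1) {

    override fun onCreate(db: SQLiteDatabase) {
        db.execSQL("CREATE TABLE signs(label TEXT PRIMARY KEY, image BLOB)")
    }

    override fun onUpgrade(db: SQLiteDatabase, oldVersion: Int, newVersion: Int) {}

    // Compress the bitmap to bytes and insert it as a BLOB.
    fun insert(label: String, bitmap: Bitmap) {
        val bytes = ByteArrayOutputStream().also {
            bitmap.compress(Bitmap.CompressFormat.PNG, 100, it)
        }.toByteArray()
        writableDatabase.insert("signs", null, ContentValues().apply {
            put("label", label)
            put("image", bytes)
        })
    }

    // Retrieval: read the BLOB column back and decode it into a Bitmap.
    fun load(label: String): Bitmap? =
        readableDatabase.query(
            "signs", arrayOf("image"), "label = ?", arrayOf(label),
            null, null, null
        ).use { cursor ->
            if (cursor.moveToFirst()) {
                val bytes = cursor.getBlob(0)
                BitmapFactory.decodeByteArray(bytes, 0, bytes.size)
            } else null
        }
}
```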
Accomplishments that we're proud of: We discovered new technologies that we didn't know existed, improved our skills in Android application development, and gained valuable experience. We set high expectations for ourselves and created the concept for a life-changing app that we will continue to develop beyond the hackathon.
What we learned: We learned how to create mobile databases with SQLite that store images as a data type. We learned that Google, along with other companies, offers free access to technologies that can power many benevolent, real-world applications with lasting effects. And we learned that human capability is not limited by physical disability; there are ways for everyone to thrive and have equal opportunities.
What's next for Synage: Continued development toward a deployable application, an iOS version written in Swift, and more research into hand tracking for other uses.