Inspiration
In the previous Hack the Bias, our team worked on a healthcare-related project and found the bias against disabled individuals compelling. Since we did not pursue it last year, we chose this year to focus on the systemic bias rooted in ableism. Given the theme of noise and silence, we immediately thought of the challenges faced by the Deaf community. Language is used by every individual, every day, yet sign language is known by less than 1% of the world’s population. We saw this as an issue of communication bias and audism, rooted in education systems and in the treatment of sign language as a mere crutch for the Deaf. Gamification already offers an engaging way to learn languages through apps like Duolingo, but these systems favour spoken languages and offer no way to teach a visual language like American Sign Language (ASL).
The systemic bias we aim to address with our project is audism: discrimination against individuals who are deaf or hard of hearing. There are over 12 million deaf or severely hard-of-hearing individuals in the US and Canada, and over 90% of deaf children are born to hearing parents who do not know sign language. Less than 70% of parents can hold a conversation with their deaf child in ASL, and less than 20% of deaf people can actually use ASL as a form of communication.
Many parents feel the need to prioritize hearing therapy or cochlear implants for their deaf children over their language education.
This contributes to language deprivation syndrome, in which children suffer permanent cognitive delays from a lack of language access during their formative years; 70% of deaf children experience it, leaving them behind in their education. Deaf individuals also face massive barriers in health care, and over 50% of deaf people face unemployment because of the communication barrier.
While hearing therapy may help a child’s hearing, it is just as important to prioritize their language education. It is understandable for parents to want their child to hear normally; however, prioritizing “fixing” their child over communicating with them without barriers reinforces the societal belief that the ability to hear makes one superior, resulting in a world designed exclusively for the hearing.
This bias leaves millions of people isolated and silenced amid all the noise, even within their own families, and creates dangerous communication gaps in critical services like emergency response and medicine.
What it does
SignEd aims to create a friendly environment for individuals to learn and practice sign language. Using an algorithm that recognizes hand gestures and translates them into written text, SignEd enables easy ASL learning and acts as a live translator during video calls. SignEd also offers personal lessons where learners can study new words and practice their signing accuracy on their own; our algorithm identifies which word a learner is trying to sign. These lessons range from the basics of the alphabet to ASL grammar.
How we built it
SignEd was designed by our team, with AI tools generating much of the code. We modelled our idea on the basic design of Duolingo and Khan Academy: their gamified systems make learning enjoyable, and we followed the same approach. We designed a basic framework for the curriculum and user experience, and we trained the recognition algorithm ourselves. We took time to learn a few ASL signs, identified the hardest parts of learning them, and built solutions to those problems into our project. AI tools helped us implement our ideas quickly and effectively.
Challenges we ran into
One of the challenges we ran into was with our sign identification algorithm. It classifies signs one frame at a time, which means it cannot handle signs that involve movement; letters like J and Z, which are signed with finger motion, cannot be recognized. Given the time constraints of a demo project, we chose to focus on other features and support only static signs for now.
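One practical wrinkle of per-frame classification is noise: individual frames can flicker between labels. A hedged sketch of one way to cope (our own illustration, not necessarily what SignEd does) is to emit a letter only once it wins a majority over a sliding window of recent frames. Note this still cannot recover motion signs like J and Z, whose meaning lies in the trajectory across frames; those would need a sequence model over the window rather than a vote.

```python
from collections import deque, Counter

class SignSmoother:
    """Stabilize noisy per-frame predictions with a majority vote
    over a sliding window of recent frames."""

    def __init__(self, window=5, threshold=4):
        self.frames = deque(maxlen=window)  # recent per-frame labels
        self.threshold = threshold          # votes needed to emit a letter

    def update(self, prediction):
        """Feed one per-frame prediction; return a letter once stable, else None."""
        self.frames.append(prediction)
        letter, count = Counter(self.frames).most_common(1)[0]
        return letter if count >= self.threshold else None

smoother = SignSmoother()
stream = ["A", "A", "B", "A", "A", "A"]  # noisy per-frame predictions
outputs = [smoother.update(p) for p in stream]
print(outputs)  # → [None, None, None, None, 'A', 'A']
```

The window size trades latency for stability: a larger window suppresses more flicker but delays how quickly a new sign is reported.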
Accomplishments that we're proud of
Our proudest achievement is the machine learning component of our project. We successfully trained an algorithm to identify basic signs consistently and with a high success rate, and by analyzing user input we built a system that enables live translation of signs.
What we learned
Throughout this process, we learnt how to use AI tools efficiently to develop our product quickly while still maintaining the human aspects of the project. Beyond the technical side, we also learnt about the societal pressures on the deaf and hard-of-hearing community. Society is built on the assumption that all humans are the same, and this significantly reduces the support available to deaf and hard-of-hearing children. We learnt about language deprivation: deaf children often struggle to communicate because the people around them do not know how to communicate with them. This insight helped us create a product that meets the genuine needs of the deaf and hard-of-hearing community.
What's next for SignEd
SignEd’s goals go far beyond what is seen on the screen. We want to develop full curricula, not only for ASL but also for other sign languages such as Chinese Sign Language and British Sign Language. ASL is not the only sign language and should not be treated as the default; we started with it because it is the standard where we are located. Another key goal is to expand to other platforms, such as iOS and Android, to make learning ASL as convenient as possible.