Problem Statement

Limited access to sign language interpretation services creates a communication gap between deaf and hearing individuals, excluding and isolating the deaf community. The result is reduced access to essential services, information, and opportunities, along with social and economic disadvantages for the deaf community.

Solution

Our app addresses this issue with a real-time solution that detects sign language gestures, translates them into written text, and facilitates conversation between deaf and hearing individuals, breaking down communication barriers and creating a more inclusive world.

Development Process

Our design process for the sign language translation system began with detailed wireframes and prototypes in Figma. We then selected our tools and technologies, opting for Python with TensorFlow for the computer vision model. For a user-friendly interface, we built a React.js frontend that integrates the camera for real-time sign language detection and identification; it calls the ML model hosted on a Flask server to translate sign language into English. To extend the application's functionality, we developed a React Native app that lets users send English messages back to the React app, which are then translated back into sign language. We integrated APIs to enable real-time communication between all applications and to keep data in sync. Overall, our design process involved careful consideration of user needs, selection of appropriate technologies, and the integration of multiple components into a functional and efficient sign language translation system.
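The frontend-to-Flask step described above can be sketched as a small inference endpoint. This is a minimal illustration, not the project's actual code: the route name, input size, and gesture labels are assumptions, and the model call is stubbed out where a trained TensorFlow model would be loaded.

```python
# Hedged sketch of a Flask endpoint serving gesture predictions.
# Route, input size, and LABELS are illustrative assumptions.
import io

import numpy as np
from flask import Flask, jsonify, request
from PIL import Image

app = Flask(__name__)

# In the real app a trained TensorFlow model would be loaded here, e.g.:
# model = tf.keras.models.load_model("signify_model.h5")
LABELS = ["hello", "thanks", "yes", "no"]  # placeholder gesture labels


def predict_gesture(frame: np.ndarray) -> str:
    """Stand-in for the model: returns the most likely gesture label."""
    # A real implementation would run: probs = model.predict(frame[None])[0]
    probs = np.ones(len(LABELS)) / len(LABELS)  # uniform placeholder
    return LABELS[int(np.argmax(probs))]


@app.route("/translate", methods=["POST"])
def translate():
    # The React frontend posts one camera frame as an image file upload.
    image = Image.open(io.BytesIO(request.files["frame"].read()))
    frame = np.asarray(image.convert("RGB").resize((224, 224)),
                       dtype=np.float32) / 255.0
    return jsonify({"text": predict_gesture(frame)})
```

The frontend would call this endpoint once per captured frame and render the returned text, keeping all model weights on the server.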

Impact

  • Improved communication: The app enables deaf individuals to communicate more effectively with hearing individuals, breaking down barriers and allowing for more seamless conversation.
  • Increased access to information: The app gives deaf individuals access to information that was previously unavailable to them because of the language barrier, including educational materials, job opportunities, and social and cultural events.
  • Enhanced social inclusion: The app promotes social inclusion by allowing deaf individuals to fully participate in society and interact with their hearing peers.

Challenges

Our biggest challenge was integrating different components of the app to accurately translate sign language gestures into written text in real time.
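One common difficulty in this kind of real-time pipeline is that per-frame predictions flicker, so the emitted text jumps between gestures. A sliding-window majority vote is one generic way to stabilize the output; the sketch below illustrates the idea under assumed window and threshold values and is not Signify's actual implementation.

```python
# Generic smoothing sketch for noisy per-frame gesture predictions:
# emit a gesture only when it dominates the recent window of frames.
from collections import Counter, deque


class GestureSmoother:
    """Majority-vote smoother over the last `window` frame predictions."""

    def __init__(self, window: int = 15, threshold: float = 0.6):
        self.history = deque(maxlen=window)  # recent per-frame labels
        self.threshold = threshold           # fraction needed to emit
        self.last_emitted = None             # avoid repeating a gesture

    def update(self, label: str):
        """Feed one per-frame prediction; return a label to emit or None."""
        self.history.append(label)
        winner, count = Counter(self.history).most_common(1)[0]
        if (count / self.history.maxlen >= self.threshold
                and winner != self.last_emitted):
            self.last_emitted = winner
            return winner
        return None
```

With a 15-frame window and a 0.6 threshold, a gesture must hold for roughly nine consecutive frames before it is written out, which trades a short delay for far steadier text.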

Successes

Our greatest success is the app's social dimension: it has the potential to create a positive societal impact by promoting inclusion, reducing stigma, and increasing understanding of and empathy toward the deaf community.

Future Plans for Signify

  • Expansion of language support: Currently, the app detects sign language gestures and translates them into English. It could be extended to support multiple sign languages and written languages, making it accessible to a wider audience.
  • Integration with other apps and platforms: The app could be integrated with other communication platforms and apps, such as social media, email, and messaging apps. This would allow for seamless communication between deaf and hearing individuals across various platforms.

Built With

figma, python, tensorflow, flask, react, react-native
