Inspiration

We wanted to build a tool that helps blind and deaf people better perceive what is happening around them.

What it does

For blind users, it speaks aloud a description of the surrounding environment; for deaf users, it displays nearby speech as text on the screen.

How we built it

In VS Code, using OpenCV, the OpenAI API, and Ultralytics YOLO.
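The detection-to-speech step can be sketched as follows. This is a minimal, self-contained illustration, not the project's actual code: `describe_scene` is a hypothetical helper showing how object labels from a YOLO detector might be summarized into a sentence before being sent to text-to-speech.

```python
from collections import Counter

def describe_scene(labels):
    """Turn a list of detected object labels (e.g. from Ultralytics YOLO)
    into a short spoken-style sentence. Hypothetical helper for illustration."""
    if not labels:
        return "Nothing detected nearby."
    counts = Counter(labels)  # count duplicate detections, e.g. two people
    parts = [f"{n} {label}{'s' if n > 1 else ''}" for label, n in counts.items()]
    return "I can see " + ", ".join(parts) + "."

print(describe_scene(["person", "person", "chair"]))
```

In the real pipeline, the label list would come from running a detection model on each camera frame, and the resulting sentence would be passed to a speech API.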

Challenges we ran into

Git commits not working properly, which made it hard to share files between teammates.

Accomplishments that we're proud of

Building this in such a short time.

What we learned

Coding is hard

What's next for Perception

Making a frontend

Built With

opencv, openai, ultralytics
