Inspiration

We were inspired by a situation one of our members faced while walking around campus. He encountered a visually impaired person who wanted to cross a bridge and knew there was a puddle ahead, but not exactly where. Our member helped him cross the bridge and avoid the puddle, and reflected afterward on how often vision is taken for granted.

What it does

Aether aims to create a feedback system for visually impaired people by using a phone camera to analyze the path ahead for hazards. It alerts the user to obstructions in their path with haptic feedback (distinct vibration patterns differentiate the direction of an obstruction) or stereo sound. When a hazard like a puddle is detected, a text-to-speech engine also describes its nature.
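The direction-to-vibration mapping could be sketched in plain Java as below. The class name, enum, and the specific patterns are our own illustration, not the project's actual code; the returned arrays follow the alternating off/on millisecond convention expected by Android's `Vibrator.vibrate(long[], int)`.

```java
// Hypothetical sketch: maps an obstruction's direction to a vibration pattern
// suitable for Android's Vibrator.vibrate(long[], int). Each array alternates
// off/on durations in milliseconds, starting with an "off" delay.
public class HapticPatterns {
    public enum Direction { LEFT, CENTER, RIGHT }

    public static long[] patternFor(Direction dir) {
        switch (dir) {
            case LEFT:   return new long[]{0, 100, 100, 100};            // two short pulses
            case CENTER: return new long[]{0, 400};                      // one long pulse
            case RIGHT:  return new long[]{0, 100, 100, 100, 100, 100};  // three short pulses
            default:     throw new IllegalArgumentException("unknown direction");
        }
    }
}
```

Keeping the pattern lookup in a plain class like this (rather than inline in an Activity) also makes the mapping testable without an Android device.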

How we built it

We used Android Studio and worked on these tasks in parallel:

  • Getting the images from the camera and caching them for upload
  • Uploading the images to the Azure Computer Vision API for analysis
  • Interpreting the responses from the Azure Computer Vision API
  • Designing the user experience and safety features, such as quick calling, texting GPS coordinates, and haptic feedback/sound design
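The third step, interpreting the responses, could be sketched as below. This is a hypothetical helper, not the project's code: we assume the `tags` array shape of Azure Computer Vision's Analyze Image JSON response (`{"tags":[{"name":"water","confidence":0.94}, ...]}`) and extract it with a deliberately simple regex rather than a real JSON library.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Hypothetical sketch: pulls hazard tags out of an Azure Computer Vision
// Analyze Image response. A production version would use a JSON parser;
// the regex here only handles the compact {"name":...,"confidence":...} form.
public class HazardFilter {
    private static final Pattern TAG =
        Pattern.compile("\\{\"name\":\"([^\"]+)\",\"confidence\":([0-9.]+)\\}");

    // Returns tag names from the response that we classify as hazards
    // and that meet the confidence threshold.
    public static List<String> hazards(String json, List<String> hazardTags, double threshold) {
        List<String> found = new ArrayList<>();
        Matcher m = TAG.matcher(json);
        while (m.find()) {
            String name = m.group(1);
            double confidence = Double.parseDouble(m.group(2));
            if (confidence >= threshold && hazardTags.contains(name)) {
                found.add(name);
            }
        }
        return found;
    }
}
```

Each hazard name returned here would then be handed to the text-to-speech engine described above.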

Challenges we ran into

Since all of us were unfamiliar with Android development, it was difficult to get started with the hack. We tried to follow tutorials but ran into deprecation issues with much of the advice online. Getting images from the camera was far more challenging than expected, and we didn't finish that part because of issues saving the image after capture.

Accomplishments that we're proud of

  • Learning Android Studio and how to make a basic app
  • Getting the GPS and emergency text features to work reliably
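The emergency-text feature could be sketched as a small helper that formats the message body the app would hand to Android's `SmsManager.sendTextMessage(...)`. The class name and the maps-link format are illustrative assumptions, not the project's actual code.

```java
import java.util.Locale;

// Hypothetical sketch: builds the SMS body for the emergency-text feature
// from a GPS fix. Kept free of Android dependencies so it is unit-testable.
public class EmergencyText {
    public static String body(double lat, double lon) {
        // Locale.US forces '.' as the decimal separator regardless of the
        // device locale, so the generated link stays valid.
        return String.format(Locale.US,
            "I need assistance. My location: https://maps.google.com/?q=%.6f,%.6f",
            lat, lon);
    }
}
```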

What we learned

  • Basic Android Studio programming
  • Sharpened up Java skills

What's next for Aether

  • Finish implementation!