Inspiration

The inspiration came from a real use case: the mother of one of our team members is legally blind and has trouble reading nutrition labels. This motivated us to create an app tailored to visually impaired users that helps them read nutrition labels.

What it does

Nutroscan is a mobile application that reads nutrition labels aloud for visually impaired users. Its interface also uses higher contrast and letter emphasis so that low-vision users can read labels on screen.
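As a rough sketch of the audio feature, an Expo-managed React Native app could read the extracted label text aloud with the expo-speech package. Neither Expo nor expo-speech is confirmed by this write-up, and readLabelAloud is a name invented for illustration:

```tsx
// Hypothetical sketch: reading extracted label text aloud.
// Assumes an Expo-managed React Native app; expo-speech and the
// readLabelAloud helper name are our own choices, not confirmed
// by this write-up.
import * as Speech from 'expo-speech';

export function readLabelAloud(labelText: string): void {
  // Stop any ongoing speech so successive labels don't overlap.
  Speech.stop();
  // A slightly slower rate tends to be easier to follow for long labels.
  Speech.speak(labelText, { language: 'en-US', rate: 0.9 });
}
```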

How we built it

The team built the app with React Native and Node.js for both iOS and Android. We used existing packages together with an Azure-hosted compute backend capable of recognizing the scanned item and retrieving its data. We also used Google Auth so users can create a personal account and save their data.
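To give a concrete flavor of the Azure piece, here is a minimal server-side sketch that sends a label photo to Azure's Computer Vision Read API and collects the recognized text. The write-up only says "Azure", so the specific API version, environment variables, and extractLabelText helper are assumptions:

```ts
// Hypothetical sketch of the server-side OCR step, using Azure's
// Computer Vision Read API over plain REST (Node 18+ global fetch).
// The endpoint, key, and helper name are placeholders.
const AZURE_ENDPOINT = process.env.AZURE_VISION_ENDPOINT!; // e.g. https://<resource>.cognitiveservices.azure.com
const AZURE_KEY = process.env.AZURE_VISION_KEY!;

export async function extractLabelText(imageUrl: string): Promise<string> {
  // Submit the image; Azure replies with an Operation-Location URL to poll.
  const submit = await fetch(`${AZURE_ENDPOINT}/vision/v3.2/read/analyze`, {
    method: 'POST',
    headers: {
      'Ocp-Apim-Subscription-Key': AZURE_KEY,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({ url: imageUrl }),
  });
  const operationUrl = submit.headers.get('operation-location')!;

  // Poll until the asynchronous read operation finishes.
  while (true) {
    await new Promise((r) => setTimeout(r, 1000));
    const res = await fetch(operationUrl, {
      headers: { 'Ocp-Apim-Subscription-Key': AZURE_KEY },
    });
    const result = await res.json();
    if (result.status === 'succeeded') {
      // Flatten the recognized lines into one block of label text.
      return result.analyzeResult.readResults
        .flatMap((page: any) => page.lines.map((l: any) => l.text))
        .join('\n');
    }
    if (result.status === 'failed') throw new Error('OCR failed');
  }
}
```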

Why we built it in the way we did

We decided to build a mobile application because it is easier to scan items with a phone.

Challenges we ran into

More than half of our team was inexperienced with React Native, which meant a steep learning curve, and because we were spread across different time zones, finding time to talk was a challenge. Training the food detection model and retrieving nutrition information for the detected food were also challenging.
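For the nutrition-lookup part of that challenge, the step could look roughly like the sketch below. The write-up does not name a nutrition data source; USDA's FoodData Central search endpoint is used here purely as an illustration, and lookUpNutrition is a hypothetical helper:

```ts
// Hypothetical sketch of looking up nutrition facts for a detected
// food name. The data source is an assumption: USDA FoodData Central
// is used only as an example.
interface NutrientFact {
  name: string;
  amount: number;
  unit: string;
}

export async function lookUpNutrition(foodName: string): Promise<NutrientFact[]> {
  const url =
    'https://api.nal.usda.gov/fdc/v1/foods/search' +
    `?query=${encodeURIComponent(foodName)}&pageSize=1` +
    `&api_key=${process.env.FDC_API_KEY}`;
  const res = await fetch(url);
  const data = await res.json();
  const food = data.foods?.[0];
  if (!food) return [];
  // Map the API's nutrient entries into a simple shape for the app.
  return food.foodNutrients.map((n: any) => ({
    name: n.nutrientName,
    amount: n.value,
    unit: n.unitName,
  }));
}
```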

Accomplishments that we're proud of

We were able to come up with a simple but functional design that is easy to use for someone who is visually impaired.
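As an illustration of what "simple and high-contrast" can mean in React Native terms, the sketch below shows the kind of styling involved; the specific colors, sizes, and NutritionLine component are our own choices, not taken from the actual app:

```tsx
// Hypothetical sketch of high-contrast, large-type styling for
// low-vision users; all names and values here are illustrative.
import React from 'react';
import { StyleSheet, Text, View } from 'react-native';

const styles = StyleSheet.create({
  // Dark background with near-white text maximizes the contrast ratio.
  screen: { flex: 1, backgroundColor: '#000000', padding: 24 },
  label: {
    color: '#FFFFFF',
    fontSize: 28,        // large type for low-vision users
    fontWeight: 'bold',  // the "letter emphasis" mentioned above
    letterSpacing: 1,
  },
});

export function NutritionLine({ text }: { text: string }) {
  return (
    <View style={styles.screen}>
      {/* accessibilityLabel lets screen readers announce the value. */}
      <Text accessible accessibilityLabel={text} style={styles.label}>
        {text}
      </Text>
    </View>
  );
}
```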

What we learned

We learned enough React Native in a limited amount of time to build the app, designed a UI tailored to visually impaired users, learned how to set up and deploy deep learning models on Azure, and learned how to use Google OAuth to verify users.
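For the OAuth piece, verifying a Google sign-in on the Node backend typically looks like the following sketch with the google-auth-library package; the environment variable and verifyGoogleUser name are placeholders:

```ts
// Hypothetical sketch of verifying a Google ID token server-side.
// google-auth-library is the standard package for this; the client
// ID env var and function name are our own.
import { OAuth2Client } from 'google-auth-library';

const client = new OAuth2Client(process.env.GOOGLE_CLIENT_ID);

export async function verifyGoogleUser(idToken: string): Promise<string> {
  // Checks the token's signature, expiry, and intended audience.
  const ticket = await client.verifyIdToken({
    idToken,
    audience: process.env.GOOGLE_CLIENT_ID,
  });
  const payload = ticket.getPayload();
  if (!payload?.sub) throw new Error('Invalid Google token');
  // payload.sub is Google's stable user ID; use it as the account key.
  return payload.sub;
}
```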

What's next for Nutroscan

We plan to do user testing with visually impaired users to help improve the user interface and user experience of the mobile application.
