Inspiration
We wanted to streamline important but obscure information and make it easier for everyone to access. The project was inspired by the Qualcomm/UCSD Health sensor challenge to highlight the unhighlighted, and it can also help hearing-impaired people get to know their food better.
What it does
Nutrition Mission brings attention to and demystifies the obscure ingredients in store-bought food. It is designed to read a nutrition label and highlight unfamiliar ingredients, and it can also draw attention to sought-after, healthier ones.
A user uploads a picture of a nutrition label to the web app, which reads the label and converts it into electronic text. That information is displayed alongside the uploaded label, and users can click or tap an ingredient's name to find more detail about it.
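One step in the workflow above is turning the raw label text returned by OCR into individual, clickable ingredient names. A minimal sketch of that post-processing step (the function name `split_ingredients` is our illustration, not taken from the project; it assumes the label follows the common comma-separated format with parenthesized sub-ingredients):

```python
import re

def split_ingredients(label_text: str) -> list[str]:
    """Split the raw OCR text of an ingredient list into individual
    ingredient names, ready to render as clickable entries."""
    # Drop a leading "Ingredients:" header if the OCR captured it.
    text = re.sub(r"(?i)^\s*ingredients?\s*:\s*", "", label_text)
    # Split on top-level commas only, so parenthesized sub-ingredients
    # like "enriched flour (wheat flour, niacin)" stay with their parent.
    parts: list[str] = []
    current: list[str] = []
    depth = 0
    for ch in text:
        if ch == "(":
            depth += 1
            current.append(ch)
        elif ch == ")":
            depth = max(depth - 1, 0)
            current.append(ch)
        elif ch == "," and depth == 0:
            parts.append("".join(current).strip(" ."))
            current = []
        else:
            current.append(ch)
    tail = "".join(current).strip(" .")
    if tail:
        parts.append(tail)
    return [p for p in parts if p]
```

Each returned name can then be looked up against the FDA food-substance database to populate the detail view.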
How we built it
Our backend is written in Python using Flask, and our frontend is written with ReactJS, HTML, and CSS (Bootstrap). We also incorporated Google Cloud's text recognition API to read nutrition labels, and Algolia's search API to quickly sift through the FDA food-substance database and pull more information about each ingredient.
Challenges we ran into, accomplishments, and lessons learned
None of us had done web development before, so this was our first exposure to setting up our own webpage and web app, and we had a lot of difficulty integrating all the different parts. It was also our first time using the React library and being full-stack developers on our own project. In short: in 36 hours, we were introduced to full-stack web development, then had to master it.
What's next for Nutrition Mission
An AR mobile app would strengthen this project's visualization element and better serve hearing-impaired (but not vision-impaired) individuals. The idea originally stemmed from a plan to quickly scan patient medical profile papers, input that information into an electronic database, and then let doctors access it in a visual form that can be quickly digested.