Inspiration
Recipe for Disaster stemmed from a group of hungry, bored college students.
Living on our own for the first time, we quickly found ourselves overwhelmed by the sheer amount of responsibility. Jobs ranging from cooking to general housekeeping were delegated to specific roommates in an attempt to achieve some degree of efficiency. However, with limited experience with ingredients and cooking, we soon realized we were buying the same products on every shopping trip and ultimately making the same kinds of meals out of those products every day. Though we enjoyed cooking, we had fallen into a monotonous routine that made eating boring.
Thus, "Recipe for Disaster" was born.
What it does
Recipe for Disaster is an iOS mobile application that lets users take a picture of a shopping list and returns healthy recipes that use those ingredients, tailored to their food preferences.
How I built it
We first used CocoaPods, a dependency manager, to install the libraries we would need throughout the project (Firebase Core, Analytics, etc.). We then used Xcode's storyboard feature to build the user interface and experience. After the basic UI/UX was finished, we implemented the backbone of our application: Firebase's ML Kit and the Spoonacular recipe API. ML Kit handles text recognition, identifying the written or typed words on a scanned list, while Spoonacular parses the string of ingredients obtained from ML Kit and returns healthy recipes that use them.
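A minimal sketch of that handoff, assuming ML Kit has already returned the scanned list as a newline-separated string. The function name, the ingredient cleanup rules, and the query parameters here are illustrative assumptions rather than the app's actual code; Spoonacular's `findByIngredients` endpoint expects a comma-separated ingredient list and an API key.

```swift
import Foundation

// Turn text recognized by ML Kit into a Spoonacular findByIngredients request.
// Hypothetical helper: the real app's parsing and key handling may differ.
func spoonacularURL(fromRecognizedText text: String, apiKey: String) -> URL? {
    // ML Kit returns the scanned list as one string with line breaks;
    // treat each non-empty line as a single ingredient.
    let ingredients = text
        .split(separator: "\n")
        .map { $0.trimmingCharacters(in: .whitespaces).lowercased() }
        .filter { !$0.isEmpty }

    var components = URLComponents(string: "https://api.spoonacular.com/recipes/findByIngredients")!
    components.queryItems = [
        URLQueryItem(name: "ingredients", value: ingredients.joined(separator: ",")),
        URLQueryItem(name: "number", value: "10"),   // how many recipes to return
        URLQueryItem(name: "apiKey", value: apiKey)
    ]
    return components.url
}
```

The resulting URL can then be fetched with `URLSession` and the JSON response decoded into a list of recipe suggestions.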
Challenges I ran into
Configuring the application's use of the camera was our first challenge. We overthought the problem and attempted to build an overly complex camera interface rather than sticking with the simpler solution of using the native camera app. CocoaPods was our next challenge: we were unable to properly install several of our dependencies and had to request outside help from a software engineer at Panera (thank you Nick!). We then had difficulty deciding which framework to use for identifying text in a picture, ultimately settling on ML Kit because it had good documentation and implementation tutorials online. API calls to Spoonacular also posed a significant roadblock, as its documentation for Swift programmers was poor.
Throughout the whole experience, we also inevitably faced the universal issue for programmers: fixing countless bugs.
Accomplishments that I'm proud of
We are extremely proud of implementing ML Kit and extracting usable data from a picture. We are also proud of building the basic structure of an iOS application despite having no prior iOS development experience.
What I learned
We all learned many things at this hackathon. Each of us got a taste of Swift, a programming language none of us had worked with before, and we picked up essential programming skills such as calling external APIs and working with third-party libraries.
What's next for Recipe for Disaster
We hope to continue improving the UI/UX and to add more ways for users to modify or specify their diets. We added an "Other" option to the dietary restrictions that accepts free-form text, but that functionality is incomplete; finishing it is our first priority, along with polishing the existing features.
