Inspiration

In a world full of flavors, cuisines, cultures, and people, those with food-related ailments and dietary restrictions can too easily get lost in the shuffle. Inspired by a team member with celiac disease, this project aims to let users with dietary restrictions quickly identify restaurants where they can safely eat, after a simple questionnaire and a restaurant selection.

What it does

"Can I Eat Here?" is an AI-powered food safety web application that helps users with dietary restrictions and allergies safely navigate restaurant menus. The system uses Google's Agent Development Kit (ADK) to deploy specialized AI agents that analyze restaurant menu data through web scraping and cross-reference it against user-specified dietary needs. Users input their restrictions (gluten-free, dairy-free, specific allergies, etc.) through an interactive web interface, select a restaurant, and receive real-time "CAN EAT" or "CANNOT EAT" recommendations with detailed explanations. The multi-agent architecture includes a restaurant agent for menu data retrieval, an ingredient agent for food composition analysis, and an allergen agent for safety validation, all orchestrated through a Flask backend that delivers personalized dining recommendations to help users make informed food choices.
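The core of the "CAN EAT"/"CANNOT EAT" contract can be sketched as a simple comparison of a dish's ingredients against the user's restrictions. In the real app this check is performed by the LLM-backed allergen agent; the function name and the restriction-to-ingredient mapping below are purely illustrative.

```python
# Illustrative data: which ingredients each supported restriction rules out.
# The real system derives this via LLM reasoning rather than a fixed table.
RESTRICTION_TRIGGERS = {
    "gluten-free": {"wheat", "barley", "rye", "flour"},
    "dairy-free": {"milk", "cheese", "butter", "cream"},
    "peanut allergy": {"peanut", "peanut oil"},
}

def evaluate_item(restrictions, ingredients):
    """Return a ("CAN EAT" / "CANNOT EAT", explanation) verdict for one menu item."""
    hits = []
    for restriction in restrictions:
        triggers = RESTRICTION_TRIGGERS.get(restriction, set())
        found = triggers & {i.lower() for i in ingredients}
        if found:
            hits.append(f"{restriction}: contains {', '.join(sorted(found))}")
    if hits:
        return "CANNOT EAT", "; ".join(hits)
    return "CAN EAT", "no flagged ingredients for your restrictions"
```

The explanation string is what the web interface would surface alongside the verdict.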

How we built it

"Can I Eat Here?" leverages Google's Agent Development Kit (ADK) Runtime to orchestrate a multi-agent system for personalized food safety analysis. The architecture deploys three specialized LLM agents with distinct instruction prompts and tool integrations: a Restaurant Agent equipped with Google Search tools for real-time menu data retrieval, an Ingredient Agent for food composition analysis, and an Allergen Agent for cross-referencing dietary restrictions against ingredient profiles. These agents operate through ADK's Agent Config framework, using Gemini 2.5 Flash models with structured output schemas and tool-calling capabilities. The system implements a sequential agent workflow in which the Restaurant Agent's output serves as context input for downstream agents, creating a multi-step reasoning pipeline. The Flask backend serves as the orchestrator, managing agent lifecycle, context propagation, and response streaming through ADK's run_live() async generators. Users interact through a web interface that triggers agent execution chains, with each agent contributing specialized analysis that culminates in deterministic safety recommendations based on LLM-powered reasoning over real-world restaurant data.
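The sequential workflow above can be sketched with each ADK agent stood in for by a plain function, which makes the context propagation visible: each agent's output becomes the next agent's input. All names and stub outputs are hypothetical; the real agents are LLM-backed via ADK rather than hard-coded.

```python
def restaurant_agent(restaurant):
    # Real version: Gemini 2.5 Flash + Google Search tool fetching the live menu.
    return {"restaurant": restaurant, "menu": ["margherita pizza", "caesar salad"]}

def ingredient_agent(context):
    # Real version: LLM reasoning over each dish's likely composition.
    stub = {"margherita pizza": ["flour", "tomato", "mozzarella"],
            "caesar salad": ["romaine", "parmesan", "anchovy"]}
    context["ingredients"] = {dish: stub[dish] for dish in context["menu"]}
    return context

def allergen_agent(context, restrictions):
    # Real version: cross-references restrictions against ingredient profiles.
    flagged = {"gluten-free": {"flour"}, "dairy-free": {"mozzarella", "parmesan"}}
    verdicts = {}
    for dish, ings in context["ingredients"].items():
        bad = set().union(*(flagged.get(r, set()) for r in restrictions)) & set(ings)
        verdicts[dish] = "CANNOT EAT" if bad else "CAN EAT"
    context["verdicts"] = verdicts
    return context

def run_pipeline(restaurant, restrictions):
    # Sequential workflow: each agent's output is the next agent's context,
    # mirroring the Restaurant -> Ingredient -> Allergen chain.
    return allergen_agent(ingredient_agent(restaurant_agent(restaurant)), restrictions)
```

In the deployed app, the Flask backend plays the role of `run_pipeline`, streaming each agent's contribution back to the browser via ADK's async generators.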

Challenges we ran into

We had a lot of difficulty learning the A2A protocol and getting the agents to talk to each other. With the goal of presenting a full-stack application with a functioning frontend and a proper simulation of what a final backend would look like, we decided to pivot and instead use Flask to run our backend and orchestrate our program. Because of this, however, it was difficult to send and receive relevant information between our agents, so we created a workflow that simulates the menus of the restaurants input into the program in a more general way to demonstrate functionality. To this end, we have a prototype that uses the agents to respond to prompts in a simulated manner, and we have continued developing other avenues for web scraping and program simulation, such as a model that uses SerpAPI to scrape the web and feed menu data into the model.
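The SerpAPI avenue mentioned above amounts to building a Google search request for a restaurant's menu and pulling text snippets out of the JSON that SerpAPI returns. This is a hedged sketch: the endpoint and the "organic_results"/"snippet" fields follow SerpAPI's documented response shape, but the helper names are ours and the real integration may differ.

```python
from urllib.parse import urlencode

SERPAPI_ENDPOINT = "https://serpapi.com/search.json"

def build_search_url(restaurant, api_key):
    """Build a SerpAPI Google-search URL for a restaurant's menu."""
    params = {"engine": "google", "q": f"{restaurant} menu", "api_key": api_key}
    return f"{SERPAPI_ENDPOINT}?{urlencode(params)}"

def extract_menu_snippets(response_json):
    """Collect result snippets to pass to the ingredient agent as context."""
    return [r["snippet"] for r in response_json.get("organic_results", [])
            if "snippet" in r]
```

Fetching would then be a matter of opening `build_search_url(...)` with `urllib.request` (or `requests`) and handing the extracted snippets to the ingredient agent.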

Accomplishments that we're proud of

We are proud to have a full-stack application that demonstrates our concept in prototype form, classifying menu items based on the user's restrictions using ADKs. The promising result proves there is a foundation for future refinement and that the tangible final product is both feasible and practical.

What we learned

We learned how to connect Python with HTML/CSS/JS through Flask, and gained an appreciation for ADKs as an aspect of computer science that will undoubtedly be a pillar of software development in the near future.

What's next for Can I Eat Here?

We would like to fully implement ADKs and finish optimizing web scraping for restaurant menus. Features we want to explore include having the AI recognize food items from a picture of a restaurant's menu, and letting users add custom allergies or dietary restrictions in a free-text field for an agent to interpret.
