Inspiration
As we arrived on the Grounds early Saturday morning, we were inspired by the uniquely designed trash and recycling cans we spotted around campus. People often seemed to find choosing between the two options confusing, and using the QR codes on them was inconvenient and time-consuming, so we thought: Why can't we build a better way to identify waste and promote sustainability at the same time? That's how tracye.tech was born!
What it does
Trayce is a smart waste-sorting assistant and nutrition tracker that uses computer vision to identify and classify food tray contents into trash, recycle, compost, or dish return — making disposal effortless and eco-friendly. We use the Google Gemini API and YOLOv11 for the computer vision pipeline: detecting the items on a tray, classifying them, and producing an overall tray analysis. A minimal detection sketch follows.
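As a rough illustration of the detection step, here is a minimal sketch using the ultralytics package; the "yolo11n.pt" checkpoint and the helper name are illustrative assumptions rather than the exact code we shipped:

```python
# A minimal sketch of the YOLOv11 detection step, assuming the ultralytics package;
# the "yolo11n.pt" checkpoint and function name are illustrative choices.
from ultralytics import YOLO

model = YOLO("yolo11n.pt")  # pretrained YOLOv11 nano weights

def detect_tray_items(image_path: str) -> list[dict]:
    """Run object detection on a tray photo and return labeled boxes."""
    result = model(image_path)[0]
    detections = []
    for box in result.boxes:
        detections.append({
            "label": result.names[int(box.cls)],
            "confidence": float(box.conf),
            "xyxy": [float(v) for v in box.xyxy[0]],  # pixel coordinates
        })
    return detections
```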
How it works
Upload a Picture or Use a Webcam: Users take a photo of their tray after a meal.
Gemini 2.0 Vision API: We apply Gemini’s spatial understanding to identify all visible items and draw bounding boxes (see the sketch after this list).
Food & Object Classification: Gemini also helps distinguish which items are food and which are packaging, containers, utensils, or dishes.
USDA Keyword Matching: If the object is food, we optionally match it against the USDA food database to provide nutritional insight (e.g., calories per 100 grams); a lookup sketch follows this list.
Waste Classification: Each object is labeled as compost, recycle, trash, or dish return using the Google Gemini API.
User Auth & History: We use Auth0 for secure authentication and store each user’s sorting history in SQLite so they can review it and track their sustainability impact.
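The detection and classification steps above come down to a single multimodal prompt. The sketch below, using the google-genai Python SDK, shows roughly how we ask Gemini for labeled bounding boxes plus a food/waste classification per item; the prompt wording, model name, and JSON fields are illustrative assumptions, not our exact production code:

```python
# A rough sketch of the Gemini call, using the google-genai SDK.
# The prompt wording, model name, and JSON schema are illustrative assumptions.
from google import genai
from PIL import Image

client = genai.Client()  # reads GEMINI_API_KEY from the environment

PROMPT = (
    "Detect every item on this food tray. Return a JSON list where each entry "
    "has: label, box_2d (y_min, x_min, y_max, x_max scaled 0-1000), "
    "is_food (true/false), and waste_stream (compost, recycle, trash, or dish return)."
)

def analyze_tray(image_path: str) -> str:
    """Ask Gemini for bounding boxes plus a food/waste classification per item."""
    image = Image.open(image_path)
    response = client.models.generate_content(
        model="gemini-2.0-flash",
        contents=[image, PROMPT],
    )
    return response.text  # JSON text to be parsed downstream
```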
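For the USDA step, a minimal lookup against the FoodData Central search endpoint might look like this; the helper name and the per-100 g reading of the Energy nutrient are our assumptions for the sketch:

```python
# A minimal lookup against the USDA FoodData Central search API.
# The helper name and parameters are illustrative, not our exact code.
import requests

FDC_SEARCH_URL = "https://api.nal.usda.gov/fdc/v1/foods/search"

def calories_per_100g(food_name: str, api_key: str):
    """Return an approximate kcal value for the best-matching USDA entry."""
    resp = requests.get(
        FDC_SEARCH_URL,
        params={"query": food_name, "pageSize": 1, "api_key": api_key},
        timeout=10,
    )
    resp.raise_for_status()
    foods = resp.json().get("foods", [])
    if not foods:
        return None
    for nutrient in foods[0].get("foodNutrients", []):
        # Energy in kcal is reported per 100 g for most FDC data types
        if nutrient.get("nutrientName") == "Energy" and nutrient.get("unitName") == "KCAL":
            return nutrient.get("value")
    return None
```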
How we built it
Backend: Flask and Python.
Gemini 2.0 Vision API: For detecting all objects on a tray with high spatial precision.
Gemini Natural Language API: To determine food vs non-food and help map items to the appropriate waste category.
USDA Food Database: For food-related calorie estimation.
Auth0: For user login and profile management.
SQLite: Chosen for its simplicity and ability to log user-specific sorting histories for data visualization or gamified feedback later (a minimal logging sketch follows this list).
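As referenced in the SQLite item above, a minimal history-logging sketch might look like the following; the table and column names are illustrative, not necessarily our exact schema:

```python
# A minimal history-logging sketch; table and column names are illustrative.
import sqlite3
from datetime import datetime, timezone

def log_sorting_event(db_path: str, user_id: str, item_label: str, waste_stream: str) -> None:
    """Append one classified item to the user's sorting history."""
    with sqlite3.connect(db_path) as conn:
        conn.execute(
            """CREATE TABLE IF NOT EXISTS sorting_history (
                   user_id TEXT,
                   item_label TEXT,
                   waste_stream TEXT,
                   created_at TEXT
               )"""
        )
        conn.execute(
            "INSERT INTO sorting_history VALUES (?, ?, ?, ?)",
            (user_id, item_label, waste_stream,
             datetime.now(timezone.utc).isoformat()),
        )
```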
Challenges we ran into
Ambiguous food containers: Some materials (e.g., bioplastics) are tricky to classify without context, but switching to Gemini's spatial understanding made identifying these materials much simpler.
Bounding box clutter: When many objects were detected, overlapping boxes sometimes muddled the per-item classification.
Accomplishments that we're proud of
Accurate waste-stream classification and meal nutrition tracking from a single image, using real-time vision.
What we learned
We learned a lot about using Gemini, computer vision models, and full-stack development.
What's next for Trayce
Integrate image depth estimation to better judge item sizes for smarter classification and calorie estimates.
Add gamification: show users how much waste they’ve diverted from landfills over time.
Build a campus dashboard for facilities management to track overall compost/recycle/dish return volumes.
Deploy as a kiosk or mobile web app at dining halls or cafes.