Inspiration

28 million Americans struggle with eating disorders, and for them every meal can become a source of anxiety and fear. We drew a powerful insight from clinical research: people with eating disorders often care for others before themselves. This sparked an idea: what if we could redirect that nurturing instinct to support their own recovery?

We wanted to transform one of the most challenging aspects of recovery—eating meals—from a moment of stress into something rewarding and positive. By combining the emotional power of pet companionship with cutting-edge AR technology and AI agents, we created Sproutify to make recovery feel less like a battle and more like raising a friend.

What it does

Sproutify is a gamified AR companion system that transforms eating disorder recovery into a rewarding experience.

For Patients:

  • AR Cat Companion "Sprout" appears through Snap Spectacles during meals
  • Real-time AI Detection: Uses depth sensing and Google Gemini to identify food items, track macros, and measure consumption
  • Gamification System: Earn points for healthy eating behaviors: eating speed and consistency, meal completion rates, nutritional balance (macros), and total time taken to eat
  • Rewards & Customization: Spend points on toys and cosmetics for Sprout
  • Intelligent Support: Sprout provides adaptive encouragement when hesitation is detected
  • Predictive AI: Detects concerning patterns and provides gentle interventions before relapse
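
To make the gamification concrete, here is a minimal sketch of how per-meal points could be scored from the signals listed above. The weights, the 0-100 scale, and the function name are illustrative assumptions, not Sproutify's actual scoring formula.

```python
# Hypothetical point-scoring sketch; weights and names are illustrative,
# not the project's real formula.

def meal_points(pace_score: float, completion: float,
                macro_balance: float, duration_min: float,
                target_min: float = 25.0) -> int:
    """Score one meal on a 0-100 scale.

    pace_score, completion, and macro_balance are each in [0, 1];
    duration_min is the total time taken to eat.
    """
    # Reward staying close to a target meal duration rather than rushing
    # or stalling; score falls off linearly with distance from the target.
    time_score = max(0.0, 1.0 - abs(duration_min - target_min) / target_min)
    raw = (0.25 * pace_score + 0.35 * completion
           + 0.25 * macro_balance + 0.15 * time_score)
    return round(100 * raw)

# Example: a steadily eaten, fully completed meal near the target duration.
points = meal_points(pace_score=0.8, completion=1.0,
                     macro_balance=0.7, duration_min=25.0)
```

Weighting completion highest mirrors the clinical priority of finishing meals; any real deployment would tune these weights with clinicians.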

For Doctors:

  • Automated Progress Reports generated by Fetch.ai agents
  • Long-term Behavior Analysis: Track eating patterns, nutritional progress, and consistency
  • Early Warning System: AI flags concerning behaviors for clinical intervention
  • Data-Driven Decisions: Objective metrics to inform treatment adjustments
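
As a sketch of the clinician-facing side, the snippet below shows how an automated progress report could be assembled from logged meals into JSON. The field names and the 50%-completion warning threshold are assumptions for illustration, not the project's actual schema.

```python
# Illustrative report builder; schema and threshold are assumptions.
import json
from statistics import mean

def build_report(patient_id: str, meals: list) -> str:
    """Summarize meal records into a JSON progress report for clinicians.

    Each meal is a dict like {"date": "2024-01-01", "completion": 0.8}.
    """
    completions = [m["completion"] for m in meals]
    # Flag dates where less than half the meal was eaten for early review.
    flags = [m["date"] for m in meals if m["completion"] < 0.5]
    report = {
        "patient_id": patient_id,
        "meals_logged": len(meals),
        "avg_completion": round(mean(completions), 2),
        "early_warning_dates": flags,
    }
    return json.dumps(report, indent=2)
```

A Fetch.ai agent could run a builder like this on a schedule and hand the JSON to the clinician dashboard.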

The Result: A complete support ecosystem that makes recovery feel rewarding instead of restrictive.

How we built it

We used Snap Spectacles and Lens Studio as our main platform for AR interaction with the outside world. Our Lens client calls our custom Fetch.ai endpoint, which retrieves the user's long-term data and returns a JSON object to the client. We also run local object-recognition algorithms to identify objects on-device.
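
To illustrate the request/response flow between the Lens client and our Fetch.ai endpoint, here is a minimal server-side sketch of the JSON contract. The handler name and payload fields are hypothetical; the real endpoint queries agent-stored history rather than returning a stub.

```python
# Minimal sketch of the endpoint's JSON contract; field names are
# hypothetical and the payload is stubbed for illustration.
import json

def handle_history_request(raw_body: str) -> str:
    """Parse a Lens client request and return the user's long-term data."""
    req = json.loads(raw_body)
    user_id = req["user_id"]
    # The real system would look up this user's agent-stored history;
    # this stub just returns a payload with the same shape.
    resp = {
        "user_id": user_id,
        "streak_days": 12,
        "recent_meals": [{"date": "2024-01-01", "points": 82}],
    }
    return json.dumps(resp)
```

The Lens client can then parse this JSON to drive Sprout's behavior and the on-lens progress display.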

Challenges we ran into

One challenge was real-time object detection: achieving accurate food recognition and portion estimation on Spectacles' limited onboard compute required careful optimization and hybrid inference (local inference plus cloud inference via Google Gemini).

Another challenge was balancing gamification with clinical sensitivity: we had to ensure the app motivates recovery without triggering anxiety or unhealthy behaviors.

Accomplishments that we're proud of

  • Functional AR Prototype: We successfully created an interactive AR companion that detects food items in real time, reacts emotionally, and evolves with the user's progress.
  • End-to-End AI Pipeline: From food detection to behavior analysis, our Fetch.ai agents and Gemini-powered models enable seamless monitoring and insights for clinicians.
  • Clinically Inspired Design: We conducted psychological research on recovery motivation, turning empathy and care into measurable progress through gamified behavior reinforcement.

What we learned

Technical Skills

  • AR Development: How to build immersive experiences for Snap Spectacles
  • Depth Sensing: Understanding and leveraging 3D computer vision
  • Multi-agent Systems: Orchestrating specialized AI agents with Fetch.ai
  • Multimodal AI: Using Gemini's vision capabilities for food analysis
  • Real-time Systems: Balancing latency, accuracy, and user experience

Design Insights

  • Gamification Psychology: How to motivate behavior change without triggering anxiety or unhealthy behaviors
  • Healthcare UX: Building interfaces that work for both patients and clinicians
  • AR Interaction: Creating intuitive controls in 3D space

What's next for Sproutify

Sproutify hopes to continue transforming eating from a negative into a positive experience for patients struggling with anorexia. We plan to fully integrate nutritional data as subtle suggestions in Sprout's verbal and visual cues. We also plan to export our nutritional analysis in a format suitable for medical reports, so clinicians can track past recovery trends and predict future ones.
