Inspiration

We wanted to make AR feel alive: not just a visual effect, but something that stays, remembers, and interacts with your space. What if your wall showed live weather, your floor became a calm breathing zone, or your desk turned into a game board? That’s how Spidgets were born: spatial widgets that live in the real world.


What it does

Spidgets bring your surroundings to life through interactive AR experiences. In this demo, you’ll discover three unique examples — each showing how the concept can scale endlessly. The Weather Spidget anchors to your wall and displays live weather with dynamic, real-time visuals. The Zen Zone Spidget transforms your floor into a calming space for mindful breathing, activating when you step in and fading as you leave. Finally, the Memory Game Spidget turns your table into an AR mini-game where you flip and match cards right on your real surface. Each Spidget is persistent — they stay where you placed them and reappear in the same spot every time you open the lens.

This is just the beginning: the Spidget concept is fully scalable, opening the door to countless interactive experiences across any surface or space.



How we built it

We built the entire experience in Snap Lens Studio, using the new Snap Cloud integration to connect with Supabase.

  • Supabase Edge Functions fetch real-time weather and reverse-geocode the city and state.
  • Supabase Storage holds all weather status icons like sunny, cloudy, or rainy that the Spidget downloads dynamically.
  • Supabase Database stores anchor-to-Spidget mappings so that each widget respawns right where it was placed.
  • Surface Placement and World Anchors give each Spidget a sense of space and persistence.
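As a rough sketch of the Storage and Database glue described above (the table name, bucket paths, and field names here are illustrative, not our actual schema):

```typescript
// Hypothetical sketch: how a Spidget resolves its weather icon in
// Supabase Storage and builds the anchor record it persists to the
// Database so it can respawn in the same spot.

type SpidgetAnchor = {
  anchor_id: string;    // world anchor ID from Spectacles
  spidget_type: string; // e.g. "weather" | "zen" | "memory"
  placed_at: string;    // ISO timestamp
};

// Map a weather condition to its icon path in the Storage bucket.
function weatherIconPath(condition: string): string {
  const icons: Record<string, string> = {
    sunny: "icons/sunny.png",
    cloudy: "icons/cloudy.png",
    rainy: "icons/rainy.png",
  };
  return icons[condition] ?? "icons/unknown.png";
}

// Build the record that ties a world anchor to a Spidget.
function anchorRecord(anchorId: string, spidgetType: string): SpidgetAnchor {
  return {
    anchor_id: anchorId,
    spidget_type: spidgetType,
    placed_at: new Date().toISOString(),
  };
}

// With supabase-js, persisting the mapping would look roughly like:
// await supabase.from("spidget_anchors").upsert(anchorRecord(id, "weather"));
```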

We also developed a BaseSpidget framework — a reusable TypeScript system that handles placement, attachment to anchors, and live Supabase sync. This makes it super easy to create new Spidgets in the future with minimal setup.


Challenges we ran into

Getting world anchors and Supabase records to sync perfectly took a lot of trial and error. We faced issues with asynchronous updates, permissions, and data mismatch while testing on Spectacles. Debugging cloud logs while wearing AR glasses is a unique experience — you could say we achieved Zen Zone patience the hard way.


Accomplishments that we're proud of

We built a living, modular spatial ecosystem where every Spidget feels like its own lens, alive and aware in the world around you. Each one speaks to the cloud, anchors naturally in real space, and remembers its state as if it never left. With Supabase, Snap Cloud, and Lens Studio forming a powerful trio, we finally saw persistent, connected AR come to life: dynamic, self-aware, and seamlessly woven into reality.


What we learned

Supabase isn’t just a backend; it’s a great match for AR. With Edge Functions, Storage, and Realtime, we managed live data, media, and persistence effortlessly, without running extra servers. But the real leap came from building our first sophisticated system of Spidgets: spatial widgets that could work, think, and communicate together in real time. It was our first attempt at designing such an interconnected spatial network, where every Spidget felt like a living lens existing in shared space. Along the way, we learned a great deal about world anchors, spatial placement, and what it takes to make AR feel genuinely there rather than just on screen.


What's next for Spidgets - The Spatial Widgets

This is just the beginning. Our dream is to create an entire ecosystem of Spidgets — tiny digital tools that live in your space. Imagine a calendar that hangs on your wall, a coffee counter that remembers your brews, a digital pet that reacts to the weather, or a WebXR webview portal on your wall. With Supabase as our brain and Snap Cloud as our bridge, we can scale endlessly and bring a new generation of living, persistent AR experiences to life.
