Glimpse
Inspiration
Location-specific photo mural where memory persists.
In the physical world, places hold memories. But today’s social interactions are mostly detached from location and time—optimized for close friends or fast-scrolling feeds. We saw an opportunity to create a persistent, location-based, anonymous social layer on top of the real world—one that grows richer the more people participate.
- Instead of swiping, discover anonymous people’s memories by walking around.
- Instead of typing, voice out remixes of other people’s moments.
Sometimes that remix is playful—a time-gap meme or visual joke. Other times, it’s meaningful—like visualizing rising sea levels along a coastline or tracking how a place changes through climate events.
What it does
Discover
As users walk around, they see photo bubbles anchored to locations, synced from Snap Cloud. Scroll through the mural to explore what others captured there.
React
Tap to enlarge, double-tap to like. All interactions are saved to Snap Cloud in real time.
Remix
Hit Remix on any photo, voice-prompt our Gemini AI integration, and create a transformed version. The original stays; the remix appears as a new bubble.
Capture
Take photos, optionally AI-transform them, and add them to the location’s permanent collection.
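Each tap or double-tap maps to a small interaction record synced to Snap Cloud. As a minimal sketch of the idea, here is how a double-tap like could be applied optimistically on-device and turned into an event for the sync layer; the `PhotoBubble` shape and event fields are illustrative, not the actual Snap Cloud schema:

```typescript
// Illustrative shapes only; the real Snap Cloud schema may differ.
interface PhotoBubble {
  id: string;
  likes: number;
  likedByMe: boolean;
}

// Toggle a like optimistically, returning the updated bubble plus the
// event payload that would be queued for the real-time sync layer.
function toggleLike(bubble: PhotoBubble, userId: string) {
  const liked = !bubble.likedByMe;
  const updated: PhotoBubble = {
    ...bubble,
    likedByMe: liked,
    likes: bubble.likes + (liked ? 1 : -1),
  };
  const event = {
    type: liked ? "like" : "unlike",
    photoId: bubble.id,
    userId,
    at: Date.now(),
  };
  return { updated, event };
}
```

Updating local state first keeps the lens responsive even if the round trip to the cloud takes a moment.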
Over time, Glimpse becomes more valuable not because of virality, but because it accumulates collective context.
How we built it
Using Snap Spectacles + Snap Cloud, we turned fleeting interactions into a living, evolving photo mural—a shared memory space where every action permanently alters the world state.
We began by clearly defining the problem, then moved quickly into designing a detailed user flow. This helped us:
- Establish the project scope
- Identify core MVP features
- Separate MVP from nice-to-have enhancements
Once the MVP was solidified, we explored ideas for future expansion and scalability.
Technical stack & approach
- Built in Lens Studio as a mixed-reality application on Snap Spectacles
- Persistent spatial data and interactions saved to Snap Cloud, leveraging a PostgreSQL database, real-time websockets, and Deno edge functions.
- Location-aware content and spatial interactions blended with the real world using Snap's Location API.
- Nano Banana integrated for image generation and enhancement during remix workflows
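The real-time path can be pictured as database changes fanned out over the websocket and folded into each client's local copy of the mural. A sketch of that fold step, under an assumed payload shape (not the actual Snap Cloud wire format):

```typescript
// Assumed wire format; actual Snap Cloud realtime payloads may differ.
type MuralEvent =
  | { kind: "photo_added"; photoId: string; likes: number }
  | { kind: "photo_liked"; photoId: string };

type MuralState = Map<string, { likes: number }>;

// Fold one websocket event into the local mural state, returning a new
// map so the UI layer can diff and re-render only the changed bubbles.
function applyEvent(state: MuralState, ev: MuralEvent): MuralState {
  const next = new Map(state);
  switch (ev.kind) {
    case "photo_added":
      next.set(ev.photoId, { likes: ev.likes });
      break;
    case "photo_liked": {
      const photo = next.get(ev.photoId);
      if (photo) next.set(ev.photoId, { likes: photo.likes + 1 });
      break;
    }
  }
  return next;
}
```

Because every client folds the same event stream, all Spectacles at a location converge on the same world state.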
Challenges we ran into
Amanda
- Scoping and topic selection
- Narrowing early ideas (sketch, music, photo) into the final MVP user flow
- Presentation framing and video storytelling
- Overall system architecture and experience cohesion
Jonathan
- Integrating location-based photo grouping
- Distinguishing and ranking most-liked photos
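Location-based grouping and like-ranking reduce to bucketing coordinates into coarse grid cells and sorting each bucket. A simplified sketch of that approach; the cell size and per-cell limit are placeholder values, not the shipped logic:

```typescript
interface Photo { id: string; lat: number; lng: number; likes: number }

// Bucket coordinates into coarse grid cells (~cellDeg degrees per side)
// so nearby photos share a key; a stand-in for real spatial clustering.
function cellKey(lat: number, lng: number, cellDeg = 0.001): string {
  return `${Math.floor(lat / cellDeg)}:${Math.floor(lng / cellDeg)}`;
}

// Group photos by cell, then keep each cell's most-liked photos.
function topPhotosByLocation(photos: Photo[], perCell = 3): Map<string, Photo[]> {
  const groups = new Map<string, Photo[]>();
  for (const p of photos) {
    const key = cellKey(p.lat, p.lng);
    const bucket = groups.get(key) ?? [];
    bucket.push(p);
    groups.set(key, bucket);
  }
  for (const [key, bucket] of groups) {
    bucket.sort((a, b) => b.likes - a.likes);
    groups.set(key, bucket.slice(0, perCell));
  }
  return groups;
}
```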
Sarah
- Gaining confidence working with Cursor and connecting it to Lens Studio AI
- Navigating uncertainty around AI tooling due to a non-traditional software background
- Overcoming skepticism toward AI reliability with the help of teammates and Snap engineers
Jingle
- Lack of familiarity with coding and scripting in Lens Studio
- Understanding syntax and development logic
- Successfully building required features using vibe coding tools despite limited prior experience
Team
- First-time experience with Lens Studio for some members
- Tooling friction and setup issues on Windows laptops
Accomplishments that we’re proud of
- Rapidly learning and applying vibe-coding tools (Cursor, Claude Code) and Snap Cloud within Lens Studio
- Designing and implementing the full end-to-end user flow:
- UI
- Voice-to-text
- Databasing
- Physics-based spatial interactions
- Successfully integrating Nano Banana for user-requested image modification
- Delivering a fully working (and not broken!) prototype
What we learned
- The importance of fleshing out story, purpose, and user flow early
- How strong upfront alignment enables:
- Clear feature prioritization
- Better work distribution
- Faster end-to-end prototyping
- How to translate prior Unity experience into Lens Studio to build functional mixed-reality features quickly
What’s next for Glimpse
Deeper persistence & time layers
Let users scrub through time to see how a location’s mural evolved over days, months, or years.
Richer remix tools
Expand voice-driven remixing into multi-step transformations: visual annotations, generative overlays, and environmental simulations.
Collective storytelling modes
Introduce themed prompts (e.g. first memory here, before & after, climate change) to guide meaningful contributions.
Scalability & performance
Improve spatial clustering, ranking, and loading for high-density locations.
Community governance
Lightweight moderation, reputation signals, and opt-in curation, without breaking anonymity.
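Time scrubbing could start as something very simple: filtering the mural's photo records by capture timestamp so dragging the scrubber replays how a place grew. A minimal sketch of that idea, with hypothetical fields rather than a committed design:

```typescript
// Hypothetical record shape for a time-scrubbable mural entry.
interface TimedPhoto { id: string; capturedAt: number } // epoch ms

// Return only the photos captured up to the scrubber's position,
// oldest first, so the mural can be replayed chronologically.
function muralAt(photos: TimedPhoto[], scrubTime: number): TimedPhoto[] {
  return photos
    .filter((p) => p.capturedAt <= scrubTime)
    .sort((a, b) => a.capturedAt - b.capturedAt);
}
```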
Glimpse’s long-term vision is a shared, anonymous memory layer for the physical world—where places don’t just exist, they remember.
Built With
- deno
- lensstudio
- nanobanana
- snapspectacles
- supabase
- typescript
- voicetospeech
- websockets
