Clip

Your memories, always within reach. Check out our linked video for a live demo!

Say "Clip that" to save the last 30 seconds. Say "Hey Clip" to ask what happened. Built for Meta Ray-Ban glasses.

The Problem

You're with friends, something hilarious happens, and by the time you reach for your phone, the moment's gone. For someone whose memory isn't as reliable, it might be lost forever. People with memory-related conditions (or people who just have a bad memory; we're familiar!) face anxiety and isolation when conversations and daily moments are this easy to lose track of.

What Clip Does

Wear your Meta Ray-Ban glasses and live life as usual. When something worth keeping happens, just say "Clip that!" The last 30 seconds are saved to your device and made semantically searchable, so you can find that moment again whenever you want. Later, if you can't remember something, just ask: "Hey Clip, what did I do yesterday?" Clip searches your captured memories, finds the relevant one, and speaks the answer back to you.
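The "last 30 seconds" capture can be pictured as a time-windowed rolling buffer: new audio/video chunks push old ones out once they age past the window, and saying "Clip that" snapshots whatever is currently buffered. A minimal Python sketch of that idea (the class and method names are illustrative; the actual app does this with AVFoundation on-device):

```python
import time
from collections import deque


class RollingBuffer:
    """Keep only the most recent `window` seconds of (timestamp, chunk) pairs.

    Toy stand-in for the app's AVFoundation rolling buffer; names are
    hypothetical, not from the actual codebase.
    """

    def __init__(self, window=30.0):
        self.window = window
        self.chunks = deque()

    def append(self, chunk, now=None):
        now = time.time() if now is None else now
        self.chunks.append((now, chunk))
        # Evict anything older than `window` seconds.
        while self.chunks and now - self.chunks[0][0] > self.window:
            self.chunks.popleft()

    def clip(self):
        """Snapshot the buffered chunks, oldest first (the "Clip that" moment)."""
        return [chunk for _, chunk in self.chunks]
```

Because eviction happens on every append, memory stays bounded no matter how long the glasses record.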

Use Cases

For Accessibility

  • Alzheimer's/dementia support — Replay recent conversations, remember names and faces
  • Caregiver tool — "What did the doctor say?" answered instantly
  • Anxiety reduction — Knowing you can always check what happened

For Fun

  • Capture spontaneous jokes without ruining the moment
  • Save travel moments while staying present
  • Record conversations you want to remember
  • Build a searchable archive of your life's highlights

Tech Stack

  • Hardware — Meta Ray-Ban glasses
  • iOS App — SwiftUI, iOS 26 Liquid Glass
  • Wake Word — native SFSpeechRecognizer (on-device, offline)
  • Video/Audio — 30-second rolling buffer, AVFoundation
  • Transcription — ElevenLabs
  • Vector DB — Milvus (semantic embeddings)
  • Embeddings — CLIP (Contrastive Language-Image Pre-Training)
  • LLM — Google Gemini
  • Voice — ElevenLabs TTS
  • Backend — FastAPI + SQLite
  • Audio/Video Processing — FFmpeg

Team

  • Cameron Lee
  • Julia Zhong
  • Erik Lin
  • Jay Park
