Inspiration

CLAIR was born from facts that are impossible to ignore: scientists are now finding microplastics in human placentas, and the air I breathe and the water I drink carry a synthetic burden. The story of this contamination is written in plain sight on the labels of the products we use every day, but its language is obscure.

I became obsessed with the idea of building a translator—a tool that could read this hidden language and give consumers the one thing they lack: a meaningful choice. I wanted to create not just a piece of technology, but a storytelling instrument that could frame a planetary problem at a human scale, and then hand the user a way to act.

What it does

CLAIR is a lens for the invisible. In a world where microscopic plastics now permeate our blood, our lungs, and even the cradle of life, CLAIR turns passive awareness into active agency. It uses a sophisticated AI pipeline to scan a product’s ingredient list or packaging and instantly reveal its hidden polymer content. Each scan delivers a moment of clarity—a quick visualization of the synthetic materials we unknowingly consume—along with safer alternatives.

Alongside the functional tool, I created a 90-second cinematic short film to tell the story behind CLAIR. Using AI-generated narration in the style of Sir David Attenborough, synchronized music from Epic Mountain Music, and emotionally powerful nature footage, the video bridges science and emotion to make the crisis visible.

How I built it

CLAIR is an end-to-end system designed for accuracy, speed, and trust, running on a resilient and scalable architecture.

At its core, a Cloudflare Worker acts as the central nervous system. When an image is uploaded, it’s sent to Google’s Gemini Vision, which performs optical character recognition (OCR) to extract text and visual context. This data then flows into Gemini’s language model, guided by a rigorously engineered prompt and strict JSON schema to extract ingredient and material information.
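As a minimal sketch of the schema-enforcement step, the Worker can validate the model's raw JSON before trusting it; the field names below (`ingredients`, `flags`, `ocrConfidence`) are illustrative placeholders, not CLAIR's actual schema:

```typescript
// Hypothetical shape of the structured analysis returned by the language
// model. Field names are illustrative, not CLAIR's real schema.
interface Analysis {
  ingredients: string[];
  flags: string[];       // ingredients matched to known polymers
  ocrConfidence: number; // 0..1
}

// Validate the model's raw JSON against the strict schema; return null on
// any violation so the caller can fall back to keyword-only analysis.
function parseAnalysis(raw: string): Analysis | null {
  try {
    const data = JSON.parse(raw);
    if (!Array.isArray(data.ingredients) ||
        !data.ingredients.every((i: unknown) => typeof i === "string")) return null;
    if (!Array.isArray(data.flags) ||
        !data.flags.every((f: unknown) => typeof f === "string")) return null;
    if (typeof data.ocrConfidence !== "number" ||
        data.ocrConfidence < 0 || data.ocrConfidence > 1) return null;
    return data as Analysis;
  } catch {
    return null; // malformed JSON: treat as a failed extraction
  }
}
```

Rejecting rather than repairing malformed output keeps the pipeline predictable: a bad model response degrades to the keyword path instead of propagating garbage downstream.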

A curated local database of known polymers (PET, acrylates, etc.) provides a rapid first-pass analysis. To add actionable value, I built a lightweight RAG pipeline using Serper to search for verified safer alternatives, which Gemini then summarizes.
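The first-pass check can be sketched as a simple match against the curated list; the entries and word-boundary matching rule below are illustrative (the real database is far larger):

```typescript
// A tiny illustrative slice of the curated polymer list.
const POLYMER_DB = [
  "polyethylene terephthalate",
  "pet",
  "acrylates copolymer",
  "polypropylene",
  "nylon-12",
];

// First pass: flag any ingredient containing a known polymer term.
// Word boundaries prevent false positives like "pet" inside "petrolatum".
function flagPolymers(ingredients: string[]): string[] {
  return ingredients.filter(ing =>
    POLYMER_DB.some(p => new RegExp(`\\b${p}\\b`).test(ing.toLowerCase()))
  );
}
```

Because this pass is local and instant, it gives users an immediate answer while the slower RAG lookup for safer alternatives runs behind it.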

For speed, each analyzed product is assigned a SHA-256 fingerprint that allows cached results to be returned instantly on repeat scans. The system also computes a microplastic percentage score:

$$\text{microplasticPct} = \operatorname{round}\!\left(100 \cdot \frac{|\text{flags}|}{|\text{ingredients}|}\right)$$
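The caching key and the score above can be sketched as follows; Node's `crypto` module stands in for brevity (a Worker would use the Web Crypto API), and the function names are illustrative:

```typescript
import { createHash } from "node:crypto";

// Cache key: a SHA-256 fingerprint of the normalized ingredient text, so the
// same product scanned twice hits the cache instead of the vision pipeline.
function fingerprint(ingredientText: string): string {
  const normalized = ingredientText.trim().toLowerCase();
  return createHash("sha256").update(normalized).digest("hex");
}

// The score from the formula above: share of ingredients flagged as polymers.
function microplasticPct(flags: string[], ingredients: string[]): number {
  if (ingredients.length === 0) return 0; // guard against empty labels
  return Math.round(100 * (flags.length / ingredients.length));
}
```

Normalizing before hashing means trivial variations in whitespace or casing on the same label still map to the same cache entry.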

For the storytelling component, I used Adobe Premiere Pro to edit and score the 90-second film, syncing narration, music, and visuals for emotional impact. This video plays directly before the live demo, creating a complete narrative-to-technology experience.

Challenges I ran into

Building a lens to see the invisible meant tackling three major challenges:

  1. Seeing with precision: The model could be confused by poor photos or misread label text. I mitigated this with an ocrConfidence metric and conservative fallback logic.
  2. Speaking the truth: Hallucinations break trust. I enforced strict JSON schemas, cross-checked flagged terms against a verified polymer list, and traced every flag back to its evidence source.
  3. Acting with speed: Awareness tools must feel instant. I implemented 8-second timeouts on vision calls and a smart fallback to keyword analysis if the model response lagged.
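The timeout-with-fallback pattern from the third challenge can be sketched as a promise race; the names here are illustrative, and the real Worker applies an 8-second budget to vision calls:

```typescript
// Race a model call against a timeout; if the model is slow, resolve with a
// fallback result (e.g. keyword-only analysis) instead of blocking the user.
function withTimeout<T>(work: Promise<T>, ms: number, fallback: T): Promise<T> {
  const timer = new Promise<T>(resolve =>
    setTimeout(() => resolve(fallback), ms)
  );
  return Promise.race([work, timer]);
}
```

Resolving with a fallback rather than rejecting keeps the UI on a single happy path: the user always gets an answer, just a less detailed one when the model lags.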

On the creative side, the biggest challenge was producing a cinematic video within hours—matching narration beats, color-grading nature footage, and syncing the emotional flow to the music. It was exhausting but worth every second.

Accomplishments that I'm proud of

I’m proud of building a full, working end-to-end AI system in a weekend—one that feels both scientific and poetic. CLAIR runs smoothly, delivers reliable results, and tells a story that resonates deeply with viewers.

I’m especially proud of the 90-second film I edited and scored myself. Seeing the final cut—with Sir David Attenborough-style narration, music, and visuals synchronized perfectly—moved me to tears. That emotional connection reminded me why technology must serve empathy, not just efficiency.

What I learned

This project taught me that:

  • Provenance is the bedrock of trust. A black box is meaningless without transparency. Showing users why something was flagged builds credibility.
  • Unifying pipelines reduces chaos. Integrating image and text analysis into a single predictable flow stabilized the whole system.
  • Experience is part of engineering. A well-crafted story makes technical work memorable. The video, UI, and sound design aren’t decoration—they’re communication.

What's next for CLAIR

Next, I plan to expand CLAIR into a mobile app and browser extension, allowing instant scanning of product labels in real time. I also want to grow the polymer database through community contributions and integrate scientific data sources for higher precision.

Finally, I’ll continue developing the cinematic and educational side of the project—turning CLAIR into a living awareness campaign that blends art, science, and AI to give humanity the clearest lens it’s ever had on its own creation.
