Inspiration
We wanted to build something that connects AI and real-time data to ocean health and community action. Too often, sustainability tools are either data-heavy dashboards with no clear next step, or one-off campaigns with no lasting impact. We were inspired by the idea of an AI that doesn't just report problems. It helps coordinate cleanups, briefs volunteers by voice, and makes drone-based monitoring tangible and interactive. Riptide, our in-app AI assistant, and the rest of the platform are our answer: one place where monitoring, cleanup operations, education, and voice-enabled outreach all work together for social good.
What it does
Riptide is an AI-powered platform for ocean health and social good. The dashboard gives a live view of drone scans, alerts, cleanup operations, and city-level ocean and marine metrics including kelp, trash, and water quality, with predictions and tracks. Through the cleanup operations module, users can create and manage cleanups, track donations, and post AI-generated job listings. Volunteers can request a voice briefing for any operation: the system calls them via Twilio, plays an ElevenLabs TTS summary, and can run a short conversational Q&A.
The 3D drone simulation puts you on an interactive Mapbox map of the California coast with 3D terrain, satellite or streets view, and an animated patrol path. Markers show pollution hotspots (plastic, oil, algae, ghost gear, sewage) with short descriptions. A Follow Drone mode moves the camera with the drone for an immersive patrol view. Live Ops handles incidents, claim and resolve workflows, a leaderboard, missions, and crisis and demo modes for coordination and gamification. The Riptide AI chatbot is an in-app assistant running on Groq or OpenAI with access to live stats and data, ready to answer questions about sustainability, ocean health, and the platform. Rounding it out, the impact and education section tracks metrics, reports, and analytics, and includes a school scoreboard where schools earn points for real-world actions including cleanups, donations, and missions, with an admin review flow and badges.
How we built it
The backend runs on Node.js, Express 5, and TypeScript, with a REST API covering scans, alerts, cleanup, donations, Twilio webhooks, TTS via ElevenLabs, call logs, live ops, game and scoreboard logic, and a Mapbox token proxy. We used PostgreSQL with Drizzle ORM and Drizzle Zod for validation, sharing the schema between client and server. On the frontend, we went with React 18, Vite 7, Wouter for routing, and TanStack Query for server state, styled with Tailwind CSS and Radix UI components for a consistent light/dark UI.
For maps and 3D, Mapbox GL JS and react-map-gl power the drone simulation, including terrain, styles, GeoJSON layers for the path and hotspots, and the follow-drone camera, while Leaflet and react-leaflet handle other map views. The voice pipeline uses Twilio for outbound calls and webhooks, ElevenLabs for TTS, and a real-time WebSocket/HTTP bridge so the voice agent shares the same AI as the chatbot. AI throughout the app comes from Groq (primary) or OpenAI and drives the chatbot, cleanup job generation, game briefings, and other text features. Live data like scans and cleanup status gets injected into chatbot context so answers stay relevant.
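The context injection can be sketched as a small pure function that folds live platform numbers into the system prompt before each chat request. This is a minimal illustration, not our actual code: the `LiveStats` shape and `buildSystemPrompt` name are hypothetical, and the real implementation pulls these values from the REST API.

```typescript
// Illustrative shape of the live data the chatbot can see (fields hypothetical)
interface LiveStats {
  activeScans: number;
  openAlerts: number;
  cleanupsInProgress: number;
  lastScanCity: string;
}

// Fold current platform state into the system prompt so answers are grounded
// in what is actually happening rather than generic sustainability copy.
function buildSystemPrompt(stats: LiveStats): string {
  return [
    "You are Riptide, an assistant for an ocean-health platform.",
    "Current platform state:",
    `- Active drone scans: ${stats.activeScans}`,
    `- Open alerts: ${stats.openAlerts}`,
    `- Cleanup operations in progress: ${stats.cleanupsInProgress}`,
    `- Most recent scan location: ${stats.lastScanCity}`,
    "Use this live data when it is relevant to the question.",
  ].join("\n");
}

// The resulting string is sent as the system message to Groq or OpenAI,
// alongside the user's question.
const systemPrompt = buildSystemPrompt({
  activeScans: 3,
  openAlerts: 2,
  cleanupsInProgress: 1,
  lastScanCity: "Monterey",
});
```

Because the prompt is rebuilt per request, the assistant's answers track the platform state without any fine-tuning.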
Challenges we ran into
The 3D map stack gave us early headaches. We first built the drone simulation with Cesium and Resium, but in our React 18 setup, Resium's viewer touched a React 19-only internal called recentlyCreatedOwnerStacks, which was undefined and crashed the page. Switching to Mapbox and a Cesium-free implementation gave us a reliable 3D experience without fighting React internals.
Making the simulation feel meaningful was another challenge. Early feedback was that colored dots labeled "plastic" or "oil" at seemingly random locations didn't tell a story. We reframed them as pollution hotspots with clear names like "Plastic accumulation, Point Reyes" and short descriptions, and added the Follow Drone camera so the patrol path actually feels like a real mission.
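Under the hood, Follow Drone reduces to interpolating the drone's position and heading along the patrol waypoints each frame and handing the result to the map camera. The sketch below shows the math under that assumption; the function names and waypoints are illustrative, and in the real app the output feeds mapbox-gl's `map.easeTo`.

```typescript
type LngLat = [number, number];

// Linear interpolation between two coordinates; adequate at patrol scales,
// where segments span a few kilometres and great-circle math adds nothing.
function lerp(a: LngLat, b: LngLat, t: number): LngLat {
  return [a[0] + (b[0] - a[0]) * t, a[1] + (b[1] - a[1]) * t];
}

// Bearing from a to b in degrees: 0 = north, increasing clockwise.
function bearing(a: LngLat, b: LngLat): number {
  const toRad = Math.PI / 180;
  const dLng = (b[0] - a[0]) * toRad;
  const lat1 = a[1] * toRad;
  const lat2 = b[1] * toRad;
  const y = Math.sin(dLng) * Math.cos(lat2);
  const x =
    Math.cos(lat1) * Math.sin(lat2) -
    Math.sin(lat1) * Math.cos(lat2) * Math.cos(dLng);
  return ((Math.atan2(y, x) * 180) / Math.PI + 360) % 360;
}

// Given the patrol waypoints and overall progress in [0, 1], return the
// camera center and bearing for this animation frame.
function cameraAt(path: LngLat[], progress: number) {
  const segments = path.length - 1;
  const scaled = Math.min(progress, 0.9999) * segments;
  const i = Math.floor(scaled);
  const t = scaled - i;
  return {
    center: lerp(path[i], path[i + 1], t),
    bearing: bearing(path[i], path[i + 1]),
  };
}

// Per frame (illustrative): map.easeTo({ ...cameraAt(path, progress), pitch: 60 })
```

Keeping the camera math pure made it easy to tune the patrol speed and pitch without touching the map code.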
Wiring the voice pipeline together was fiddly. Getting Twilio, ElevenLabs, and our AI to agree on webhooks, state, and timing took careful work, but once we nailed down a clear flow (answer, gather speech, AI, TTS, respond), everything clicked into place.
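That flow can be sketched as two webhook responses. This is a simplified illustration, not our production handlers: the real routes are Express endpoints built with the Twilio SDK, and the URLs and function names here are placeholders; the TwiML is assembled by hand so the shape of the exchange is visible.

```typescript
// Step 1 ("answer" + "gather speech"): Twilio hits this webhook when the
// call connects. We play the ElevenLabs-generated briefing, then ask Twilio
// to transcribe the volunteer's next utterance and POST it back to us.
function answerTwiml(briefingAudioUrl: string): string {
  return `<?xml version="1.0" encoding="UTF-8"?>
<Response>
  <Play>${briefingAudioUrl}</Play>
  <Gather input="speech" action="/api/voice/respond" method="POST" timeout="5">
    <Say>Do you have any questions about this cleanup?</Say>
  </Gather>
</Response>`;
}

// Step 2 ("AI" + "TTS" + "respond"): Twilio posts the transcript here.
// We run it through the same AI as the chatbot, synthesize the reply with
// ElevenLabs, and answer with TwiML that plays the audio and listens again.
function respondTwiml(replyAudioUrl: string): string {
  return `<?xml version="1.0" encoding="UTF-8"?>
<Response>
  <Play>${replyAudioUrl}</Play>
  <Gather input="speech" action="/api/voice/respond" method="POST" timeout="5"/>
</Response>`;
}
```

Once both handlers returned TwiML in this shape, the Q&A loop was just the second handler calling itself until the caller went quiet.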
On the ops side, a few small things tripped us up during the final push to GitHub: an unclosed quote in a commit message, password auth that needed to switch to a Personal Access Token, a repo name typo (Riptie vs Riptide), and an HTTP 400 from a large push. Fixing the remote URL and bumping http.postBuffer sorted it out.
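The remote and buffer fixes boil down to two git commands. Reproduced here in a throwaway repo with example URLs (the real repo path and account differ):

```shell
# Throwaway repo to demonstrate the fixes; paths and URLs are examples.
rm -rf /tmp/riptide-demo && git init -q /tmp/riptide-demo

# The remote was first added under the typo'd name, so pushes failed.
git -C /tmp/riptide-demo remote add origin https://github.com/example/Riptie.git

# Fix 1: point the remote at the correctly named repository.
git -C /tmp/riptide-demo remote set-url origin https://github.com/example/Riptide.git

# Fix 2 (no command to show): GitHub rejects account passwords over HTTPS,
# so the push must authenticate with a Personal Access Token instead.

# Fix 3: raise the HTTP post buffer (~150 MiB here) so a large initial
# push no longer fails with HTTP 400.
git -C /tmp/riptide-demo config http.postBuffer 157286400
```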
Accomplishments that we're proud of
We're most proud of building a single, cohesive platform. One app that ties together monitoring, cleanups, donations, voice briefings, a 3D drone sim, live ops, education and scoreboard, and an AI assistant, rather than a collection of disconnected tools. The Mapbox-based California coast view with real terrain, toggleable map styles, an animated patrol path, and pollution hotspots is both engaging and genuinely informative. The voice pipeline works end-to-end: volunteers can request a call, hear an ElevenLabs-generated briefing, and have a short AI-driven Q&A, with transcripts and call logs visible right in the Cleanup UI. Riptide itself is a data-aware assistant. Because it draws on live platform data, answers are grounded in what's actually happening rather than generic sustainability copy. And we put real effort into design and UX, with a consistent theme, clear hierarchy, and a layout that feels like a finished product rather than a prototype.
What we learned
Choosing the right map stack matters more than it might seem. Cesium and Resium were powerful on paper but introduced React version and bundling issues that weren't worth fighting; Mapbox gave us a simpler, more predictable path to a working 3D experience. We also learned that narrative and context are what turn raw data into something people understand. Reframing detection types as named pollution hotspots with descriptions made the drone simulation click for anyone who saw it.
Voice plus AI is powerful but genuinely fiddly. Aligning Twilio, ElevenLabs, and our AI model on webhooks, state, and timing required a clear mental model of the full flow before the pieces started cooperating. And small operational details really do matter: unclosed quotes, typos in repo names, and HTTP buffer limits can block a push at the worst possible moment.
What's next for Riptide
We want to connect the 3D simulation to live or historical drone and scan data so the patrol path and hotspots reflect actual missions rather than demo content. Extending the Mapbox simulation beyond the California coast to other coastlines and marine areas is a natural next step. On the voice side, we'd like to add multi-turn conversations, optional language support, and a cleaner "request a callback" flow from the Cleanup page. We're also planning Progressive Web App support and offline-friendly views for field volunteers. Finally, we want to deepen the education loop by tying scoreboard actions to curriculum and classroom missions, with more badges and verification options for schools.
Built With
- css
- elevenlabs
- gpt-4o-mini
- groq
- html5
- mapbox
- mongodb
- ngrok
- postgresql
- react
- solana
- twilio
- typescript