Inspiration
Every solopreneur has the same problem: you have the vision, but you're one person. You can't research competitors, build a landing page, draft investor emails, AND make outreach calls — all at the same time. What if you could just say what you need, and a team of AI agents executed it in parallel, collaborating with each other like real employees?
We were inspired by Paperclip AI (open-source orchestration for zero-human companies) and the OMI wearable's real-time transcription capabilities. We asked: what if we stripped Paperclip down to its core and made it work from your wrist?
What it does
Interstice is a multi-agent AI orchestration system controlled by voice through an OMI wearable. You speak a command; a CEO agent decomposes it into subtasks and delegates them to specialist agents (Research, Communications, Developer, Call); the agents execute in parallel, actively sharing information with each other; and the CEO synthesizes the results and responds back through your wrist.
Key demo: You say "I need a competitive analysis of the AI wearable market, a landing page for Interstice, and an outreach email to investors." Three agents spin up simultaneously. The Research Agent searches the web via Perplexity, then posts findings to a shared channel. The Developer Agent reads those findings and builds a landing page with real market data — not placeholder copy. The Communications Agent reads the same findings and drafts a data-driven investor email. The CEO synthesizes everything and reports back through OMI.
Agents don't just run in parallel — they collaborate. That's the difference.
The Call Agent can even make real phone calls mid-demo using ElevenLabs Conversational AI + Twilio, with an approval gate so nothing fires without your say-so.
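The approval gate can be sketched as a flag each skill declares (a simplified sketch: the `Skill` type, `requiresApproval` field, and `execute` helper are illustrative names, not the project's actual schema):

```typescript
// Hypothetical shape of a skill file; the writeup only says each skill
// declares whether it needs human approval.
export type Skill = {
  name: string;
  requiresApproval: boolean;
  run: (input: string) => Promise<string>;
};

export const makeCall: Skill = {
  name: "make-call",
  requiresApproval: true, // approval gate: nothing fires without sign-off
  run: async (phoneNumber) => {
    // The real skill would dial out via ElevenLabs Conversational AI + Twilio.
    return `calling ${phoneNumber}`;
  },
};

// Gate check: a sensitive skill only runs once a human has approved it;
// otherwise execution pauses and the task reports as pending.
export async function execute(
  skill: Skill,
  input: string,
  approved: boolean
): Promise<string> {
  if (skill.requiresApproval && !approved) return "pending-approval";
  return skill.run(input);
}
```

The same gate covers email sending; skills that don't set the flag run immediately.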
How we built it
- Agent execution: Each agent is a Claude CLI subprocess (`claude --print --output-format stream-json --resume [sessionId]`). No API keys needed; sessions persist across heartbeats via `--resume`.
- Real-time backend: Convex handles the database, task queue, inter-agent message bus, and real-time subscriptions, with zero WebSocket boilerplate.
- Inter-agent communication: Agents post to a shared findings channel in Convex. Other agents subscribe and incorporate results into their own outputs.
- Task queue: Atomic checkout prevents duplicate work. Approval gates pause execution for sensitive actions (sending emails, making calls).
- Skill system: Drop a TypeScript file in `skills/` and register it to an agent. Each skill declares whether it needs human approval.
- Voice I/O: The OMI wearable streams transcripts via webhook. Responses go back through OMI proactive notifications + ElevenLabs TTS.
- Dashboard: React + Convex subscriptions. Live org chart with glowing agents, animated message lines between collaborating agents, streaming Claude output, and an approval queue.
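The agent-execution piece can be sketched in Node (a simplified sketch: the CLI flags are the ones listed above, but `runClaudeTurn`, `parseStreamLine`, and the assumed stream-json event shape are our illustrations, not a documented API):

```typescript
import { spawn } from "child_process";

// Each line of --output-format stream-json is a standalone JSON object.
// Assumed event shape: { type: "assistant", message: { content: [{ type: "text", text }] } }
export function parseStreamLine(line: string): string | null {
  try {
    const event = JSON.parse(line);
    if (event.type === "assistant") {
      return (
        event.message?.content
          ?.filter((b: any) => b.type === "text")
          .map((b: any) => b.text)
          .join("") ?? null
      );
    }
    return null;
  } catch {
    return null; // ignore partial or non-JSON lines
  }
}

// Run one agent turn, resuming that agent's persisted session so no
// conversation history has to be managed by hand.
export function runClaudeTurn(sessionId: string, prompt: string): Promise<string> {
  return new Promise((resolve, reject) => {
    const child = spawn("claude", [
      "--print",
      "--output-format", "stream-json",
      "--resume", sessionId,
      prompt,
    ]);
    let output = "";
    let buffer = "";
    child.stdout.on("data", (chunk) => {
      buffer += chunk.toString();
      const lines = buffer.split("\n");
      buffer = lines.pop() ?? ""; // keep the trailing partial line
      for (const line of lines) {
        const text = parseStreamLine(line);
        if (text) output += text;
      }
    });
    child.on("close", (code) =>
      code === 0 ? resolve(output) : reject(new Error(`claude exited ${code}`))
    );
  });
}
```

Session IDs live in Convex, so each heartbeat can resume the right agent.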
Challenges we ran into
- Session persistence across heartbeats: Getting Claude CLI's `--resume` flag to reliably maintain agent context between wakeups required careful session ID management in Convex.
- Inter-agent timing: The Developer Agent needs the Research Agent's findings before it can write real copy. We had to build a subscription-based waiting mechanism rather than simple polling.
- Atomic task checkout: Preventing two agents from grabbing the same task required Convex mutation-level atomicity — no optimistic locking hacks.
- OMI transcript buffering: Raw transcript segments arrive mid-sentence. Detecting command completion without cutting off the user required careful debouncing.
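The atomic-checkout pattern can be illustrated in plain TypeScript (not actual Convex code; in Convex the same body would be a mutation, which runs transactionally and so gives the same no-interleaving guarantee):

```typescript
// Illustrative in-memory task store; in the real system this lives in Convex.
type Task = {
  id: string;
  status: "pending" | "claimed" | "done";
  claimedBy?: string;
};

export const tasks = new Map<string, Task>();

// The status check and the claim happen in one atomic step, so two
// agents can never both win the same task. No optimistic-locking hacks.
export function claimTask(taskId: string, agentId: string): boolean {
  const task = tasks.get(taskId);
  if (!task || task.status !== "pending") return false; // someone else got it
  task.status = "claimed";
  task.claimedBy = agentId;
  return true;
}
```

If two agents race for one task, exactly one `claimTask` call returns `true`; the loser simply moves on to the next pending task.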
What we learned
- Claude CLI as a subprocess runner is surprisingly powerful — session persistence means agents maintain full context without managing conversation history ourselves.
- Convex's real-time subscriptions eliminated an entire class of infrastructure problems (WebSockets, polling, state sync).
- The gap between "agents running in parallel" and "agents actually collaborating" is where the real value lives — and it's not that hard to bridge with a shared message bus.
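That shared message bus can be sketched as a findings channel with subscription-style waiting (a minimal sketch; `FindingsChannel` and its method names are illustrative, and the real system uses Convex subscriptions rather than in-process promises):

```typescript
type Finding = { from: string; topic: string; body: string };

// Agents publish findings; other agents wait for them by subscription,
// not by polling: waitFor resolves the moment a matching finding lands.
export class FindingsChannel {
  private findings: Finding[] = [];
  private waiters: { agent: string; resolve: (f: Finding) => void }[] = [];

  publish(f: Finding): void {
    this.findings.push(f);
    const ready = this.waiters.filter((w) => w.agent === f.from);
    this.waiters = this.waiters.filter((w) => w.agent !== f.from);
    for (const w of ready) w.resolve(f);
  }

  // e.g. the Developer Agent awaits the Research Agent's findings
  // before writing landing-page copy.
  async waitFor(agent: string): Promise<Finding> {
    const existing = this.findings.find((f) => f.from === agent);
    if (existing) return existing;
    return new Promise((resolve) => this.waiters.push({ agent, resolve }));
  }
}
```

This is the whole bridge from "parallel" to "collaborating": downstream agents block on upstream findings instead of running blind.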
What's next for Interstice
- More specialist agents (Finance, Legal, Sales)
- Multi-user support with role-based approval chains
- Persistent company memory across sessions
- Mobile dashboard for approvals on the go
- Integration with more wearables beyond OMI
Built With
- claude-cli
- convex
- elevenlabs
- next.js
- node.js
- omi-wearable-sdk
- perplexity-api
- react
- twilio
- typescript